Friday 7 October 2011

Friday Question: What should you do when you disagree?

Suppose you have just gone out to dinner with a good friend. Afterwards, the bill is presented and you and your friend (call him 'Chuck') both try to calculate 15% of the bill to leave as a tip. You finish calculating first and arrive at the belief that the tip should be £8.35. Chuck, shortly afterwards, tells you the tip should be £8.65. Suppose further that, prior to this disagreement, you and Chuck recognize each other to be of equal (and very good) mathematical ability and in possession of all the same evidence on the issue at hand. You are, as philosophers in the epistemology of disagreement put it, epistemic peers on this particular matter.


What is the rational response to this fact of disagreement (with an epistemic peer)?


Consider two very different options:


(1) Hold your guns. The fact of disagreement with an epistemic peer does not rationally require you to revise your belief. You are epistemically justified in continuing to believe the tip should be £8.35 despite the fact that Chuck, your epistemic peer, disagrees with you.


(2) Split the difference. The fact of disagreement with an epistemic peer does rationally require you to revise your belief. In this case, given that you take it that you and Chuck are both equally likely to be right on the matter, you should give up your belief and withhold judgment about the amount of the tip.






7 comments:

  1. What about just believing whatever is, in fact, the right answer, i.e. whatever 15% of the bill really is?

    ReplyDelete
  2. Benjamin, let's suppose (as you suggest) my belief about what 15% of the bill is, is in fact correct. The philosophical problem of whether doxastic revision is required given that a recognized epistemic peer disagrees with me remains the same. Does the fact of disagreement (with an epistemic peer) rationally require me to revise my belief (perhaps by withholding judgment in this case) or am I rationally permitted to hold my guns in the face of this disagreement?

    Richard Feldman and Earl Conee would say that, even if you are right about the tip in this case, it is not rational for you to continue to believe that you are, given that an epistemic peer--whom you antecedently take to be as likely as you to be correct on this matter--disagrees with you.

    Though there is plenty of motivation for the Feldman/Conee response to this sort of case, there is (I think) one especially troubling implication of the view: a sort of widespread agnosticism, particularly in subject matters where disagreement is rampant.

    ReplyDelete
  3. Let's fill in the case a bit more. Suppose the bill is actually £56.27. Let's further assume that there is no disagreement about the bill, so there is no question about what the amount of the bill is. The question that we are trying to determine is what 15% of that amount is.

    My view is just that (conditional on the assumption about the amount of the bill) the evidence supports believing that the tip should be £8.44, so we should both just change our beliefs to that (which, incidentally, is not splitting the difference).

    If the bill had been £55.67, then I would have been right and my friend would have been wrong. Then I should stick to my guns, and my friend should revise. If the bill had been £57.67, then he would have been right and I would have been wrong. Then he should stick to his guns, and I should revise.

    In all cases, I fail to see the relevance of the disagreement. I think what matters is just what the amount of the bill is.
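    For what it's worth, the arithmetic behind the three bill amounts can be checked with a short Python sketch (the function name `tip` is mine; `Decimal` is used to avoid binary floating-point rounding surprises when working in pennies):

```python
# Check the tip figures from the example: 15% of each candidate bill,
# rounded to the nearest penny.
from decimal import Decimal, ROUND_HALF_UP

def tip(bill: str, rate: str = "0.15") -> Decimal:
    """Return rate * bill, rounded to the nearest penny (half up)."""
    return (Decimal(bill) * Decimal(rate)).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP)

print(tip("56.27"))  # 8.44 -- neither of the two original answers
print(tip("55.67"))  # 8.35 -- my answer in the original case
print(tip("57.67"))  # 8.65 -- Chuck's answer
```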

    ReplyDelete
    Hi Benjamin, it seems to me that you are not merely rejecting Feldman and Conee's preferred strategy for responding to cases of peer disagreement (and thus taking the hold-your-guns view) but, even more, that you are not recognizing this problem in the epistemology of disagreement as a genuine philosophical problem. If I understand you correctly, you are endorsing a position that could be summed up with the following dictum: "You are rationally required to believe whatever is true." If such a view were correct, then the alleged problem of disagreement would indeed disappear. However, a dictum whose endorsement dissolves the problem of disagreement is a strange one to endorse. Even if it is in some sense (epistemically) good to believe whatever is true, I surely cannot be rationally required to believe whatever is true, and this is especially so in cases where my evidence points to the contrary. Suppose, for instance, that you have ventured into a city, A-ville, that (for some bizarre reason) has disguised itself in such a way that its signs all say that it is B-ville. Given your evidence that you are in B-ville, the (disguised, and inaccessible to you) fact that you are in A-ville doesn't rationally require you to believe you are in A-ville. You can be perfectly rational in maintaining your belief that you are in B-ville.

    So the strong view that rationally requires you believe whatever is true seems implausible. That said, if we endorse the view that you are rationally required to believe in accordance with the evidence, then the problem of peer disagreement becomes a more obviously worrisome philosophical problem. After all, the fact that an epistemic peer disagrees with you on a particular matter plausibly constitutes in itself evidence that you are wrong on that matter. The question then becomes: to what extent (if at all) is doxastic revision rationally required in the face of peer disagreement?

    And here's where the Feldman/Conee view (conformism) looks rather sensible. If, in your revised bill case, I am right that the tip should be £8.44, we have to consider that my evidence both supports that the tip should be £8.44 AND that someone whom I take to be as likely to be right as I am has reached a different conclusion. Perhaps this rationally requires me to revise my belief, even though I happen to be right. For those who wish to endorse the 'hold-your-guns' view, there is a burden to explain how not revising your belief is rationally permissible despite the fact that you now accept that either you, or someone you antecedently took to be as likely as you to be right on the matter, is mistaken.

    ReplyDelete
  5. Hi jadamcarter,

    I guess I didn't mean to suggest that one is rational in believing whatever is true. My suggestion was more along the lines that your epistemic peer provides no further (relevant) evidence above and beyond the evidence that you originally shared (which in this particular case appears to settle the truth of the matter). My view is just that there is some position or range of positions that it is maximally rational to take in light of the original shared evidence. One should simply take one of those positions whether or not an epistemic peer disagrees.

    ReplyDelete
  6. My friend Chris had this thought on my post:

    "... I think that I respectfully disagree"

    His underlying rationale:

    "The idea that whoever is 'right' should stick to their guns assumes an external perspective and authority to adjudicate on the 'right' answer. Which may be straightforward enough in a mathematics problem, but isn't the idea of the question to provoke a discussion on the best course of action when it cannot be proved that one side or the other is definitively, authoritatively correct?

    I guess from my perspective, I've seen so much damage done by churches who believe that they are authoritatively, exclusively 'right' and cannot conceive of the possibility of splitting the difference with those who are equally committed to a different point of view."

    You can find Chris here: http://thedockchurch.org/

    ReplyDelete
  7. Dear Chris, there is an assumption that--in the sort of disagreements that philosophers think of as genuine disagreements--the following holds: the world is a particular way, and our judgements about the world are true or false depending on how things really are. With this assumption in play, consider that two parties can (and often do) make contradictory claims about how things are. One utters 'such and such is the case' and the other utters 'such and such is not the case.' In such a disagreement, what determines who is right is simply how the world really is (as opposed to facts about how anyone thinks it is). So, in everyday disagreements in which incompatible claims are made, at least one party really will be wrong.

    So you are right insofar as you suggest that the debate presupposes that at least one perspective in a disagreement is correct. What is not presupposed, though, is that any authority (or any authority's dictates) is itself a truthmaker of judgements. Authorities can disagree with each other, and when they do, who is right is determined not by anyone's judgements but simply by how things really are.

    Now, that said, your comment hints at another, very different and interesting problem: how to approach disagreements on topics that (unlike mathematics, for instance) are not easily provable. Are judgements about right and wrong provable? Here there is substantial debate among philosophers. Moral realists will point out that, even though moral judgements about which we might disagree (e.g. 'Assisted suicide is wrong') can't be proven through the sort of empirical premises we appeal to in proving claims in natural science, they can nonetheless be true and can be proven to be such. Put simply: morality is an area where a good number of philosophers accept the sort of realist assumption that undergirds humdrum disagreements about everyday particular objects: the assumption that, when two parties disagree in how they represent the way things are, what makes one party have uttered truly and the other falsely is nothing other than how things really are. I should qualify that plenty of philosophers deny that moral disagreements should be pursued with the background assumption that moral claims can be true or false. It's just important to know that this is a live, substantive matter, and that plenty of philosophers would treat moral disagreements no differently from empirical disagreements.

    On a final point: another background assumption that makes epistemic peer disagreement interesting in epistemology is that both sides seek to believe whatever is most reasonable. And in cases of peer disagreement, it is a live question whether (in virtue of the fact of disagreeing with an epistemic peer) holding one's guns or splitting the difference is most rational. Sometimes, though (and I think this might be the case in some instances when churches are unwilling to split the difference, no matter what), the maxim 'believe what is most reasonable' is trumped by other aims, such as 'believe in accordance with a particular dogma.' Dogmatists might well have non-epistemic reasons for not splitting the difference. But the reasons with which the philosophical problem I originally sketched is primarily concerned are strictly reasons one has insofar as one is motivated to believe rationally (given the evidence).

    ReplyDelete