Ron Barnette's Zeno's Coffeehouse Challenge #64 Result


Over 150 Zeno's Coffeehouse patrons dropped by the Coffeehouse and responded to this latest Challenge! We hope to engage even more visitors who would like to participate. I've included below the original challenge, followed by examples of thoughtful replies.

Thanks for your support of Zeno's Coffeehouse! 

Ron Barnette

Conflicting Belief-Detector??


One night recently, Charles and Maggie encouraged their Zeno's patrons to "think out of the box," in hopes of generating a fun thought experiment for a new Zeno's Challenge, as they enjoyed a fruitful evening of conversation and debate...so important at the Coffeehouse. Chris, a loyal and intelligent Coffeehouse patron with a keen interest in the philosophy of mind and in legal theory, as we learned, was quick to respond: he had been reading about a puzzling challenge which he wanted to relate for the evening's discussion. Here's Chris' example, which is the basis for our latest Zeno's Challenge:

Let us suppose that at some point in the future a complete theory of the mind has been developed---one which explains and identifies all matters of belief and emotion with electro-chemical properties of the brain, once a brain is analyzed accordingly. A computational device based on this theory has been built in order to detect all of one's beliefs objectively. To use this device, a human subject is connected to electrical sensors which map precisely the relevant brain states, and then watches a computer monitor as information is displayed. A statement will flash on the screen, and the subject will be asked to think about the statement and decide whether he or she believes the sentence to be true.

If the device detects that the subject's brain is in a state of belief, then it signals this by sounding an alarm. If the device detects disbelief, then it does not sound an alarm.

Now imagine this: you are connected up to such a 'belief detector' and are concentrating dutifully on the monitor. The following statement appears on the monitor:

"The alarm will not sound."

Would you believe it, or not?

If you believe it, then the belief detector would cause the alarm to sound, making the sentence false. So, to avoid believing anything known to be false, you can't believe the sentence. But if you don't believe it, then the alarm will never ring, and the sentence will be true. Since you can reason that it must be true, doesn't that mean that you would believe it?
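For readers who like to see the structure laid out explicitly, here is a minimal sketch in Python (an editorial illustration, not part of Chris' original example; the names believes, alarm, and require_accuracy are my own). It simply enumerates the possible states. With only the detector's rule in play there are two stable outcomes: you withhold belief and the sentence comes out true, or you believe it and it comes out false. Add the further assumption that you believe the sentence exactly when it is true, and no consistent state remains---which is the contradiction the challenge gestures at.

    # Sketch of the belief-detector puzzle: brute-force the two propositions.
    #   believes -- the subject ends up believing the statement S
    #   alarm    -- the alarm sounds
    # The statement S on the screen is "The alarm will not sound", so S is
    # true exactly when the alarm does not sound. The detector sounds the
    # alarm exactly when it finds a state of belief.

    from itertools import product

    def consistent(believes, alarm, require_accuracy):
        s_is_true = not alarm              # S says: the alarm will not sound
        detector_ok = (alarm == believes)  # alarm sounds iff belief detected
        if not detector_ok:
            return False
        if require_accuracy:
            # Extra assumption: the subject believes S exactly when S is true
            # (a perfectly rational, perfectly informed believer).
            return believes == s_is_true
        return True

    for require_accuracy in (False, True):
        states = [(b, a) for b, a in product([False, True], repeat=2)
                  if consistent(b, a, require_accuracy)]
        print(f"accuracy assumed: {require_accuracy} -> consistent states: {states}")

Running the sketch prints two consistent states when accuracy is not assumed, and an empty list when it is---echoing the replies below that recommend simply suspending belief.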

Yikes! What about Chris' challenge? Given the situation, what's a reasonable person to do...embrace a contradiction? I don't think so! What would YOU do?

Help with your reflective comments here, please! The Coffeehouse wants to know your thoughts.

Most appreciative as we receive your comments and arguments, 

Ron Barnette

--------------------------------------------------------------------------------------------------

Sample replies are published below, for your reflection and review. Thanks to all respondents for their thoughtful ideas....RB

From Michelle in Florida, USA: comments: People hold false beliefs all the time. The machine is merely indicating whether it is true that someone holds the belief. It doesn't measure the truth of the belief itself.

Now if we had the sentence "this statement is false" that would be the debate I think he's looking for...

From Mark Young in Canada: comments: I'd probably jump out of my skin when the alarm went off (just after reasoning out it's gotta be true :-).

But there's no law that says you have to believe or disbelieve every sentence that passes in front of your eyes.  Just sit there thinking "Well, only time will tell!"  When the words disappear from the screen, you can accept that they were true without setting off the alarm. 

It's important not to think of it as a competition, tho.  If you sit there thinking "I can make it true, just by suspending my belief long enuf..." and watching the timer tick down to zero, there's bound to come a time (just before the clock runs out) when you think "I've done it!" -- at which point you lose! 

In any case, there's no need to accept a contradiction.  You just have to get comfortable with the fact that you are going to get things wrong from time to time.  You've just been coerced into a game of "Heads, I win; tails, you lose" by the guy running the machine.  You can only put off your defeat by getting rid of the coin. 

From Justen Kirchner in Alabama, USA: comments: The alarm will always sound (disregarding emotionally dejecting questions). The mind's binary operator automatically divides things into opposites, which leads to an a priori choice: Do I want the system to succeed or fail? Instantaneously, the mind would go through the conditions for being true and false in order to choose which one fits best, hence triggering the buzzer every time.

From Damian Tetkowski in New York, USA: comments: You are trapped in a paradox. Zeno is challenging the liar's paradox. You can put it as "the alarm will always sound": then you believe it and it will sound. But one who challenges it disbelieves, and it will not sound, despite its saying it will always sound. Even if a truth statement were given, it would not change the quality of the statement. Saying "it is true that the alarm will sound" can be the same as "the alarm will sound".

From Amber Griffleon in Iowa, USA: comments: Well, unless direct doxastic voluntarism is true, such that one can CHOOSE at any moment to believe what they want/think is rational/etc., I wouldn't "DO" anything in any relevant sense of the word. My guess is that the alarm would not sound, given that the machine would very likely not find in me a belief one way or the other about the truth of the above prompt. Besides, any beliefs formed about the above prompt would likely take the form of counterfactuals (e.g., "If I believed p, the alarm would sound, making p false" and "If I were not to believe p, the alarm would not sound, making p true"), and these beliefs are not equivalent to believing or disbelieving p. On the other hand, people appear to believe falsehoods all the time for pragmatic reasons, so if I could convince myself (via self-deceptive means) to disbelieve p, I might be able to get the result I want (namely that the alarm does not sound).

From Miles Kennedy in Ireland: comments: All I have to do is BELIEVE that I don't believe the statement thus causing the alarm to sound and rendering me correct in my belief in my disbelief, classically self-reflexive thought.

From Matthew Baker in England: comments: I imagine it'd flick on and off accordingly as the person's brain switched between the two impossible states. Having said that, I'm not sure anything approaching an 'objective' detection of a person's brain state could ever exist, so the apparent contradiction in the answer could be a result of the impossibility of the thought experiment?

From Brian Crabb in England: comments: I would say that the detector needs to be more clearly described. Is it set to sound only when a belief is present, or whenever disbelief is not present? There has to be some initial stipulation as to how it responds to uncertainty. Once that is stipulated, that is how it would respond in this instance. 

From Guido Moreira in New York, USA: comments: If you enter a state of belief about the statement "the alarm will not sound," the alarm will immediately sound, and your gullibility will be instantly laid bare. 
If you disbelieve it, you might feel so foolish as you sit there and wait in vain for the alarm, that you actually start to believe it won't sound.  This change of heart would, of course, lead to immediate heart-BREAK because the alarm would go off at that very instant.
However, if you devote yourself entirely to a stubborn disbelief in that demonic statement, you will never actually be wrong.  You can disconnect the electrodes and go home smiling smugly at the fact that the alarm always COULD sound in the future, and nobody could ever prove you wrong for disbelieving that it won't.  

Thanks for your thoughtful replies!!!!