Here’s something interesting about making decisions – at the heart of the decisions we make, there’s some level of irrationality going on.
The best way to illustrate this is by using the example of Newcomb’s Paradox.
Imagine this scenario: there are two boxes, one transparent and one opaque. You must choose either (1) the opaque box only, OR (2) both boxes. The transparent box contains $1,000, but you do not know how much money is inside the opaque box. Now, there's a magical person known as The Predictor, who can predict which choice you'll make with 99.9999% accuracy. If he predicts that you'll choose option (1), the opaque box only, then he'll put $1 million in it, and you'll get $1 million. If he predicts that you'll go with option (2) and take both boxes, he'll put nothing inside, and you'll get only $1,000 at the end of the day.
So will you choose to take only the opaque box or would you choose to take both boxes?
Chances are, if you chose to take the opaque box only, you’d think that people who chose to take both boxes are crazy. Since you’ll probably get $1million, why bother with two boxes where it’s highly likely that you’ll get only $1000?
Yet, those who think that it is perfectly rational to choose both boxes will think that it’s crazy to choose only one box and risk losing all of one’s money.
So who’s crazy and who’s rational?
Well, before I say anything more, I think it's very interesting that surveys on this problem have found people roughly evenly split over which option to choose.
So back to the question: who’s crazy and who’s rational?
Well, if you chose to take only the opaque box, chances are you made your decision using the Expected Value principle. The expected value of a choice is calculated by weighting each possible outcome by its probability and summing:
Expected Value = Σ (Outcome Value × Probability of that Outcome)
Using the Expected Value principle, there are four possible scenarios:
- Choose opaque box & $1m put in: get $1m. Term = $1,000,000 × 99.9999% = $999,999
- Choose opaque box & nothing put in: get $0. Term = $0 × 0.0001% = $0
- Choose both boxes & $1m put in: get $1,001,000. Term = $1,001,000 × 0.0001% ≈ $1
- Choose both boxes & nothing put in: get $1,000. Term = $1,000 × 99.9999% ≈ $1,000
Summing the terms for each choice: taking only the opaque box has an expected value of about $999,999, while taking both boxes has an expected value of only about $1,001. So it makes sense, from this point of view, that one ought to choose only the opaque box. This decision is well justified.
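The arithmetic above can be made concrete with a small Python sketch (the 99.9999% predictor accuracy is the illustrative figure used here; the exact value doesn't change which choice wins):

```python
# Expected value of each choice in Newcomb's Paradox, assuming the
# Predictor is right with probability p = 0.999999 (illustrative figure).
p = 0.999999          # probability the Predictor's prediction is correct
MILLION = 1_000_000
THOUSAND = 1_000

# One-boxing: with probability p the Predictor foresaw it and put $1m in
# the opaque box; otherwise the opaque box is empty.
ev_one_box = p * MILLION + (1 - p) * 0

# Two-boxing: with probability p the Predictor foresaw it and left the
# opaque box empty (you keep only the $1,000); otherwise you get both.
ev_two_box = p * THOUSAND + (1 - p) * (MILLION + THOUSAND)

print(f"EV(one box):   ${ev_one_box:,.2f}")   # ≈ $999,999
print(f"EV(two boxes): ${ev_two_box:,.2f}")   # ≈ $1,001
```

However you tweak the accuracy figure, as long as the Predictor is reliable, one-boxing comes out far ahead on expected value.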
Yet those who choose both boxes have a well-justified decision too. They are operating on the Dominance principle, or what I like to call the kiasu (Singlish for 'scared to lose') principle. The Dominance principle says: prefer the choice that does at least as well in every possible case, regardless of how likely each case is.
There are four possible scenarios that we can construct:
- Choose both boxes & nothing put in: get $1,000
- Choose both boxes & $1m put in: get $1,001,000
- Choose opaque box only & nothing put in: get $0, i.e. forgo the guaranteed $1,000
- Choose opaque box only & $1m put in: get $1,000,000
(Notice that the probabilities play no role here: the Dominance principle compares the two choices within each possible case.)
In which case, choosing both boxes yields the better result in every case:
- If $1m was put in: both boxes give $1,001,000 > opaque box only gives $1,000,000
- If nothing was put in: both boxes give $1,000 > opaque box only gives $0
From this perspective, it makes sense to choose two boxes instead of one.
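The dominance comparison can likewise be sketched in Python; the payoff table mirrors the four scenarios above (the state labels are just illustrative names):

```python
# Dominance check: compare the two choices within each state of the world
# (what the Predictor actually put in the opaque box), ignoring probabilities.
MILLION = 1_000_000
THOUSAND = 1_000

# payoffs[choice][state]: what you walk away with in that scenario
payoffs = {
    "one box":   {"$1m in opaque box": MILLION,            "opaque box empty": 0},
    "two boxes": {"$1m in opaque box": MILLION + THOUSAND, "opaque box empty": THOUSAND},
}

for state in payoffs["one box"]:
    one, two = payoffs["one box"][state], payoffs["two boxes"][state]
    print(f"{state}: two boxes ${two:,} vs one box ${one:,}")

# Two-boxing dominates one-boxing iff it pays at least as well in every
# state, and strictly better in at least one.
states = payoffs["one box"]
dominates = (
    all(payoffs["two boxes"][s] >= payoffs["one box"][s] for s in states)
    and any(payoffs["two boxes"][s] > payoffs["one box"][s] for s in states)
)
print("Two-boxing dominates:", dominates)  # True
```

Because two-boxing is better in both states, the dominance reasoner never even needs to know how accurate the Predictor is.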
Newcomb's Paradox is a paradox because, in the same situation, people arrive at two different solutions, each justified by a different principle.
But what justifies the use of one principle? Or what justifies choosing one principle over the other?
At this level, reason begins to break down. The two principles give weight to two completely different things: one values the highest expected payoff, while the other values doing at least as well in every foreseeable case.
But how do you justify which should be given greater weight? In most cases, attempts to justify one value over another already assumes the superiority (or inferiority) of one value. So that wouldn’t be a fair assessment. For example, if I say that it’s better to favour the best outcome in all possible cases rather than the expected value because I will always get some money at the end regardless of the outcome, then I’m implicitly already giving greater value to ‘the best outcome in all possible cases’.
For this reason, a sincerely neutral assessment cannot answer which principle to choose. That may be why people are so evenly divided over this issue: we can't rationally justify why we ought to value one thing over the other.
This is where some level of “irrationality” comes in. We choose one principle over the other for non-rational reasons – gut feeling, our upbringing, our cultural/intellectual context, etc. These things shape us to develop a positive bias for one, and a negative bias for the other.
On the broader picture, we all value different things for different reasons. And even if we can agree that we value a set of things, people rank the things they value differently. I value quiet walks, while others may value loud parties. I value philosophical discussions more than discussions on sports; while others may value discussions on sports more than philosophical discussions.
We may judge others as worse than us for valuing some things more than we do. But, as I mentioned earlier, such judgements are unfair, because we are judging them from the perspective of our own order of values. And there really is no way for us to justify why one thing should be valued more than another without already assuming the superiority of one value. Why? Because some level of irrationality operating in the background makes us favour one value over another.
So, the next time someone seems to have made an apparently irrational decision, yet is able to rationally justify it, even if you still disagree with it, remember: we make our decisions while placing value on different things, and because of non-rational factors, there is really no way of properly justifying why we choose to value one thing over another.