Bayesian prior

At the same time, everyone else will conclude that you have no idea about aesthetics and a fairly shaky grasp even of Bayesian statistics.
====The Monty Hall problem ====
The neatest illustration of how Bayesian priors are meant to work is the “Monty Hall” problem, named for the ghost of the gameshow ''Let's Make a Deal'' and famously articulated in a letter to ''Parade'' Magazine as follows:
{{quote|
Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?}}

If you have not seen it before, intuitively you may say, ''no'': each door carried an equal probability before the host opened a door — 1/3 — and each carries an equal probability afterward — 1/2. While the odds are ''better'' now, they’re still even between each remaining door. So it should not matter whether you switch or not.

Bayesian probability shows this intuition to be ''wrong''. The host will never open the door you chose, nor the one concealing the car, so this new information tells you nothing about your original choice, which keeps its original odds of 1/3: you already knew at least one of the other doors (and possibly both of them) didn’t contain the car. It does, however, tell you something about the ''remaining'' choice. The odds as between the other two doors change, from 1/3 each to 0/3 for the open door — we now know it definitely doesn't hold the car — and 2/3 for the closed one, which still might.
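For the formally minded, Bayes' rule grinds out the same answer. Say you picked door No. 1, and write <math>C_i</math> for “the car is behind door <math>i</math>” and <math>H_3</math> for “the host opens door No. 3”. If the car is behind No. 2 the host ''must'' open No. 3; if it is behind your own No. 1 he opens No. 3 half the time (assuming, as the standard reading of the puzzle has it, that he picks at random between your two goats); and if it is behind No. 3 he never opens it at all. Then:

:<math>P(C_2 \mid H_3) = \frac{P(H_3 \mid C_2)\,P(C_2)}{P(H_3)} = \frac{1 \times \frac{1}{3}}{\frac{1}{2} \times \frac{1}{3} + 1 \times \frac{1}{3} + 0 \times \frac{1}{3}} = \frac{2}{3}</math>

while <math>P(C_1 \mid H_3) = \frac{\frac{1}{2} \times \frac{1}{3}}{\frac{1}{2}} = \frac{1}{3}</math>: your own door keeps its original odds, just as advertised.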

So you ''should'' switch doors. You exchange a 1/3 chance of being ''right'' for a 1/3 risk of being wrong.

This proposal outrages some people, at first. Apparently, even statisticians. But it is true. It is easier to see if you imagine instead there are ''one thousand'' doors, not three, and after your first pick the host opens 998 of the other doors, revealing goats behind every one and leaving just one shut.

Here you know you were almost certainly wrong the first time, so when every possible wrong answer but one is revealed to you, it stands more obviously to reason that the other door, which accounts for 999/1000 of the original options, is the one holding the car.
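Or you could just experiment. Here is a minimal simulation sketch in Python (our own, not part of the original puzzle; the trial count is arbitrary): the host opens every door except yours and one other, never revealing the car, and we count how often sticking and switching win.

<syntaxhighlight lang="python">
import random

def play(doors: int, switch: bool) -> bool:
    """One round of Monty Hall with the given number of doors."""
    car = random.randrange(doors)   # the door hiding the car
    pick = random.randrange(doors)  # the contestant's first choice
    # The host opens every other door bar one, never revealing the car:
    # if you picked the car he leaves a random goat door shut;
    # otherwise the one door he leaves shut must be the car's.
    if pick == car:
        left_shut = random.choice([d for d in range(doors) if d != pick])
    else:
        left_shut = car
    final = left_shut if switch else pick
    return final == car

trials = 100_000
for doors in (3, 1000):
    stick = sum(play(doors, False) for _ in range(trials)) / trials
    swap = sum(play(doors, True) for _ in range(trials)) / trials
    print(f"{doors} doors: stick wins {stick:.3f}, switch wins {swap:.3f}")
</syntaxhighlight>

Expect something like 0.333 against 0.667 with three doors, and 0.001 against 0.999 with a thousand.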

====Bayesian probabilities are probabilities ====
Now all of this is well and good and unimpeachable if the [[nomological machine|conditions]] in which probabilities hold are present: a static, finite “sample space” — 1000 doors — a finite and known number of discrete outcomes — goat or car — and a lack of intervening causes like moral (immoral?) agents who can capriciously affect the random outcomes. It works well for carefully controlled games of chance involving flipped coins, thrown dice, randomly drawn playing cards and, of course, ''Let's Make a Deal''. They are all simple systems, easily reduced to “[[nomological machine]]s”.
====The doomsday problem ====
Bayesian probabilities are a clever way of deducing, [[a priori]], that we are all screwed. If you find yourself at or near the beginning of something, such as Civilisation, a Bayesian model will tell you it will almost certainly end soon.
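To see how the sums run, suppose (an assumption on our part, in the spirit of J. Richard Gott's well-known “delta-''t''” version of the argument) that the moment at which you happen to observe the something is uniformly distributed over its total lifetime <math>T</math>, so that the elapsed fraction <math>f = t_{\text{past}}/T</math> is uniform between 0 and 1. Then with 95% confidence <math>f \ge 0.05</math>, and hence:

:<math>t_{\text{future}} = T - t_{\text{past}} = t_{\text{past}}\,\frac{1-f}{f} \le 19\, t_{\text{past}}</math>

Find yourself near the very beginning, with <math>t_{\text{past}}</math> small, and the same arithmetic hands you a correspondingly short future.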