Bayesian prior
They take us from a surprising proof of how the odds work in a game of chance — one that will help you choose wisely between goats and cars — but once the method departs the tightly-controlled conditions of that statistical experiment, it is easily misapplied and may get badly lost in evaluating Shakespeare, our debt to distant future generations, and the onrushing [[apocalypse]], courtesy of which, it seems to say, there won’t ''be'' any distant future generations to worry about anyway.
====Goats and sportscars====
{{Drop|T|he neatest illustration}} of how Bayesian priors are meant to work is the “Monty Hall” problem, named for the ghost of the gameshow ''Let’s Make a Deal'':
{{quote|
A game show invites a contestant to choose a prize by opening one of three doors. One door conceals a sports car, the other two conceal goats. [''Why goats? — Ed'']


Once the contestant has made her choice, the host theatrically opens one of the ''other'' doors to reveal a goat.  


There are now two doors left: one concealing a goat, one concealing the car. Assume the host will never open the door the contestant chose, nor the one concealing the car, and that the car will not move.


Should the contestant stick with her original choice, change to the other door, or should she be indifferent?}}


If you have not seen it before, intuitively you may say, well, each door carried an equal probability at the beginning — 1/3 — and each remaining door carries an equal probability after the reveal — 1/2 — so while the player’s odds are now ''better'', they are the same whether she sticks or twists, and she should be indifferent.
 
Bayesian probability theory shows this intuition to be ''wrong''.
 
The new information tells you nothing about your original choice: you already knew it may or may not contain the car. It does, however, tell you something about the doors you didn’t choose. The odds as between the other two doors change, from 1/3 each to 0/3 for the open door — we now know it definitely doesn’t hold the car — and 2/3 for the closed one, which still might.
 
The probabilities for your remaining options are therefore 1/3 for your original choice and 2/3 for the other remaining door.
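That 1/3-versus-2/3 split is just Bayes’ rule at work, and it can be checked exactly. The sketch below is my own illustration, not part of the article, and it assumes the host chooses uniformly at random when more than one door is open to him:

```python
from fractions import Fraction

# The contestant picks door 1; the host then opens door 3 (a goat).
# Prior: the car is equally likely to be behind any door.
prior = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}

# Likelihood that the host opens door 3, given where the car is:
likelihood = {
    1: Fraction(1, 2),  # car behind the pick: host opens 2 or 3 at random
    2: Fraction(1),     # car behind door 2: door 3 is his only legal move
    3: Fraction(0),     # host never reveals the car
}

evidence = sum(prior[d] * likelihood[d] for d in prior)
posterior = {d: prior[d] * likelihood[d] / evidence for d in prior}

for door in (1, 2, 3):
    print(door, posterior[door])  # door 1: 1/3, door 2: 2/3, door 3: 0
```

The contestant’s own door stays at 1/3; the entire probability freed up by the reveal flows to the other closed door.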
 
So you ''should'' switch doors: switching loses only when your first pick happened to be right, so you exchange a 1/3 chance of being ''right'' for just a 1/3 risk of being ''wrong''. This proposal outrages some people, at first. Apparently, even statisticians. But it is true.
 
It is easier to see if instead there are ''one thousand'' doors, not three, and after your first pick the host opens 998 of the other doors.  


Here you know you were almost certainly wrong the first time, so when every possible wrong answer but one is revealed to you, it stands more obviously to reason that the other door, which accounts for 999/1000 of the original options, is the one holding the car.
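Whether for three doors or a thousand, the switcher’s advantage can also be checked empirically. A minimal Monte Carlo sketch (my own illustration, assuming the standard rules: the host knows where the car is and opens every unchosen door except one, never revealing the car):

```python
import random

def switch_wins(n_doors, rng):
    """One round of the generalized Monty Hall game, played by a
    contestant who always switches. Since the host leaves exactly one
    other door closed and never reveals the car, that remaining door
    hides the car exactly when the first pick was wrong."""
    car = rng.randrange(n_doors)
    pick = rng.randrange(n_doors)
    return pick != car  # switching wins iff the first pick missed

rng = random.Random(0)  # seeded for reproducibility
for n in (3, 1000):
    trials = 100_000
    wins = sum(switch_wins(n, rng) for _ in range(trials))
    print(f"{n} doors: switcher wins {wins / trials:.3f}")  # about 2/3, then about 999/1000
```

The simulation reduces to a single comparison precisely because of the rules assumed above: the host’s forced behaviour funnels all the remaining probability onto the one door he leaves shut.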