Bayesian prior

:—Mike Tyson
}}
{{dpn|beɪzˈiːən ˈpraɪə|n|}}A way to incorporate existing knowledge or beliefs about a parameter into statistical analysis.
For example, if you believe that:
{{L3}}All playwrights can be objectively ranked according to independent, observable criteria; <li>
The quality of those playwrights in a given sample will be normally distributed; and <li>
The best way of assessing the quality of dramas is by statistical analysis </ol>
Then:
{{L3}}You have already made several category errors, should not be talking about art, and if you are, no-one should be listening; but <li>
If nonetheless you still are, and they still are, and you are trying to estimate the statistical likelihood of a specific Elizabethan playwright being the best in history, then your knowledge that there were vastly fewer playwrights active in the Elizabethan period than have existed in all of history until now — which is a Bayesian “prior distribution” — might help you conclude that the odds of that Elizabethan playwright really being the best are vanishingly low.</ol>
 
At the same time, everyone else will conclude that you have no idea about literature and a shaky grasp even of Bayesian statistics.
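For what it is worth, the arithmetic behind “vanishingly low” is not complicated. A minimal sketch, with invented placeholder counts standing in for whatever census of playwrights you imagine you have:
<syntaxhighlight lang="python">
# Illustrative only: both counts are invented placeholders, not real figures.
elizabethan_playwrights = 200        # assumed number active c. 1558-1603
playwrights_ever = 2_000_000         # assumed number in all of recorded history

# A flat prior says any playwright is as likely as any other to be "the best",
# so the prior probability that the best one in history is Elizabethan is just:
prior = elizabethan_playwrights / playwrights_ever
print(f"Prior probability the best playwright ever is Elizabethan: {prior:.4%}")
# -> 0.0100%
</syntaxhighlight>
The prior does all the work before a single play has been read, which is rather the point.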
 
{{Drop|B|ayesian statistics have}}, in our dystopian techno-determinist age, a lot to answer for.
 
They start from a surprising proof of how the odds work in a game of chance — one that will help you choose wisely between goats and cars — but once they depart the tightly controlled conditions of that statistical experiment, they are easily misapplied and may get badly lost in evaluating Shakespeare, our debt to distant future generations, and the onrushing [[apocalypse]], courtesy of which, they seem to say, there won’t ''be'' any distant future generations to worry about anyway.
====Goats and sportscars====
{{Drop|T|he neatest illustration}} of how Bayesian priors are meant to work is the “Monty Hall” problem, named for the late Monty Hall, host of the gameshow ''Let's Make a Deal'', and famously articulated in a letter to ''Parade'' magazine as follows:
{{quote|
Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?}}
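The answer, famously, is yes: switching wins two times in three, because your first pick was probably wrong, and the host's reveal changes nothing about your own door while telling you a great deal about the other one. If you distrust the algebra, a short simulation (a rough sketch in Python, not part of the original puzzle) makes the same point empirically, for three doors or a thousand:
<syntaxhighlight lang="python">
import random

def play(switch: bool, doors: int = 3) -> bool:
    """One round of Monty Hall; returns True if the contestant wins the car."""
    car = random.randrange(doors)
    pick = random.randrange(doors)
    # The host opens every door except the contestant's pick and one other,
    # never revealing the car: the single door he leaves shut is therefore
    # the car, unless the contestant happened to pick the car first time.
    if pick == car:
        left_shut = random.choice([d for d in range(doors) if d != pick])
    else:
        left_shut = car
    return (left_shut if switch else pick) == car

trials = 100_000
for doors in (3, 1000):
    stick = sum(play(False, doors) for _ in range(trials)) / trials
    swap = sum(play(True, doors) for _ in range(trials)) / trials
    print(f"{doors} doors: stick wins {stick:.3f}, switch wins {swap:.3f}")
# Expect roughly 0.333 vs 0.667 with three doors, 0.001 vs 0.999 with a thousand.
</syntaxhighlight>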
Scale the game up to a thousand doors: you pick one, and the host opens 998 of the others, every one a goat. Here you know you were almost certainly wrong the first time, so when every possible wrong answer but one is revealed to you, it stands more obviously to reason that the remaining door, which accounts for 999/1000 of the original options, is the one holding the car.
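Put in Bayesian terms (a sketch of the standard working for the thousand-door version, not part of the original text): the host can always open 998 goat doors whatever you picked, so his doing so tells you nothing new about your own door, and the discarded probability all piles up on the one door he pointedly left shut:

<math>P(\text{car behind your door} \mid \text{998 goats shown}) = \frac{P(\text{998 goats shown} \mid \text{car behind your door})\,P(\text{car behind your door})}{P(\text{998 goats shown})} = \frac{1 \times \frac{1}{1000}}{1} = \frac{1}{1000},</math>

leaving <math>\frac{999}{1000}</math> for the one door the host did not open.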


Lesson: use what you already know about history, and your place in it, to update your choices. This ought not to be such a revelation. Count cards. Update your predictions and become a “super forecaster”.
 
====Bayesian probabilities are models ====
{{Drop|N|ow, all of}} this is well and good and unimpeachable if the [[nomological machine|conditions]] in which probabilities hold are present: a static, finite “sample space” — 3, 10 or 1000 doors — a finite and known number of discrete outcomes — goat or car — and a lack of intervening causes like moral (immoral?) agents who can capriciously affect the random outcomes.
 
It works well for carefully controlled games of chance involving flipped coins, thrown dice, randomly drawn playing cards and, of course, ''Let's Make a Deal''. They are all simple systems, easily reduced to “[[nomological machine]]s”.
 
When you apply it to unbounded complex systems involving, well, people, it works less well.
 
====The doomsday problem ====
Bayesian probabilities, if misused, can lead statistics professors to the [[a priori]] deduction that we are all screwed.  
 
{{Quote|
{{D|A priori||adj|}}
Following logically from existing premises. Necessarily so. Not dependent on observation or falsifiable evidence.}}
 
Where it is not possible to gather the necessary evidence, philosophers have a weakness for ''[[a priori]]'' arguments. They are prevalent in metaphysical enquiries: Pascal’s wager, ''cogito, ergo sum'', the argument from design. Any argument based purely on probabilities is ''a priori'': the general principle is extrapolated to predict a specific factual answer.
If you find yourself at or near the beginning of something, such as Civilisation, a Bayesian model will tell you it will almost certainly end soon.


It works on elementary probability and can be illustrated simply.
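By way of example, here is how the sum is usually run (a sketch of the standard “doomsday argument” arithmetic, which may not be the exact illustration the article has in mind; the birth count is an assumed round figure):
<syntaxhighlight lang="python">
# The standard "doomsday argument" sum, sketched.
# Premise: treat yourself as a uniformly random draw from everyone who
# will ever be born. Then, with 95% confidence, you are not among the
# first 5% of all humans, which caps how many can come after you.

humans_born_so_far = 60_000_000_000   # assumed round figure; estimates vary widely
confidence = 0.95

# Not being in the first (1 - confidence) fraction of all births means the
# total number of births N satisfies humans_born_so_far / N >= 1 - confidence.
max_total_births = humans_born_so_far / (1 - confidence)
max_future_births = max_total_births - humans_born_so_far

print(f"With {confidence:.0%} confidence, no more than "
      f"{max_future_births:,.0f} humans are still to come.")
# About 1.14 trillion: generous-sounding, until you notice it is a hard
# ceiling produced by a calculation that never looked at the world at all.
</syntaxhighlight>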