Revision as of 10:09, 21 March 2024

I could go on and on about the failings of Shakespeare ... but really I shouldn’t need to: the Bayesian priors are pretty damning. About half of the people born since 1600 have been born in the past 100 years, but it gets much worse than that. When Shakespeare wrote, almost all of Europeans were busy farming, and very few people attended university; few people were even literate—probably as low as about ten million people. By contrast, there are now upwards of a billion literate people in the Western sphere. What are the odds that the greatest writer would have been born in 1564?

Chauncey Gardiner’s “sophomore college blog”, quoted in Michael Lewis’ Going Infinite

You ever seen the dude from FTX? The one that went to prison? That dude shouldn’t be talking about Shakespeare.

—Mike Tyson

Bayesian prior
beɪzˈiːən ˈpraɪə (n.)
A way to incorporate existing knowledge or beliefs about a parameter into statistical analysis. For example, if you believe that

(a) all playwrights can be objectively ranked according to independent, observable criteria;
(b) the quality of those playwrights in a given sample will be normally distributed;

and you think the best way of assessing the quality of dramas is by statistical analysis, then

(i) you have already made several category errors, should not be talking about art, and if you are, no-one should be listening; but
(ii) if, nonetheless, you are, and they are, and you are trying to estimate the statistical likelihood of a specific Elizabethan playwright being the best in history, then your knowledge that there were vastly fewer playwrights active in the Elizabethan period than have existed in all of history until now — which is a Bayesian prior distribution — might help you conclude that the odds of that Elizabethan playwright really being the best are vanishingly low.

At the same time, everyone else will conclude that you have no idea about aesthetics and a fairly shaky grasp even of Bayesian statistics.

The Monty Hall problem

The neatest illustration of how Bayesian priors are meant to work is the “Monty Hall” problem, named for the late host of the gameshow Let's Make a Deal and famously articulated in a letter to Parade Magazine as follows:

Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?

If you have not encountered the problem before, the intuitive answer is to say no: each door carried an equal probability, 1/3, of containing the car before the host opened a door, and each of the two remaining doors carries an equal probability, 1/2, afterward. To be sure, the odds are better now than they were, but one should still be indifferent as to whether to switch.

Bayes says otherwise: since the host will never open the door you chose, nor the door concealing the car, this new information tells you something about the remaining choice. Your original choice keeps its original odds of 1/3; the odds as between the other two doors change, from 1/3 each to 0/3 for the open door, which definitely doesn't hold the car, and 2/3 for the closed one, which still might. So you should switch doors. You exchange a 2/3 risk of being wrong for a 1/3 risk of being wrong.

This proposal outrages some people at first, apparently even statisticians, but it is true. It becomes more intuitive if you adjust the thought experiment so there are one thousand doors, not three, and after your 1/1000 choice the host opens 998 of the other doors to reveal goats and leaves one shut. Now would you switch? Clearly, the other door now accounts for 999/1000 of the original options.

Or you could just experiment.
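If a few thousand games of Let's Make a Deal is beyond your patience, you can simulate them. A minimal sketch in Python (the function names are ours, for illustration):

```python
import random

def monty_hall(doors=3, switch=True):
    """Play one round; return True if the contestant wins the car."""
    car = random.randrange(doors)
    pick = random.randrange(doors)
    # The host, who knows where the car is, opens every door but yours and
    # one other, always revealing goats. The door he leaves shut is the
    # car's door, unless you already picked it, in which case it is any
    # other door, chosen at random.
    left_shut = car if pick != car else random.choice(
        [d for d in range(doors) if d != pick])
    return (left_shut if switch else pick) == car

def win_rate(trials, doors=3, switch=True):
    """Fraction of rounds won over many independent games."""
    return sum(monty_hall(doors, switch) for _ in range(trials)) / trials
```

Over, say, 100,000 games, switching wins about two-thirds of the time with three doors and about 99.9% of the time with a thousand; sticking wins about one-third.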

Bayesian probabilities are probabilities

Now all of this is well and good, and unimpeachable, if the nomological conditions for probabilities hold. There needs to be a static, finite sample space (1,000 doors) and a finite, known number of discrete outcomes (goat or car). It also works for coins, dice, cards and games of chance. These are simple systems, easily reduced to nomological machines.

The doomsday problem

Bayesian probabilities are a clever way of deducing, a priori, that we are all screwed. If you find yourself at or near the beginning of something, such as Civilisation, a Bayesian model will tell you it will almost certainly end soon.

It works on elementary probability and can be illustrated simply.

Imagine there are two opaque barrels. One contains ten pool balls and the other contains ten thousand, in each case sequentially numbered from 1. You cannot tell which barrel is which.

A magician draws a ball with a seven on it from one barrel.

What are the odds that this came from the barrel with just ten balls?

Naive probability says that since both barrels contain a 7 ball, it is 50:50. Bayesian probability takes account of the additional fact we know about each barrel: the odds of drawing a seven from one barrel are 1 in 10, and from the other 1 in 10,000, and concludes it is 1,000 times more likely that the 7 came from the barrel with just ten balls.

The proof of this intuition: had you drawn ball 235, there would have been no chance at all it came from the ten-ball barrel.
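The arithmetic behind that thousand-to-one figure is just Bayes' rule. A minimal sketch in Python (the function name is ours; exact fractions avoid rounding):

```python
from fractions import Fraction

def posterior_small_barrel(ball, small=10, large=10_000):
    """P(draw came from the small barrel | we drew `ball`), 50:50 prior."""
    prior = Fraction(1, 2)  # we cannot tell the barrels apart
    # Likelihood of drawing `ball` from each barrel: 1/n if the barrel
    # contains that ball at all, otherwise zero.
    like_small = Fraction(1, small) if ball <= small else Fraction(0)
    like_large = Fraction(1, large) if ball <= large else Fraction(0)
    numerator = prior * like_small
    return numerator / (numerator + prior * like_large)

print(posterior_small_barrel(7))    # 1000/1001: ~1,000 times likelier
print(posterior_small_barrel(235))  # 0: the ten-ball barrel has no ball 235
```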

This reasoning is, obviously, sound: it is the same logic as the three-door problem above.

How do we get from this to the imminence of the apocalypse?

Well, the start of your life is, across the cosmic stretch of human existence, like a random draw from a barrel of balls, each engraved with a sequentially numbered birth year.

Now imagine an array of a million hypothetical barrels containing balls engraved with sequentially numbered years, beginning at the dawn of civilisation which, for argument's sake, we shall call the fall of Troy.

The first barrel has just one ball, numbered 1; the next has two, numbered 1 and 2; and so on, up to the millionth barrel, whose balls run to one million years after the fall of Troy.

Let's say your birth year was the 6001st after Troy. What are the odds that your birth year would be drawn at random from each of the million barrels? We know the odds for the first 6,000 barrels: zero. None of them contains a ball 6001. Across the remaining 994,000 the probabilities fall from 1/6001 to 1/1,000,000. Using the same principle as above, we can see that the probability is clustered somewhere nearer the “short end” (near 6001) than the “long end” (1,000,000).

If we assume your birthdate is drawn randomly from all the birthdates available to you, then this sort of implies everything is likely to go belly up sooner rather than later.

This is rather like a malign inversion of the Lindy effect.


Assessing the probability that your ball came from a given barrel is somewhat complicated, but clearly we can rule out barrels 1 to 6,000, and the higher your birth year, the more likely it is that your ball came from a larger barrel.
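That clustering can be computed directly. A sketch, assuming a uniform prior over the million barrels (the function name and the choice of prior are ours, for illustration):

```python
def doomsday_median(birth=6001, barrels=1_000_000):
    """Barrel at which half the posterior probability has accumulated."""
    # Likelihood of drawing ball `birth` from barrel N is 1/N when
    # N >= birth, and zero otherwise. With a uniform prior over barrels,
    # the posterior is just the normalised likelihood.
    weights = [1.0 / n for n in range(birth, barrels + 1)]
    half = sum(weights) / 2
    mass = 0.0
    for i, w in enumerate(weights):
        mass += w
        if mass >= half:
            return birth + i

print(doomsday_median())  # lands in the tens of thousands
```

Half the posterior mass sits below roughly barrel 77,000: far nearer 6001 than 1,000,000, or, in the doomsday reading, “sooner rather than later” indeed.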


See also