What We Owe The Future

{{a|book review|}}It took me a while to put my finger on what was so irritating about this book. To be sure, there’s a patronising glibness about it: it is positively jammed full of the sort of thought experiments (“imagine you had to live the life of every sentient being on the planet” kind of thing) that give [[philosophy]] undergraduates a bad name.


{{Author|William MacAskill}} is, as best as I can make out, barely out of undergraduate [[philosophy]] class himself and hasn’t yet left the university: a thirty-something ethics lecturer should strike everyone but himself as an unlikely source of cosmic advice for the planet’s distant future. So it proves.


But ultimately it is MacAskill’s sub-Harari wiser-than-thou top-down moral counselling that grates: humanity needs to solve the problems of the future centrally and this requires brainy people in the academy, like MacAskill, to do it. And though the solution might be at the great expense of all you stupid nose-breathing oxygen wasters out there, it is for your own, and the future’s, good.


We should sacrifice you lot — birds in the hand — for your far-distant descendants — birds in a bush that may or may not be there in 500m years.
Thanks — but no thanks.


It is not at all clear what anyone can do to influence the unknowable distant future (a meteor could wipe us out any time) — but tricksy probability calculations sure aren’t going to help. Nor does MacAskill ever say why organisms who are around ''now'' should give the merest flying hoot for the future of a species which, if it survives at all, will doubtless have evolved beyond all recognition in 500 million years. How donating a tithe of your income now is supposed to make any difference that far out just beggars belief. Is he ignorant of, or has he just misunderstood, the basic lessons of chaos theory? Neither speaks well of his academic credentials.


{{Quote|Quick side bar: [[Probabilities]] are suitable for closed, bounded systems with a ''complete'' set of ''known'' outcomes. The probability of rolling a six is ⅙ because a die has six equal sides, is equally likely to land on any side, and must land on one of them, and no other outcome is possible. ''This is not how most things in life work''. Probabilities work for [[finite game]]s. ''The future is in no sense a finite game''. It is unbounded, ambiguous and incomplete: the range of possible outcomes is not known and may as well be infinite. ''You can't calculate probabilities about it''. {{Author|Gerd Gigerenzer}} would say it is a situation of ''uncertainty'', not ''risk''. ''Expectation theory is worthless.''}}
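A toy illustration of the sidebar’s point (the example is mine, not the book’s): where the outcome space is known and finite, an expected value is a well-defined sum; where it isn’t, there is nothing to sum over.

:<math>\operatorname{E}[X] = \sum_{x \in \Omega} x \, P(x) = \tfrac{1}{6}(1+2+3+4+5+6) = 3.5</math>

For a fair die, <math>\Omega = \{1, \dots, 6\}</math> and <math>P(x) = \tfrac{1}{6}</math>, so the sum closes. For the deep future neither <math>\Omega</math> nor <math>P</math> is knowable, so the expectation cannot even be written down, let alone maximised.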