What We Owe The Future

Revision as of 11:58, 27 November 2022

The Jolly Contrarian’s book review service™

They flutter behind you, your possible pasts:
Some bright-eyed and crazy,
Some frightened and lost.
A warning to anyone still in command
Of their possible future
To take care.

—Roger Waters, Your Possible Pasts

Be careful what you wish for

Per the second law of thermodynamics but pace Pink Floyd, there is but one possible past, one possible now, and an infinite array of possible futures stretching out into an unknown black void. Some short, some long, some dystopian, some enlightened. Some cut off by apocalypse, some fading gently into warm entropic soup.

William MacAskill’s premise is this: barring near-term cataclysm, there are so many more people in our species’ future than in its present that our duty of care to those yet to come swamps our own short-term interests.

We are minding the shop not just for our children and grandchildren but for generations millennia hence. Thousands of millennia hence, in deep time.

But here is his first logical lacuna. The causal chain behind us stretches unbroken back to where records begin. Behind us, the probability of each successive step is 1. As the saying goes, it is easy to be wise in hindsight.

But in front of us, alternate possibilities branch into the infinite. Over a generation or two it is easy enough to anticipate and provide for our own progeny. Darwin’s dangerous algorithm naturally wires us to do this.

But over a million years — the average lifespan of a mammalian species, MacAskill informs us — the gargantuan mass of non-linear interactions between trillions of co-evolving organisms in our hypercomplex ecosystem means our possible futures are so uncertain and disparate that we cannot possibly predict the effect of our actions today.

And, in any case, any decision we make now to prefer one outcome is surely to disfavour an infinity of others. Don’t all these possible futures have an entitlement to consideration? Is this really altruism, or motivated, selfish behaviour? Aren’t we playing God?

William MacAskill is undoubtedly intelligent and well-read, and applies his polymathic range to scoping out this million-year-old argument. He is probably too well-read. You sense it would do him the world of good to put the books down and spend some time pulling pints or labouring on a building site. Get some education from the school of life.

Still, it took me a while to put my finger on what was so irritating about this book. To be sure, there’s a patronising glibness about it: it is positively jammed full of the sort of thought experiments (“imagine you had to live the life of every sentient being on the planet” kind of thing) that give philosophy undergraduates a bad name.

MacAskill is barely out of undergraduate philosophy class himself. He hasn’t yet left the university. A thirty-something meta-ethics lecturer would strike most people (other than himself) as an unlikely source of cosmic advice for the planet’s distant future. So it proves.

But ultimately it is MacAskill’s sub-Harari, wiser-than-thou, top-down moral counselling that grates: humanity needs to solve the problems of the future centrally; this requires brainy people from the academy, like MacAskill, to do it. And though the solution might be at the great expense of all you mouth-breathing oxygen wasters out there, it is for the future’s good.

We should sacrifice you lot — birds in the hand — for your far-distant descendants — birds in a bush that may or may not be there in 500m years.

Thanks — but no thanks.

It is not at all clear what anyone can do to influence the unknowably distant future — a meteor could wipe us out any time — but in any case expected value probability calculations sure aren’t going to help. Nor does MacAskill ever say why organisms who are around now should give the merest flying hoot for a species which, if it survives at all, will have doubtlessly evolved beyond all recognition in 500 million years.

How, or why, donating a tithe of your income now is supposed to benefit the race of pan-dimensional hyperbeings we will have evolved into by then, he does not say. Causality may or may not be true, but forward progress in an open ecosystem of independent agents is non-linear. There is no “if-this-then-that” over five years, let alone fifty, let alone five million. We are in the lap of the gods.

Quick side bar: Probabilities are suitable for closed, bounded systems with a complete set of known outcomes. The probability of rolling a six is ⅙ because a die has six equal sides, is equally likely to land on any side, must land on one, and no other outcome is possible. This is not how most things in life work. Probabilities work for finite games. The future is in no sense a finite game. It is unbounded, ambiguous, incomplete; the range of possible outcomes is not known and may as well be infinite. You can’t calculate probabilities about it. Gerd Gigerenzer would say it is a situation of uncertainty, not risk. Expectation theory is worthless.
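The sidebar’s distinction can be made concrete in a few lines of Python (a hypothetical illustration of the risk/uncertainty point, not anything from the book): an expected value is computable for a die because the outcome set is closed and complete, whereas for an open-ended game — a St. Petersburg-style doubling bet stands in here for an unbounded future — the sample mean never settles on anything, no matter how long you play.

```python
import random

random.seed(42)

# Risk: a die is a closed, bounded system with six known, equally likely
# outcomes. The expected value is a well-defined sum over a complete set.
die_ev = sum(face * (1 / 6) for face in range(1, 7))
assert abs(die_ev - 3.5) < 1e-9  # the classic 3.5

# Uncertainty (stand-in): a St. Petersburg-style game -- the payout doubles
# on every "head" until the first "tail" -- has no finite expected value.
def st_petersburg() -> int:
    payout = 1
    while random.random() < 0.5:
        payout *= 2
    return payout

# Running averages drift about and creep upward instead of converging,
# so the "expected value" tells you nothing useful about any one play.
for n in (1_000, 100_000):
    mean = sum(st_petersburg() for _ in range(n)) / n
    print(f"mean payout over {n} plays: {mean:.2f}")
```

The die is Gigerenzer’s “risk”; the doubling game is the nearest tractable sketch of his “uncertainty” — and the far future is worse still, since there the outcome set itself is unknown.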

This demolishes MacAskill’s foundational premise — applied “expectation theory” is how he draws his conclusions about the plight of the Morlocks of our future — and is enough to trash the book’s thesis in toto.

Does this self-sacrifice for the hereafter also apply to non-sapient beasts, fish and fowls, too? Bushes and trees? If not, why not?

If homo sapiens really is as hopeless a case as MacAskill thinks, who is to say it can redeem itself millennia into the future? What makes MacAskill think future us deserves the chance that present us is blowing so badly? Perhaps it would be better for everyone else (said saintly beasts, fish, fowls, bushes and trees) if we just winked out now.

MacAskill’s loopy futurism appeals to the Silicon Valley demi-god types who have a weakness for Wagnerian psychodrama and glib a priori sci-fi futurism. To MacAskill’s chagrin, deluded fantasist Sam Bankman-Fried is a fan and supporter, and seems to have “altruistically” given away a large portion of his investors’ money to the cause. I wonder what the expected value of that outcome was. You perhaps shouldn’t judge a book by the company it keeps on bookshelves, but still.

If you want sensible and thoughtful writing about the planet and its long-term future, try Stewart Brand, Brian Eno and the good folk of the Long Now Foundation. Give this hokum the swerve.

See also