What We Owe The Future

For, per the [[entropy|second law of thermodynamics]] — ''pace'' dear old Roger Waters — there is just ''one'' possible past, ''one'' possible now, and an ''infinite'' array of possible futures. They stretch out into an unknown black void. Some are short, some long, some dystopian, some enlightened. Some will be cut off by apocalypse, some will fade gently into warm [[Entropy|entropic]] soup.

It is as if MacAskill has got this perfectly backward. He talks about the present as if we are at some single crossroads: a one-time determining fork in the history of the planet where, by our present course of action, we can steer it conclusively this way or that, and as if we have the wherewithal (or even the necessary information) to understand all the dynamics, all the second, third, fourth ... nth-order consequences to deliver a future appropriate for the organisms we expect to be. MacAskill appears under the illusion that [[Butterfly effect|we Amazonian butterflies have the gift to avert future Filipino hurricanes]].

This is absurd. Literally countless determining forks happen every day, everywhere. Most of them are entirely beyond our control. ''Some'' future is assured. What it is, and who will enjoy it, is literally impossible to know. This [[uncertainty]] is a profoundly important engine of our non-zero-sum existence.

=== Expected value theory does not help ===
MacAskill uses [[probability]] theory (again: too many books, not enough common sense) and what financiers might call “linear interpolation” to deduce, from what has already happened in the world, a theory about what will happen, and what we should therefore do to accommodate the forthcoming throng.

But [[probabilities]] are suitable for closed, bounded systems with a ''complete'' set of ''known'' outcomes. The probability when rolling a die is ⅙ because it has six equal sides, is equally likely to land on any side, must land on one, and no other outcome is possible. This is an artificial, tight, closed system. We can only calculate an expected value ''because'' of this artificially constrained set of outcomes. Probabilities only work for such [[finite game]]s.
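To make the point concrete, here is a schoolbook illustration (ours, not MacAskill’s): the expected value of a single roll of a fair die can be computed only because all six outcomes, and their probabilities, are known in advance.

<math>\operatorname{E}[X] = \sum_{i=1}^{6} i \cdot \tfrac{1}{6} = \tfrac{1 + 2 + 3 + 4 + 5 + 6}{6} = 3.5</math>

Take away the closed, known set of outcomes and there is nothing for the sum to range over.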


''Almost nothing in everyday life works like that''.<ref>Ironically, not even dice: even a carefully machined die will not have exactly even sides and may fall off the table, or land crookedly, or fracture on landing!</ref>

MacAskill came to his thesis courtesy of the thought experiment mentioned above: imagine living the life of every being that has inhabited the planet from Mitochondrial Eve up to the present day. The exercise is meant to illustrate our own personal contingency and microscopic insignificance in the Grand Scheme. There are a paltry eight billion of us; ten times that have gone before, and a thousand times that are — if we don’t bugger everything up — yet to come.

This is the human condition: despite our mortal insignificance we are here, they are not. This is MacAskill’s Big Idea: ''we'', lowly ants though we are, are disproportionately empowered to determine ''their'' future.

The idea chimes for a moment and then falls apart. For it is to see our ''present'' existence as no more than the task of cranking the ''right'' handle on the cosmic machine, to vouchsafe a calculable outcome for someone else. We are but set-builders, moving quietly about a dark theatre. As long as we do as bidden, on time, all will be well and the performers will shine. Our role is barely worth a mention in the final credits.

But we are not Sisyphus. We have our own [[lived experience]]s to think about. It does not follow, ''[[a priori]]'', that we are bound to practise forbearance for the sake of generations unimagined.

Indeed, ''[[a priori]]'', it presents a [[paradox]]: for every step we take, the future keeps retreating. Who gets to be the players to shine upon our set? As each generation rolls around, won’t the dismal calculus that applied to us be just the same for them? Who gets to enjoy all this self-restraint? Isn’t each generation, relatively, just as unimportant as the last?

This idea of [[iteration]] should give a clue: the future does not depend on one collective decision a generation makes now, but upon an impossibly complex array of micro-decisions, made by individuals and groups, every moment throughout [[space-time]].

This is as misconceived as is [[Richard Dawkins]]’ idea that a fielder does, or even ''could'', functionally [[Epistemic priority|calculate differential equations to catch a ball]]. The thought experiment betrays an unflinchingly [[deterministic]] world-view: the universe is a clockwork machine to be set and configured. Take readings, perform calculations, twiddle dials, progress to the designated place, hold out your hand at the appointed time and the ball will drop into it.

We don’t imagine [[Richard Dawkins|Professor Dawkins]] was much good at [[cricket]].

But over millions of years — “the average lifespan of a mammalian species”, we are confidently told — the sheer volume of chaotic interactions between the co-[[Evolve|evolving]] organisms, mechanisms, [[Systems theory|systems]] and [[Algorithm|algorithms]] that comprise our [[Complexity|hypercomplex]] ecosystem means literally ''anything'' could happen. There are ''squillions'' of possible futures. Each has its own unique set of putative inheritors.

So how do we know ''to whom'' we owe a duty? What would that duty be? How on earth would we frame it? Don’t we owe them ''all'' a duty? Doesn’t action to promote the interests of ''one'' branch consign infinitely more to oblivion?

In any case, who are we to play such cosmic dice? With what criteria? By reference to whose morality — ours, or theirs? If they are anything like our children, they will be revolted by our values, but we can’t even begin to guess what their values will be. So, an uncomfortable regression, through storeys of turtles and elephants, beckons. This is just the sort of thing ethics professors like, of course.

And [[There’s the rub|here is the rub]]: like Amazonian [[Butterfly effect|butterflies]] causing typhoons in Manila, ''anything'' and ''everything'' ''anyone'' does infinitesimally and ineffably alters the calculus, re-routing evolutionary design forks and making this outcome or that more likely. Decisions that prefer one outcome surely disfavour an infinity of others.

If you take causal regularities for granted, all you need to be wise in hindsight is ''enough [[data]]''. In this story, the [[Causation|causal chain]] behind us is unbroken back to where records begin — the probability of an event happening when it has already happened is ''one hundred percent''; never mind that we’ve had to be quite imaginative in reconstructing it.
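In symbols (our gloss, not the book’s), hindsight trades on the trivial identity <math>\Pr(A \mid A) = 1</math>; foresight, by contrast, needs probabilities over a set of futures that no one can enumerate, let alone weight.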


Another thing: does this self-sacrifice for the hereafter apply to non-sapient beasts, fish and fowls, too? Bushes and trees? Invaders from Mars? If not, why not?

If present ''Homo sapiens'' really is such a hopelessly venal case, who is to say it can redeem itself millennia into the future? What makes MacAskill think future us deserves that chance that present us is blowing so badly? Perhaps it would be better for everyone else — especially said saintly beasts, fish, fowls, bushes and trees — if we just winked out of existence now?
===Brainboxes to the rescue===
But ultimately it is MacAskill’s sub-Harari, wiser-than-thou, top-down moral counselling that grates: humanity needs to solve the problems of the future centrally, and ''now''.

This requires brainy thirty-five-year-olds from the academy, like MacAskill, to do it. And though the solution might be at the great expense of all you mouth-breathing oxygen wasters out there, it is for the future’s good. So suck it up.

But no-one in the past felt the need to solve ''our'' problems: what changed?

Should we really sacrifice you lot — ugly though you may be, you made it here, so you’re birds in the hand — for our far-distant descendants — birds in a bush who may or may not be there in a million years?

Thanks — but no thanks.

Elon Musk is a fan. So, to MacAskill’s chagrin, is deluded crypto fantasist [[Sam Bankman-Fried]]. He seems to have “altruistically” given away a large portion of his investors’ money to the cause. I wonder what the expected value of ''that'' outcome was. You shouldn’t judge a book by the company it keeps on bookshelves, but still.
===See the Long Now Foundation===
If you want sensible and thoughtful writing about the planet and its long-term future, try [[Stewart Brand]], Brian Eno and the good folk of the Long Now Foundation. Give this hokum the swerve.
{{Sa}}
*[[The future]]