Template:M intro systems financialisation: Difference between revisions



In this way, our own model determines the types of biases we see as much as the data. (We will never know if recreational cricketers, left-handers, introverts, or people who live more than twenty kilometres away are discriminated against because this data is not gathered.)
====In data we trust====
{{Quote|“In God we trust. All others must bring data.”}}{{Drop|T|he fatuous “truism”}}, allegedly but not actually coined by [[W. Edwards Deming]], brings this whole thing to a head.
Firstly, as has been observed above, data tells you a limited, and coloured, ''story'' about the past, not a comprehensive picture of the future.
Secondly, that word: “trust”. It is the ''sine qua non'' of commercial enterprise — of ''society''. Game theorists have even proven this out, in the limited case of players who do not know when their next interaction will come, or whether there will be one at all. Trust is a dependency on an unknown ''future''. Data plays no part in it.
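The game theorists’ point can be sketched in code. Below is a minimal, illustrative simulation of an iterated prisoner’s dilemma with an unknown horizon: after each round the game continues only with some probability, so neither player knows whether there will be a next interaction. The payoff values and strategy names are assumptions for illustration, not anything from the original text.

```python
import random

# Illustrative prisoner's-dilemma payoffs for (my move, their move):
# "C" = cooperate, "D" = defect.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(history):
    """Cooperate first, then mirror the opponent's previous move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    """Take the short-term sugar rush of defection every round."""
    return "D"

def play(strategy_a, strategy_b, continue_prob, rng):
    """Iterated game with an unknown horizon: after every round the game
    continues with probability continue_prob, so the number of future
    interactions is never known in advance."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    while True:
        a, b = strategy_a(hist_a), strategy_b(hist_b)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append((a, b))
        hist_b.append((b, a))
        if rng.random() > continue_prob:
            return score_a, score_b

# When future interactions are likely (continue_prob high), sustained
# mutual cooperation tends to outscore defecting against a cooperator.
rng = random.Random(0)
trials = 2000
coop_total = sum(play(tit_for_tat, tit_for_tat, 0.9, rng)[0] for _ in range(trials))
defect_total = sum(play(always_defect, tit_for_tat, 0.9, rng)[0] for _ in range(trials))
```

With a 90% chance of another round, the cooperating pair earns 3 points every round, while the defector grabs 5 once and then 1 per round thereafter, so over enough trials cooperation wins — which is the essay’s point about the longer-term benefits of repeated cooperation.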
But part of the assessment you must make in a game is of the state of your opponent’s mind. You must assess whether she understands the rules, whether she recognises the longer-term benefits of repeated cooperation over the short-term sugar rush of defection, and whether she believes there will be future interactions. You must, that is, ''trust'' your opponent. That is a delicate assessment. It requires knowledge of history — it might seem that data can help there, but the important knowledge is informal: the depth of your relationship, your interconnections and mutual dependencies, your shared history, shared values and that ineffable assessment of ''whether this is someone I can trust''.
No large language model can do that. As information technology always does where humans react in ineffably human ways, an algorithm must run some kind of proxy calculation of that assessment. At scale you might make a numerical assessment that, say, seventy per cent of transactors will honour their bargain, and elect to go on, but this is gameable.
Computers cannot trust. They don't have


====There are no straight lines in nature====
