Template:M intro design how the laws of data science lie

The [[JC]]’s sense is that a similar thing may be true of data science, only it is less benign.


We tell ourselves that data models can predict our behaviour, that they are unfailingly accurate, and that we should yield to their greater power. We no longer need “thick” human rules of moral principle to moderate our behaviour, because machines can systematically apply infinitesimally thin rules that equably adjudicate on any given particular. This is all the more concerning with the advent of [[neural network]]s and [[large language model]]s that we readily confess we do not understand at all, but we were already there, in our collective obeisance to, for example, the truth of DNA testing, or GPS navigation, or automated self-triage. It ''seems'' plausible; we don’t feel we have good grounds to challenge it, so we defer to it. We suppose spitting in a tube can tell us with certainty that we are 99.4% Scottish, 0.2% North African with a smudge around Scandinavia, less than 4% Neanderthal, and yet with no African heritage at all, despite the fact that every human on the planet is, ultimately, 100% African by origin (''Homo sapiens'' diverged from ''Homo neanderthalensis'' hundreds of thousands of years before any human departed Africa).
 
These thin rules ''lie'': they give us a false comfort in the truth of the things they opine about, the same way science does.<ref>{{author|Nancy Cartwright}}, {{br|How the Laws of Physics Lie}}.</ref> So there aren’t ''really'' 590 calories in that burger: it seems plausible because it is printed on the menu card, but the more permanently it is printed, the less likely it is to be true. There are not really 49.57 km in those directions to the airport, and the DNA tests really don’t know whether you are partly Bulgarian, but you, as a layperson, are none the wiser, so the claim can be made and got away with. It is not independently testable. How would you know? Your implicit trust in untestable propositions begets trust, and from nowhere the [[Data modernism|data modernist]]s have bootstrapped themselves into a kind of credibility.