Better call ChatGPT

Before long, this process will itself have sedimented into the administrative sludge that weighs your organisation down. Other processes will depend on it. Surgical removal will be ''hard''.
====LLMs and waste====
{{drop|L|LMs can’t function}} by, or think for, themselves (''yet''). Their deployment implies not saved legal cost, but “[[Seven wastes of negotiation|waste]]” transferred: what once was spent fruitlessly on [[legal eagle]]s will instead be diffused among a phalanx of [[software-as-a-service]] providers, procurement personnel, [[internal audit]] boffins, [[operations]] folk and, yes, the dear old [[legal|legal eagles]] who will ''still'' have to handle exceptions, manage and troubleshoot the system, vouch for it, be blamed for it, periodically certify that it is legally adequate to the [[Chief operating officer|COO]] and then, when it turns out not to be, explain why it wasn’t to the operational risk [[steerco]]. All of this costs money, takes time and distracts the firm’s resources from better things they could be doing. Just because the waste is harder to evaluate doesn’t mean it isn’t ''there''.<ref>This is wishful thinking, of course: in a world where accounting projections are the first and last word, that ''is'' all that matters.</ref>


==== The finite game ====
{{Drop|B|y design, LLMs}} learn and reason exclusively from what has gone before. While, yes, [[Alpha Go|AlphaGo]] might have engineered a novel strategy in a [[zero-sum game]], the non-linear infinitude of life in which a contract review process sits is a different kettle of fish. And novelty is not, in any case, what one wants in a sorcerer’s apprentice.


That being the case, the perfect LLM would be one that served up an archetypal sample of ''what you already have''.