Large Learning Model

This “positivism-through-fear” extends with equal force to established market precedents. However manifestly unfit for purpose a precedent may be, the resistance to changing it will be strong.


====Literary theory, legal construction and LLMs====
''ELIZA''
Like all good conjuring tricks, [[generative AI]] relies on misdirection: in fact, it lets us misdirect ourselves, into wilfully suspending disbelief and therefore not noticing who is doing the “heavy lifting” to turn its output into magic: ''we are''.
The irony is that we are [[Neuro-linguistic programming|neuro-linguistically programming]] ''ourselves'' to be wowed by LLMs. By inputting prompts we create our own expectation of what we want to see, and when the pattern-matching produces something like it, we use our imaginations to frame and filter that output as closely as we can to our own instructions.
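The original ELIZA worked exactly this way: a handful of pattern-and-template rules, with the user supplying all the meaning. A minimal sketch of the idea (the rule set, and the names <code>RULES</code> and <code>respond</code>, are illustrative assumptions, not any particular implementation):

```python
import re

# Illustrative ELIZA-style rules: each is a (pattern, response template) pair.
# The "intelligence" is nothing but pattern-matching; the interlocutor reads
# intent and understanding into the reflected-back output.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bbecause (.+)", re.IGNORECASE), "Is that the real reason?"),
]

def respond(utterance: str) -> str:
    """Return the first rule's template filled with the matched fragment."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    # Fallback when nothing matches: a contentless prompt to keep talking.
    return "Please tell me more."

print(respond("I need a better contract template"))
# prints "Why do you need a better contract template?"
```

The trick is that the reply contains nothing the user did not put in: it is the prompt, reflected back, which is precisely why it feels so responsive.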
This is why the novelty wears off: as we persevere, we begin to see the magician’s wires. There are familiar tropes; we see how the model goes about what it does. It becomes progressively less surprising, and eventually settles into an entropic quotidian. Iteratively coaxing specific, targeted work product out of what is, in effect, a random word generator becomes increasingly time-consuming.
This is also why, when we are targeting a specific outcome, the output is frequently, frustratingly, not quite what we had in mind.
We refine and elaborate our queries; we learn to engineer our prompts better, and ''this'' becomes the skill, rather than the pattern-matching that responds to it. So ours is the skill going in, and ours is the skill narratising the output. That is true of all literary discourse: just as much of the “world-creation” goes on between the page and the reader as between writer and text.
This is most obvious in AI-generated art.
====Meet the new boss —====
We don’t doubt that the LLM is coming, nor that the legal industry will find a use for it: we doubt only that there is a ''useful'', sustained use for it. It feels more like a parlour trick: surprising at first, diverting after a while, but then the novelty wears off, and the appeal of persevering with what is basically a gabby but unfocussed child pales.