The traditional legal model faces existential challenges, for sure, but they are not presented, and will not be addressed, by random word generators.


==== False memory syndrome ====
Large learning models can already execute this “research assistant” mode. It seems less flashy, but — if we can trust it — this is a compelling use case that will remain once our inflated expectations have collapsed into the trough of despair. We could ask Bing AI to track down a long-lost childhood story.{{Quote|1=Bing, do you know a children’s story involving a fox or a wolf or something being chased by a pack of hunting dogs and it hides under a rock. The hiding place is too small, and it has to poke one part of its body out. It has an argument with each of its body parts to see which is the most deserving of being saved and concludes its tail is the least deserving, so it sticks its tail out. The hunting dogs arrive and grab it by the tail and catch it. Moral of the story is something like you are only as good as your weakest link.}}At first, Bing shrugged its shoulders and said, “sorry”. When we persevered with a few more details, success!{{Quote|1=I found the story you were looking for! It’s called “The Wolf and His Tail”. In this story, a wolf is being chased by a pack of hunting dogs and hides under a rock. The hiding place is too small, and it has to poke one part of its body out. It has an argument with each of its body parts to see which is the most deserving of being saved and concludes that its tail is the least deserving, so it sticks its tail out. The hunting dogs arrive and grab it by the tail and catch it. The moral of the story is that you are only as good as your weakest link.}}Magnificent! We can ask who wrote the story:{{Quote|“The Wolf and His Tail” is a children’s story that was written by Barbara Juster Esbensen. It was first published in 1972.}}There is only one problem: ''this is not true''. Barbara Juster Esbensen was indeed a children’s author, but she wrote no such book. It is as if Bing has been badgered into false memory syndrome.
 
Perhaps we should see what Google’s Bard has to say. We can input the same prompt. This is the result:{{Quote|The story you’re describing sounds like the Aesop fable “The Fox and the Hunter.” In this fable, a fox is being chased by a pack of hunters. The fox finds a narrow crevice in a rock and hides inside. However, the crevice is too small, and the fox’s tail is sticking out. The hunters arrive and see the fox’s tail, and they grab it and pull the fox out of the crevice. The fox is then killed by the hunters.
 
The moral of the story is that a chain is only as strong as its weakest link. This means that if any part of a system is weak, the entire system can fail. The fox’s tail was its weakest link, and as a result, the fox was caught by the hunters.}}We might be a bit embarrassed to have forgotten about Aesop, until we go looking for Aesop’s fable and discover that ''it doesn’t exist either''. Bard just made this up.
 
==== Coda: is ChatGPT getting worse? ====


In other news, scientists are concerned that [[ChatGPT]] might be getting worse. Studies indicate that its performance on tasks requiring computational accuracy, like playing noughts and crosses or calculating prime numbers, is rapidly diminishing.


Perhaps [[ChatGPT]] is getting bored, or might it have something to do with the corpus increasingly comprising nonsense text generated on the hoof by some random using ChatGPT?