
ChatGPT: what does accurate AI-powered machine learning mean for lawyers?
from LawNews – Issue 2
It’s been a long day. As you are shutting down your computer, a client calls and asks for an urgent memo on a complicated legal issue. You sigh deeply as the sun is setting. You would rather be at home, and this will take lots of research and effort.
But instead of starting work on the memo, you use a voice-activated assistant to put the question to an interface that spits out a comprehensive answer within seconds.
Science fiction or a coming reality?
This scenario has been teased and talked about in various forms for a while now, often with comments on the demise of lawyers. After all, the client could easily get the answer himself.
I think lawyers will be needed for a long time, but let’s look at the latest developments in this area to get a sense of where we are and where things are headed.
OpenAI is a company founded to focus on artificial intelligence, with a mission “to ensure that artificial general intelligence benefits all of humanity”. It’s governed by a not-for-profit but also has investors such as Microsoft and is fundraising at a reported valuation of US$29 billion.
The company has hit the headlines due to the release of ChatGPT in November 2022, a conversational chatbot that generates text responding to questions.
The GPT stands for “Generative Pre-trained Transformer”, and the reason for the hype is the speed and accuracy of the answers. Also capturing attention is its ability to respond with poetry, song lyrics or computer code. Just this week, Google announced its response to ChatGPT in the form of Bard AI. The pace of adoption is likely to increase as these tools become more widely available.
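For readers curious about what sits behind the chat window, the short sketch below shows one way a question can be sent to a GPT-style model programmatically, using OpenAI’s Python library. It is a minimal, illustrative example only: the model name, the API key setup and the example question are assumptions, and the interface has changed over time, so treat it as a sketch rather than a definitive recipe.

# Minimal sketch: send a question to a GPT-style model and print the answer.
# Assumes the `openai` Python package (v1+) is installed and an API key is
# available in the OPENAI_API_KEY environment variable. The model name and
# the example question are purely illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[
        {
            "role": "user",
            "content": "Summarise the key elements of a binding contract.",
        }
    ],
)

# The generated answer comes back as plain text.
print(response.choices[0].message.content)

The point of the sketch is simply that the “magic” is a question in, text out exchange; everything else is ordinary software plumbing.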
So, what does accurate AI-powered machine learning, of which this is just one example, mean for the legal industry?
First, let’s address some lazy use of terminology. “AI” is a broad term that needs to be broken down. “Artificial intelligence” is often used to describe things that are really just complex algorithms, rather than actual intelligence and consciousness.
The term “general AI” is used to distinguish those more basic abilities from AI that can genuinely understand things and make its own decisions. AI has not reached that point…yet.
Just a parrot?
Critics seize on this point, noting that these algorithm-driven programs, which have been trained and tweaked with human input, essentially operate like a parrot, drawing on vast quantities of source material (literally hundreds of billions of words).
Where it gets truly interesting is if the parrot understands what it is saying, rather than just regurgitating content.
OpenAI co-founder Sam Altman puts it best, saying in a tweet: “ChatGPT is incredibly limited, but good enough at some things to create a misleading impression of greatness.”
Since Microsoft has invested in OpenAI, it seems likely that the software none of us can easily avoid – Microsoft Word – might soon include features based on ChatGPT-type innovations.
For example, what if Word could help you insert specialist clauses directly into a contract, after considering your context and evaluating millions of other clause examples?
Law can be incredibly nuanced, so even then I still see a place for a lawyer steering the ship on such drafting, and for a long while to come.