People already have a hard enough time getting help from lawyers. Advocates say AI could change that.
Next month, AI will enter the courtroom, and the US legal system may never be the same.
An artificial intelligence chatbot, technology programmed to respond to questions and hold a conversation, is expected to advise two individuals fighting speeding tickets in courtrooms in undisclosed cities. Each will wear a wireless earpiece that relays what the judge says to the chatbot, which is run by DoNotPay, a company that typically helps people fight traffic tickets through the mail. The earpiece will then play the chatbot's suggested responses to the judge's questions, which the individuals can choose to repeat in court.
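In rough terms, the setup Browder describes is a loop: transcribe the judge's words, send the transcript to a large language model, and play the model's suggested reply into the earpiece. DoNotPay hasn't published how its system works, but a minimal sketch of that kind of pipeline, assuming off-the-shelf Python libraries (speech_recognition, pyttsx3 and the GPT-3-era OpenAI client) and a hypothetical prompt, might look like this:

    import speech_recognition as sr   # microphone capture and transcription
    import pyttsx3                    # text-to-speech for the earpiece
    import openai                     # GPT-3 completions (pre-1.0 client style)

    openai.api_key = "YOUR_API_KEY"   # placeholder, not a real key

    recognizer = sr.Recognizer()
    speaker = pyttsx3.init()

    # Hypothetical prompt; DoNotPay has not published its actual prompts.
    PROMPT_PREFIX = (
        "You are assisting a defendant contesting a speeding ticket. "
        "Suggest a short, polite, factual reply to the judge's question.\n\n"
    )

    def suggest_reply(judge_said):
        """Send the transcribed question to GPT-3 and return a suggested answer."""
        response = openai.Completion.create(
            model="text-davinci-003",
            prompt=PROMPT_PREFIX + "Judge: " + judge_said + "\nReply:",
            max_tokens=150,
            temperature=0.2,
        )
        return response["choices"][0]["text"].strip()

    while True:
        with sr.Microphone() as source:
            audio = recognizer.listen(source)            # capture the judge's question
        judge_said = recognizer.recognize_google(audio)  # transcribe the recording
        speaker.say(suggest_reply(judge_said))           # play the suggestion back
        speaker.runAndWait()

Note that the transcription step works by recording audio and sending it to a cloud service, which is exactly where the courtroom recording and privacy concerns discussed later come in.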
It’s a stunt. But it also has the potential to change how people interact with the law, and to bring many more changes over time. DoNotPay CEO Josh Browder says expensive legal fees have historically kept people from hiring traditional lawyers to fight for them in traffic court, which typically involves fines that can reach into the hundreds of dollars.
So, his team wondered whether an AI chatbot, trained to understand and argue the law, could intervene.
“Most people can’t afford legal representation,” Browder said in an interview. Using the AI in a real court situation “will be a proof of concept for courts to allow technology in the courtroom.”
Regardless of whether Browder is successful — he says he will be — his company's actions mark the first of what are likely to be many efforts to bring AI further into our daily lives.
Modern life is already filled with the technology. Some people wake up to a song chosen by AI-powered alarms. Their news feed is often curated by a computer program, too, one that’s taught to pick items they’ll find most interesting or that they’ll be most likely to comment on and share via social media. AI chooses what photos to show us on our phones, it asks us if it should add a meeting to our calendars based on emails we receive, and it reminds us to text a birthday greeting to our loved ones.
But advocates say AI’s ability to sort information, spot patterns and quickly pull up data means that in a short time, it could become a “copilot” for our daily lives. Already, coders on Microsoft-owned GitHub are using AI to help them create apps and solve technical problems. Social media managers are relying on AI to help determine the best time to post a new item. Even we here at CNET are experimenting with whether AI can help write explainer-type stories about the ever-changing world of finance.
So, it can seem like only a matter of time before AI finds its way into research-heavy industries like the law as well. And considering that 80% of low-income Americans don’t have access to legal help, while 40% to 60% of the middle class still struggle to get such assistance, there’s clearly demand. AI could help meet that need, but lawyers shouldn’t feel like new technology is going to take business away from them, says Andrew Perlman, dean of the law school at Suffolk University. It’s simply a matter of scale.
“There is no way that the legal profession is going to be able to deliver all of the legal services that people need,” Perlman said.
Turning to AI
DoNotPay began its latest AI experiment back in 2021, when businesses were given early access to GPT-3, the large language model from OpenAI, the startup that later used the same family of technology to build ChatGPT, which went viral for its ability to answer questions, write essays and even create new computer programs. In December, Browder pitched his idea via a tweet: have someone wear an Apple AirPod into traffic court so that the AI could hear what's happening through the microphone and feed responses through the earbud.
Aside from people jeering him for the stunt, Browder knew he'd face other challenges. Many states and jurisdictions allow only those licensed to practice law to give legal advice, a clear hurdle that UC Irvine School of Law professor Emily Taylor Poppe said may cause trouble for DoNotPay's AI.
“Because the AI would be providing information in real time, and because it would involve applying relevant law to specific facts, it is hard to see how it could avoid being seen as the provision of legal advice,” Poppe said. Essentially, the AI could be considered to be practicing law without a license.
AI tools raise privacy concerns too. The computer program technically needs to record audio to interpret what it hears, a move that’s not allowed in many courts. Lawyers are also expected to follow ethics rules that forbid them from sharing confidential information about clients. Can a chatbot, designed to share information, follow the same protocols?
Perlman says many of these concerns can be answered if these tools are created with care. If successful, he argues, these technologies could also help with the mountains of paperwork lawyers encounter on a daily basis.
Ultimately, he argues, chatbots may turn out to be as helpful as Google and other research tools are today, saving lawyers from having to physically wade through law libraries to find information stored on bookshelves.
“Lawyers trying to deliver legal services without technology are going to be inadequate and insufficient to meeting the public’s legal needs,” Perlman said. Ultimately, he believes, AI can do more good than harm.
The two cases DoNotPay is involved in will likely shape much of that conversation. Browder declined to say where the proceedings will take place, citing safety concerns.
Neither DoNotPay nor the defendants plan to inform the judges or anyone in court that an AI is being used or that audio is being recorded, a fact that raises ethics concerns. This in itself resulted in pushback on Twitter when Browder asked for traffic ticket volunteers in December. But Browder says the courts that DoNotPay chose are likely to be more lenient if they find out.
The future of law
After these traffic ticket fights, DoNotPay plans to create a video presentation designed to advocate in favor of the technology, ultimately with the goal of changing law and policy to allow AI in courtrooms.
States and legal organizations, meanwhile, are already debating these questions. In 2020, a California task force dedicated to exploring ways to expand access to legal services recommended allowing select unlicensed practitioners to represent clients, among other reforms. The American Bar Association has told judges who use AI tools to be mindful of the biases built into the tools themselves. UNESCO, the United Nations' educational, scientific and cultural agency, offers a free online course covering the basics of what AI can offer legal systems.
For his part, Browder says AI chatbots will become so popular in the next couple of years that courts will have no choice but to allow them. Perhaps then AI tools will have a seat at the table, rather than whispering in our ears.
“Six months ago, you couldn’t even imagine that an AI could respond in these detailed ways,” Browder said. “No one has imagined, in any law, what this could be like in real life.”