A federal judge in Texas is requiring lawyers in cases before him to certify that they did not use artificial intelligence technology to draft filings without a human checking their accuracy first.
U.S. District Judge Brantley Starr of the Northern District of Texas issued the requirement on Tuesday.
Starr said in an interview Wednesday that he devised the requirement to warn lawyers that AI tools can invent fake cases and that he may sanction them if they rely on AI-generated information without verifying it first.
“We’re at least putting lawyers on notice, who might not otherwise be on notice, that they can’t just trust those databases. They’ve got to actually verify it themselves through a traditional database,” he said.
Starr explained that he began drafting the mandate while attending a panel on artificial intelligence at a conference hosted by the 5th U.S. Circuit Court of Appeals.
The judge said he considered banning the use of AI in his courtroom altogether but decided against it after conversations with UCLA School of Law professor Eugene Volokh.
Starr also noted that he and his staff will not use AI in their own work.
“I don’t want anyone to think that there’s an algorithm out there that is deciding their case,” Starr said.
According to a notice on the district court’s website, all attorneys must attest that either no portion of a filing was drafted by generative artificial intelligence – such as OpenAI’s ChatGPT or Google Bard – or that any language drafted by generative artificial intelligence was checked for accuracy, “using print reporters or traditional legal databases, by a human being.”
The notice said that while such platforms are “incredibly powerful,” in their current state they are prone to hallucinations, and their reliability and potential bias are additional concerns.
“Unbound by any sense of duty, honor, or justice, such programs act according to computer code rather than conviction, based on programming rather than principle,” the notice said.
Reuters contributed to this report.