Expert witness used Copilot to make up fake damages, irking judge
“The use of artificial intelligence is a rapidly growing reality across many industries,” Schopf wrote. “The mere fact that artificial intelligence has played a role, which continues to expand in our everyday lives, does not make the results generated by artificial intelligence admissible in Court.”
Ultimately, Schopf found that there was no breach of fiduciary duty, negating the need for Ranson’s Copilot-cribbed testimony on damages in the Bahamas property case. Schopf denied all of the son’s objections in their entirety (as well as any future claims) after calling out Ranson’s misuse of the chatbot at length.
But in his order, the judge suggested that Ranson's analysis had gone wrong well before the chatbot got involved.
“Whether or not he was retained and/or qualified as a damages expert in areas other than fiduciary duties, his testimony shows that he admittedly did not perform a full analysis of the problem, utilized an incorrect time period for damages, and failed to consider obvious elements into his calculations, all of which go against the weight and credibility of his opinion,” Schopf wrote.
Schopf noted that the evidence showed that, rather than the son losing money from his aunt's management of the trust (as the chatbot outputs Ranson cited supposedly indicated), the 2022 sale of the property led to "no attributable loss of capital" and "in fact, it generated an overall profit to the Trust."
Goldman suggested that Ranson saved little effort by employing Copilot, while the shortcut ended up damaging his credibility in court.
“It would not have been difficult for the expert to pull the necessary data directly from primary sources, so the process didn’t even save much time—but that shortcut came at the cost of the expert’s credibility,” Goldman told Ars.