The first judge made by artificial intelligence is quite impartial. Bad (and good) news for justice


Ruth Bader Ginsburg was one of the most recognized justices of the Supreme Court of the United States, with more than 27 years on the bench. Now all that work has been condensed into an artificial intelligence. An Israeli company, AI21 Labs, has created an AI based on Judge Ginsburg's answers, offering a chat that understands questions from the legal field and tells you whether you are right or wrong.


Answers from a Supreme Court judge in a few seconds. To test the AI we can access the dedicated website of the Jurassic-1 program, the experiment in which the judge's opinions, interviews and rulings were reviewed. In total, more than 600,000 words from the legal field were used to train this AI, all of them from one of the most renowned judges, who passed away a couple of years ago.

In a box, we can write anything to the AI, as long as it is in English and can be answered with yes, no, or maybe. Besides choosing one of the three options, the AI briefly argues its answer. We asked it about privacy issues, labor rights and murders, and the answers seemed quite logical to us.
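For readers curious about the plumbing, this kind of yes/no/maybe chat is usually a prompted call to a hosted language model. The sketch below is a minimal illustration of that pattern in Python; the endpoint URL, payload fields and prompt template are assumptions made up for the example and are not the actual AI21 Labs implementation.

```python
# Minimal sketch of a yes/no/maybe legal Q&A against a hosted language model.
# The endpoint, payload shape and prompt are hypothetical, for illustration only.
import requests

API_URL = "https://example.com/v1/complete"   # hypothetical completion endpoint
API_KEY = "YOUR_API_KEY"                      # placeholder credential

PROMPT_TEMPLATE = (
    "You answer legal questions in the style of a retired Supreme Court justice. "
    "Reply with Yes, No or Maybe, followed by a one-sentence reason.\n"
    "Question: {question}\n"
    "Answer:"
)

def ask(question: str) -> str:
    """Send an English question and return the model's short verdict."""
    payload = {
        "prompt": PROMPT_TEMPLATE.format(question=question),
        "max_tokens": 60,      # enough for a verdict plus a brief argument
        "temperature": 0.3,    # keep the verdict relatively stable
    }
    response = requests.post(
        API_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: {"completion": "Maybe. ..."}
    return response.json()["completion"].strip()

if __name__ == "__main__":
    print(ask("Can my employer read my personal email?"))
```

The verdict and the brief argument come back as one block of generated text; nothing in the pipeline checks the answer against actual case law, which is exactly the limitation discussed next.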

Speaking like a judge does not mean thinking like one. As with other developments in artificial intelligence, no matter how adequate the answers seem, they are still far from matching the reflections of the original judge. Emily Bender, professor at the University of Washington, explains that “it can return words, and their style will be based on the text it was fed, but it is not reasoning”.

What is the ELIZA effect, or why are we so surprised to read an article "written" by an artificial intelligence like GPT-3

AI has already arrived in the justice system, but for now only as a support tool. This tool based on Judge Ginsburg is striking and is one of the most modern examples, but the use of algorithms and AI in the justice system is not new. Court administrations have long used algorithms to calculate the probability of recidivism or to retrieve information from large databases, but it has stopped there.
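Those recidivism scores are typically simple actuarial models rather than chatbots. A minimal sketch, with invented features and weights that do not correspond to any real tool, could look like this:

```python
# Toy illustration of an actuarial recidivism score: a hand-weighted logistic
# model turning a few case features into a probability. Features and weights
# are invented for the example and do not reflect any real system.
import math

# (feature name, weight) pairs -- purely illustrative values
WEIGHTS = {
    "prior_convictions": 0.35,
    "age_under_25": 0.80,
    "failed_to_appear_before": 0.60,
}
BIAS = -2.0  # baseline log-odds

def recidivism_probability(case: dict) -> float:
    """Map case features to a probability between 0 and 1 via a logistic function."""
    log_odds = BIAS + sum(WEIGHTS[name] * float(case.get(name, 0)) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-log_odds))

example = {"prior_convictions": 3, "age_under_25": 1, "failed_to_appear_before": 0}
print(f"Estimated risk: {recidivism_probability(example):.0%}")
```

In models of this kind, any bias enters through the choice of features and the data used to fit the weights, not through any “opinion” of the program, which connects with the debate below.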

Is artificial intelligence unbiased? “To the extent that an algorithm does not itself have any element of subjectivity, but is limited to executing a series of orders, the possibility of interference by prejudices or beliefs that could alter the mechanical application of the rules is eliminated”, explains David Martinez, professor of law at the UOC.

Robots have been put forward as the solution to disparities in judges' criteria. Unaffected by emotions, algorithms promise greater objectivity. However, as has already been shown on numerous occasions, they are not free from bias either. To oversee these algorithms, Spain has created the Spanish Agency for the Supervision of Artificial Intelligence.

“An algorithm is not capable of detecting the reasons why human behaviors occur”, the Galician magistrate Luis Villares tells RTVE. This inability to understand emotions also plays a negative role when determining a sentence.

I teach ethics in artificial intelligence and this is what I teach future engineers

Justice is as imperfect as the human being. “AI gives magistrates and court staff back hours of work that can be invested in other tasks, such as evaluating the evidence more thoroughly,” says Antonio del Moral, magistrate of the Supreme Court. However, he believes that “AI cannot replace judges. Justice, by definition, is human and imperfect, and we accept that. Judicial reasoning cannot be put into standardized molds, because each citizen deserves a personal solution”.

In Xataka | I research robotics and artificial intelligence and so I try not to use it for evil
