In his year-end report, the chief justice avoided the many elephants in the room—including Supreme Court corruption—to reassure judges that they’re superior to bots.
Every year, Chief Justice John Roberts issues a year-end report about the state of the federal judiciary. Every year, it is inane. It’s less of a “State of the Union”–level address designed to talk about the big issues facing the court or the country than a long-winded holiday card where Roberts tells a hokey story while wearing his latest ugly sweater.
Still, this year, I hoped for more. The Supreme Court is about to decide whether people who try to overthrow the government can run for office again, and whether the rule of law has any meaning when applied to a former president. I didn’t expect Roberts to address those issues directly—the court will rule, soon enough—but given that the court will attempt to issue these rulings when its legitimacy is at an all-time low and the very integrity of the institution has been besmirched by the justices’ own scandalous behavior, I thought something about judicial ethics might come up. After all, in 2023 the court issued its first-ever ethics code, albeit one that was roundly mocked as toothless and insincere. I hoped that Roberts might take this opportunity to address, or at least defend, the integrity of his own branch of government as it stands on the precipice of plunging us into darkness.
It was a foolish hope. The 13-page letter Roberts submitted on December 31 was about as self-aware as Clarence Thomas decanting a “free” bottle of wine. Instead of addressing any of the ethical concerns the public rightly has with Roberts’s bench of lifetime-appointed justices, the chief justice devoted his report to musing about the future of AI and its impact on the federal judiciary.
AI did make a lot of news in legal circles last year. Professors across all disciplines, including law, seem dismayed that AI can write term papers as well as a hungover grad student. One program passed the multistate bar exam. And there was the high-profile embarrassment of former Trump lawyer Michael Cohen, who used an AI program that generated fake cases and citations that Cohen then passed along to his defense lawyer. There are myriad legal issues surrounding AI as well, from the privacy concerns raised by facial recognition technology, to the copyright concerns raised when AI scrapes other people’s work, to the question of what rules and protections should govern companies that use AI to make decisions.
Roberts, however, was not interested in addressing any of those weighty legal concerns. Instead, he focused on the one thing every old person eventually becomes obsessed with: Will the new technology take my job? That would be the same John Roberts whose court has junked labor rights for the average worker in this country, yet now he is concerned about whether the machines are coming to take jobs in his own cushy, white-collar profession.
I’d argue that AI is particularly threatening to lawyers, because it exposes the law for what it’s always been: old dudes repeating what older, now-dead dudes once said. We’re already at a point where AI can generate a competent legal argument (citations omitted) based on history and precedent for any proposition, and that is largely what lawyers do. Very soon, we’ll be able to plug a set of facts into an application and the AI will be able to spit out what the “right” legal outcome should be, and that is largely what judges do. Before the end of this decade, we will likely see a defendant file an appeal based solely on AI disagreeing with the verdict of a trial judge or jury.
In his year-end report, Roberts says confidently that AI will not replace judges. I mean, that’s exactly what everybody says a generation before the technology is ready to replace them. He writes:
Many professional tennis tournaments, including the US Open, have replaced line judges with optical technology to determine whether 130 mile per hour serves are in or out. These decisions involve precision to the millimeter. And there is no discretion; the ball either did or did not hit the line. By contrast, legal determinations often involve gray areas that still require application of human judgment.
Again, I’m generally unimpressed by professionals who herald the accuracy and efficiency of computers when it involves other people’s jobs yet claim their own contributions are now and forever inimitable by technology. But this use of a sports analogy is particularly grating coming from Roberts, who said during his confirmation process: “I will remember that it’s my job to call balls and strikes, and not to pitch or bat.” Suddenly, the self-styled “umpire” now lauds the human discretion (and error) that can change the very outcome of the game. It won’t be long before Roberts sounds as defensive as Major League Baseball umpire Ángel Hernández when AI shows that one of his rulings is clearly wrong. (For the uninitiated, Hernández is the worst baseball umpire that I am aware of, potentially in all of human history.)
Roberts’s invocation of the existence of “gray areas” as the reason humans are better suited to law than AI feels all the more out of place because everybody paying attention now knows that any contested questions are likely to be resolved in whatever way most benefits Republicans, fundamentalist Christians, or wealthy conservatives. People might prefer that impartial humans make these decisions rather than AI, but does anybody really think the Supreme Court justices are impartial? I’m not against the application of human judgment, I’m against the application of Harlan Crow’s judgment. I’m against the application of Jesus Christ’s judgment, or at least whichever bigoted jerk the conservatives think is speaking for him. Roberts and his conservative brethren are already bots; they’re just bots designed and programmed by Leonard Leo.
I don’t think ChatGPT would make a good Supreme Court justice, but I doubt it would do much worse than our current system, which forces us to live under rules divined by Sam Alito after he processes 15 straight hours of Fox News.
In a different section of the report, Roberts talks about the “fairness gap,” which is the idea that people perceive human judges as more fair than computers, even when computers get it “right.” To me, it’s a mind-boggling thing to highlight, given that I cannot think of a less fair body of rulers right now than the Supreme Court. These people are about to decide whether Trump can stay on the ballot, and virtually nobody thinks that their decision will be guided solely by a “fair” application of constitutional principles.
If anything, it will be their unfairness that keeps these people in power long after technology renders them obsolete. You could design a simple judicial application (though I’d call it a “lawgorithm” because I’m a dad), ask it “do laws apply to everybody, including presidents,” and it would spit out: “Yes, you fleshy idiots.” Only biased, illogical, corruptible humans could turn that question into a Supreme Court case. Generating preferred Republican political outcomes is the real reason AI won’t be replacing Roberts, or judges like him, any time soon.
Elie Mystal is The Nation’s justice correspondent and the host of its legal podcast, Contempt of Court. He is also an Alfred Knobler Fellow at the Type Media Center. His first book is the New York Times bestseller Allow Me to Retort: A Black Guy’s Guide to the Constitution, published by The New Press. He can be followed on Twitter @ElieNYC.