In March 2023, researchers at OpenAI announced that GPT-4 had passed the Uniform Bar Exam. The story immediately made headlines, and while the specific performance numbers from that study have since been challenged by other researchers, the underlying claim has held up: a generative AI model can now clear the bar.
Given that the bar exam is the first major milestone for every new law school graduate, it's reasonable to wonder what law school looks like today, and what it will look like going forward, now that AI models are reasonably proficient at meeting some of the basic requirements of the profession.
Top law schools have good answers to that question. The American Bar Association's Task Force on Law and Artificial Intelligence found that 55% of law schools now offer classes that incorporate artificial intelligence, and 83% provide other curricular opportunities, such as clinics, where students learn to use AI effectively. Those figures are already a few years old. No longer a novelty in legal education, AI is becoming an important tool to master as part of your training.
The real question in 2026 is how law schools are integrating AI into the work of becoming a lawyer. That’s where strong programs separate themselves from reactive ones. This post covers how legal education is changing, what top programs are doing about it, and what you should look for when you’re choosing a law school that will prepare you for the new legal landscape AI is shaping.
Let’s start with the scale of what’s going on in the profession. The 2025 Clio Legal Trends Report found that 87% of lawyers at large firms and 71% of solo practitioners now use AI in their work. That’s up from just 19% in 2023. Tools like Harvey, Lexis+ AI, and Westlaw Precision with CoCounsel are standard at major firms. New associates are expected to arrive knowing how to use them.
Then consider the ethics pressure. ABA Model Rule 1.1 has long required lawyers to maintain competence in the technology they use, and in July 2024 the ABA formalized what that means for generative AI in Formal Opinion 512. The opinion lays out duties around competent representation, client confidentiality, supervision of AI tools, and reasonable fees for AI-assisted work. Likewise, many state supreme courts have adopted their own AI ethics guidelines for lawyers. That means if you’re a lawyer using AI, you’re responsible for what it produces.
The cost of getting that wrong is already visible. Courts have sanctioned attorneys in hundreds of documented cases involving AI-generated hallucinations in filings: briefs citing cases that don’t exist, opinions that were never written, or statutes that were never passed. Law firms want to hire graduates who know how to steer clear of these kinds of risks.
If the profession expects AI competence, then law schools are now responsible for teaching it. And that’s exactly what’s happening.
Three patterns have emerged across strong programs:
The top law schools are no longer waiting until upper-level electives to introduce AI training. They’re building it into the 1L experience.
At some schools, generative AI instruction is now embedded directly into first-year Legal Research as a yearlong course, with students learning both traditional research methods and the critical skills needed to detect hallucinated content and evaluate AI-generated results. Others have launched required AI literacy modules for all 1Ls, or made an introductory AI course mandatory for every first-year student.
The purpose is to make sure no law graduate walks into a firm without understanding how AI fails, when it works, and what the ethical rules require.
Once the foundation is in place, top programs are layering on specialized courses. One school offers at least 17 courses addressing different aspects of AI. Common additions include advanced legal writing in the age of AI, the regulation of AI, and digital advocacy.
A number of law schools have also partnered with Harvey, a domain-specific legal AI platform used across major law firms, giving students hands-on access to the same tools they’ll be expected to use in practice.
Rather than simply asking law students to resist using AI, strong law schools are teaching students how to build and supervise it.
New AI labs are giving law students the opportunity to create legal AI tools from the ground up. Other programs are weaving AI into existing coursework in subtler ways, like asking students on an IP final to grade AI-generated answers: an exercise in the kind of judgment new lawyers will need when they start to practice.
The law is a language-driven profession, and the most visible impact of generative AI on legal practice is on the written word. Ranked #1 in the nation for legal writing by U.S. News & World Report, Stetson Law is particularly well positioned to teach students what AI can and can't do on the page, and to train them to evaluate the machine-generated text that shows up in briefs, motions, memos, and client communications every day.
At the center of Stetson's work on AI is Professor Kirsten Davis, who leads a group of more than 500 law professors exploring the impact of generative AI on legal writing education and practice. Davis began teaching her Legal Writing with Generative AI course in the Fall of 2023, one of the first courses of its kind anywhere in the country. The course is a sustained examination of how a lawyer's writing process changes when a machine is part of it.
Students start with the ethical constraints that shape AI use in legal practice. From there, Davis walks them through what it actually takes to produce good legal writing with AI assistance: how to prompt a model, what to feed it, how to iterate on what it generates, and how to evaluate the output against the facts, the law, and the audience.
AI doesn’t remove the need for legal expertise. It raises the bar for it.
“You need lawyer intelligence to use artificial intelligence,” Davis says. “You’re going to have to be able to evaluate the content AI puts out, and also add the human value on top of the text generated. As sophisticated as genAI is at text generation, that is what it is. It is a text generation tool. It is not a tool that thinks for you, but it can assist in the process of a lawyer’s thinking and communication.”
That understanding carries through to how Davis has restructured her first-year courses as well. She's placed a renewed focus on critical legal reading: students can't effectively evaluate what a large language model produces until they can bring the same critical eye to text written by a real lawyer.
Listen to our interview with Professor Davis on Real Cases: The Stetson Law Podcast.
Davis isn’t working alone. AI integration is supported throughout the law school.
Stetson Law Librarians Kristen Moore and Angelina Vigliotti co-teach Advanced Legal Research: Artificial Intelligence and Legal Research, embedding AI into the research curriculum. Vigliotti recently published “All Roads Lead to Rome: Ethics and AI in Legal Practice” in the Stetson Journal of Advocacy and the Law, an examination of how AI legal research tools intersect with the ABA's professional conduct standards.
Adjunct Professor Kristen Chittenden '10, who serves as Vice President and Deputy General Counsel at Turnitin, teaches a course on how AI affects legal education, drawing directly on her professional work with AI detection and academic integrity.
“We take a forward-looking, holistic approach to advocacy education that integrates emerging technologies, including artificial intelligence, to ensure our students are prepared to thrive in a rapidly evolving legal profession,” said Professor Elizabeth Boals, Director of the Center for Advocacy Education. “We are committed to equipping students with the practical skills and adaptability they need to be practice-ready from day one.”
This work is supported at the top of the institution. Dean D. Benjamin Barros has been engaged in the national conversation on how law schools should respond to generative AI and has been quoted in the ABA Journal on how Stetson is tackling the issue, including how to handle AI use under the honor code without setting traps for students who are still figuring out where the lines are. Stetson is also developing institutional AI policies and faculty training to keep that work grounded as the technology evolves.
If you’re evaluating law schools with AI in mind, a few questions are worth asking on any campus visit or admissions call.
If you’re looking for a clear example of what AI can’t do, look no further than the courtroom.
AI can’t read a jury. It can’t adjust on the fly when a witness’s answer opens a door that wasn’t there five minutes ago. AI can’t counsel a client through a decision that will affect their family for a generation.
It’s real lawyers who have to stand in front of a judge and make an argument that lands, with their reputation and their client's future on the line.
Stetson's #1-ranked trial advocacy program is built to train law students for exactly that moment. Strong law schools are doubling down on developing their students' skills in advocacy, judgment, ethics, and client relationships, and here, too, Stetson leads. We have more working courtrooms on campus than any other law school in the country, with students arguing cases in front of real judges and practicing attorneys. Stetson Law students compete on trial, dispute resolution, and moot court teams that have won more than 110 national titles. By the time a Stetson grad walks into a courtroom for the first time as a licensed attorney, they've already done the work. Law firms across the country know it, and they hire accordingly.
AI will continue to change how lawyers prepare for trial, but it won’t change who stands up and tries the case.
We strongly recommend that you don't use AI for your personal statement. The statement is your chance to tell the admissions team about yourself and to display your skill as a writer. Using AI undermines both goals, and admissions committees can typically spot it easily.
That said, policies on the use of AI in admissions essays vary widely. Some law schools explicitly prohibit AI-generated application materials. Others allow limited use for brainstorming or grammar checks, but prohibit the use of generative AI for writing. A growing number ask applicants to certify that their work is their own.
For more on this, see our guide to writing a great law school personal statement in the age of AI.
Yes, and it’s one of the most important shifts happening across legal education right now. Law schools are moving toward more proctored in-class writing, oral examinations, step-by-step process assignments, and reflective components that require students to show their thinking. The goal is to ensure graduates leave with the skills the practice of law actually demands.
The bar exam is a closed-book, no-AI test, and that is unlikely to change anytime soon. The exam is designed to confirm that a candidate has internalized the foundational knowledge and analytical ability required to practice law without assistance.
However, the NextGen bar exam, developed by the National Conference of Bar Examiners as the successor to the Uniform Bar Exam (UBE), is designed to better reflect what new lawyers actually do in practice, shifting the exam away from pure memorization toward integrated testing of core legal knowledge and foundational lawyering skills like research, writing, client counseling, and negotiation. Florida has adopted the NextGen format beginning with the July 2028 exam and will administer it alongside a Florida-specific component covering state law, which means Stetson graduates entering the Florida bar from 2028 onward will take the new exam.
The areas of legal practice seeing the fastest change are those built on high-volume text work. AI tools can now search large volumes of case law, summarize filings, flag inconsistencies in contracts, and produce initial drafts of memos and briefs in a fraction of the time these tasks used to take. Intellectual property work, e-discovery, and regulatory compliance have been particularly affected, since each involves processing enormous volumes of text where automation offers real gains. Areas that depend on client counseling and courtroom presence, such as trial advocacy, negotiation, and complex litigation strategy, have been much less affected and are likely to remain so.
The next generation of legal hiring will reward lawyers who can demonstrate judgment in addition to technical fluency. AI can’t replicate persuasive oral advocacy, sound ethical judgment, careful client counseling, or the critical skills needed to evaluate the output of any source, human or machine. Strong legal research instincts and foundational knowledge of legal systems matter more than ever.
The honest answer is that nobody fully knows yet, but early signals suggest that entry-level work is changing. Tasks that used to go to first-year associates (document review, cite-checking, pulling together research memos) are increasingly handled by AI under attorney supervision. That shifts what a junior lawyer is expected to do. Firms now want new hires who can supervise AI output, which means law students need to graduate with more mature legal judgment than was expected a decade ago. It is one of the reasons a law school's commitment to real-world skills training matters as much as it does.
Harvey, Westlaw Precision with CoCounsel, and Lexis+ AI are three of the most widely deployed tools at major firms, and each handles different use cases. Harvey assists with contract analysis, due diligence, and litigation work. Westlaw and Lexis both offer AI-powered legal research tools that generate research reports with source-linked citations. Law students don’t need to master every platform, but they should graduate knowing what these tools do, how they fail, and how a lawyer validates their output.
The core obligations haven’t changed. What has changed is how those obligations apply to AI. ABA Formal Opinion 512, issued in July 2024, makes clear that lawyers using generative AI must understand how the tool works, verify its output, protect confidential client information from being used for training, and bill clients fairly for AI-assisted work. Courts have reinforced the supervision obligation repeatedly. When a lawyer signs a brief, they’re responsible for every citation in it, whether the citation was written by a junior associate, a professor of law, or a large language model. In addition, state supreme courts have instituted rules to govern AI use by attorneys, so new lawyers taking the bar need to be attentive to the particular rules in their state.
Where you’re educated shapes the kind of lawyer you become and the kind of reputation you carry with you. Judges, law firms, and institutions across the country recognize that Stetson grads arrive prepared to practice.
At Stetson University College of Law, you’ll find: