Your AI Rollout Failed. The Technology Wasn't the Problem.
He spent the money. Sent the email. Brought in the vendor. Six months later, nothing had changed.
How to Actually Roll Out AI at Your Law Firm (Without the Chaos)
TL;DR: The technology part of AI adoption is easy. The people part is where firms stall out. I’ve watched it happen enough times now to know what works: a five-stage rollout that treats adoption like a real project, not an IT announcement. Here’s the playbook.
I had a conversation last month with a managing partner at a 30-attorney firm. He told me they’d been “doing AI” for six months. So I asked him: how many of your attorneys used it last week?
Long pause. “Maybe three?”
He’d spent real money on licenses. Sent a firm-wide email. Even brought in a vendor for a lunch-and-learn. And six months later, 90% of the firm had never logged in a second time.
That’s not a technology problem. That’s an adoption problem. And honestly, it’s the most common story I hear.
Before Anyone Touches the Tool
I know you want to jump to the part where attorneys start using AI on real work. I get it. But the firms that skip this stage are the same ones calling me six months later asking why nobody’s using the thing.
You need to answer a surprisingly boring question first: what specific tasks are you trying to make faster? And I mean specific. “We want to use AI” is a goal, not a use case. “We want first-draft research memos in an hour instead of four” is a use case. “We want a risk summary auto-generated before a lawyer even opens a new contract” is a use case. The difference matters because it gives you something to actually measure later.
Take a look at your tech stack while you’re at it. What’s your document management system? How do people access files? I worked with one firm that got excited about AI-powered document review and then realized their DMS was so locked down that attorneys couldn’t even copy text out of it easily. You want to find that kind of thing now, not during training.
Then put three to five people in charge of this. A partner with authority, someone who understands your systems (even if that’s the office manager), and a couple of attorneys who are genuinely curious. Not a committee. A team. Committees debate. Teams ship.
Run a Pilot That Actually Means Something
Here’s a pattern I see constantly: a firm buys seats, gives everyone a login on the same day, nobody knows what to do with it, and adoption dies quietly. It’s the law firm equivalent of buying a Peloton and hanging clothes on it.
Instead, pick one group. Five to ten people, ideally in a practice area with lots of document volume. Contract review, employment policy drafting, client intake memos, something like that. You want tasks where the cost of an AI mistake is low while people are still figuring this out.
And then comes the part nobody wants to hear: train them for real. Not a 30-minute demo over sandwiches. Actual working sessions where they bring a real contract or a real research question and learn to work with the tool on their own files. Hands on keyboards.
Let it run three to four weeks. But here’s what separates a useful pilot from a waste of time: you have to collect honest feedback. Not “did you enjoy the training?” but “how many hours did this save you on the Alvarez contract review?” and “what did the AI get wrong that you had to fix?” Those details are what make the rest of the rollout work. Without them, you’re just guessing.
Write the Policy (Yes, Really)
Look, I know nobody gets excited about writing an AI use policy. But you’re a law firm. Client confidentiality isn’t optional. And the questions you’ll get from partners, clients, even opposing counsel are coming whether you’re ready or not.
The good news: this doesn’t need to be a 40-page binder. Two to three pages covering three things. What client data can and can’t go into the AI tool, including how you’re thinking about privilege. What human review looks like before any AI output reaches a client or a courtroom (short answer: it’s mandatory, every time, no exceptions). And how you’ll handle disclosure, because more jurisdictions are starting to require it and the trend is only going one direction.
Write something. Publish it internally. Refine it later. A short policy that people actually read beats a long one that lives in a drawer. I’ve seen firms spend four months trying to write the perfect AI policy and by the time they finished, half their attorneys had started using ChatGPT on their personal phones anyway. Don’t let perfect be the enemy of done here.
Training That Doesn’t Feel Like a CLE
Now you go firm-wide. But please, for the love of everything, don’t just send a link to some help docs and call it training.
Think about who’s in your firm. You’ve got partners who’ve practiced the same way for 20 years sitting next to associates who were using AI in law school. Those two groups need different things. The partner needs to see how AI handles a task they personally do every week, in their practice area, with their kind of documents. The associate needs to understand the firm’s expectations around review and quality control. One session doesn’t cover both.
Make the training hands-on. I can’t say this enough. I’ve done hundreds of these sessions now, and the single biggest predictor of whether someone actually adopts the tool is whether they typed a real prompt during training or just watched someone else do it. The gap between those two experiences is enormous.
One more thing on training, and this might be the most important tactical advice in this whole piece: find your champions early. You know who I’m talking about. The two or three people from your pilot who got weirdly good at this and started showing their colleagues tricks in the hallway. Make that official. Give them a title if you want, or just make it known that they’re the people to ask. Peer-to-peer adoption always, always outperforms top-down mandates at law firms. Every single time.
Keep Going (This Is the Hard Part)
Most firms treat the rollout like a project with an end date. Launch day happens, everyone claps, and then nobody thinks about it again. That’s how you end up back where you started.
The firms getting real value from AI are doing a few things differently. They’re running quarterly check-ins to find out what’s working and what’s collecting dust. They’re maintaining a shared prompt library (even a simple Google Doc works) and actually updating it. They’re training new hires on AI during onboarding, not six months later. And when the AI companies release new features, which happens constantly right now, someone at the firm is paying attention and figuring out what matters.
I’ll be honest with you about what’s hard. Senior attorneys are slow to change. Some will never change. The technology itself shifts every few months, so what you train on in March might look different by September. And there’s a real risk of what I call “pilot purgatory,” where the firm runs a successful test, pats itself on the back, and never actually expands to the rest of the firm. I’ve seen it happen more than I’d like.
The antidote isn’t more technology. It’s structure. Someone who owns this permanently. A calendar with recurring training dates on it. And a willingness to keep pushing even after the initial excitement fades.
What to Do This Week
If you’ve read this far and you’re thinking “we need to get moving,” here’s where I’d start:
Name two or three specific tasks where AI could save your attorneys real time.
Pick a pilot group and commit to training them properly, not just giving them logins.
Write a short AI use policy covering confidentiality, human review, and disclosure.
Set a 90-day timeline with one person accountable for keeping it on track.
Start. Imperfect is fine. Stalled is not.
The firms that are going to be in the strongest position a couple of years from now aren’t the ones that picked the best AI model. They’re the ones that figured out how to get their people to actually use it. And that starts with treating the rollout like it matters just as much as the technology.
The managing partner I mentioned at the top? We rebuilt his rollout from scratch using this approach. Eight weeks later, 70% of his firm was using AI weekly. Same tool. Same people. Different process.
That’s almost always the story.
If you read this far, you’re probably not wondering whether AI matters for your firm. You’re wondering why the rollout you already tried didn’t stick.
That’s the conversation I have every day with managing partners and COOs who did the right things in the wrong order, or the right things without the structure to make them last. If that’s where you are, tell me what you’re working through: steve@intelligencebyintent.com. I’ll be straight with you about what’s ready to fix now and what needs more time.
One article, every morning, at smithstephen.com. Written for the people running firms, not the people selling to them. No hype cycles. No breathless product launches. Just what’s working and what isn’t.