When I was in middle school, I took shop class—the old‑school kind with a factory‑floor layout, real machines, and the faint, ever‑present possibility that someone would sand off a fingertip.
We weren’t making birdhouses or napkin holders. No, our big project was a wooden wall accent piece: a shaped backplate, a little mirror inset, and a small shelf for a candle or a photo of your dog. It was the kind of thing your mom would pretend to like.
The shop was run like a miniature production line. We learned each machine, rotated through stations, and every day someone was designated the “supervisor.”
When it was my turn to supervise, I took the role seriously. I had a simple philosophy: if the piece is substandard, it doesn’t count. A crooked shelf? Doesn’t count. A mirror that doesn’t sit flush? Doesn’t count. A backplate shaped like a potato because you rushed the bandsaw? Definitely doesn’t count.
One day, someone tried to pass through a piece that was visually a turd. I knew exactly which machine had produced the flaw, which meant I knew exactly who had produced the flaw. I walked over, held up the offending object, and said something along the lines of, “You know this isn’t acceptable, right?”
He did not appreciate my commitment to quality control.
His mentality was simple: speed and volume. Get it done. Move on. Quantity over craftsmanship. If it vaguely resembled the assignment, that was good enough.
Mine was the opposite: if it’s substandard, you did zero work.
I didn’t realize it at the time, but that little factory floor was my first exposure to a truth that has followed me through law school, litigation, and now the era of AI‑everywhere:
There is no shortcut for actually understanding what you’re doing.
Or as ethics attorney Andrew Wale told NPR in a recent piece on AI misuse in the courts: “Whatever the generative AI tool gives you — as in, ‘Look at these cases’ — you, under the rules of professional conduct, you have to read those cases. You have to read the cases to make sure what you are citing is accurate.”
The Legal Profession Has Its Own “Crooked Shelf” Problem
Early in my career, a CLE presenter said something that stuck with me: he loved it when opposing counsel’s brief was clearly copied and pasted from West headnotes. It meant they hadn’t read the cases. They didn’t know the posture, the nuance, the limiting language, or the actual holding. They were turning in the legal equivalent of a wobbly candle shelf.
And he was right. You can spot headnote‑lawyering instantly. It has a smell.
Fast‑forward to today, and we’re seeing the same thing—just with shinier tools. A lot of lawyers are now relying on AI the way lazy lawyers relied on West headnotes: as a substitute for reading the actual cases.
They think they’re being more productive because they’re producing more volume: more briefs, more filings, more “analysis.”
But it’s the same shop‑class mistake: confusing motion with progress.
If the underlying reasoning is shallow, if the citations don’t stand for the proposition, if the argument is structurally unsound, then all that volume is just a bigger, faster, more polished crooked shelf.
AI Isn’t the Problem — It’s the Mentality
AI is a tool. A powerful one. I use it. I like it. It helps with scaffolding, brainstorming, and drafting. But it can’t replace the part that matters: judgment.
It can’t read the case for you. It can’t understand nuance. And it definitely can’t stop you from filing a beautifully formatted, confidently written, completely wrong argument.
The Shop Class Lesson That Still Applies
Middle‑school me didn’t have the vocabulary for it, but I understood something important: craftsmanship matters. Understanding matters. Quality matters.
You can’t sand your way out of a bad cut. You can’t glue your way out of a crooked shelf. And you can’t AI‑generate your way out of not reading the cases.
The tools have changed. The stakes have changed. But the principle hasn’t.
Euclid told King Ptolemy there was no royal road to geometry. There’s no royal road to legal work either.
If it’s substandard, it doesn’t count.