Methodology

Stop Teaching AI. Start Building With It.

The biggest mistake in AI training: too many slides, not enough doing. The "aha moment" doesn't come from a presentation. It comes from building something impossible.

Jeremy Somers
Founder, NotContent · Mar 29, 2026 · 4 min read

The Slide Deck Problem

I sat through an AI training last year as a favor to a friend. Two hours of slides. The history of large language models. A taxonomy of AI tools. A live demo where the trainer asked ChatGPT to write a haiku about productivity. Everyone clapped politely. Nobody learned anything.

This is what passes for AI training at most organizations. A knowledge transfer session. A webinar with a PDF. The assumption is that if you explain AI well enough, people will figure out how to use it. That assumption is wrong.

I've been doing this for three years now — training creative teams, operations teams, leadership teams. The single biggest predictor of whether training sticks? How much time people spend building during the session, not watching.

50% Build Time or Don't Bother

Every NotContent program — whether it's a half-day Foundations workshop or an eight-week Transformation engagement — runs on the same principle: at least 50% of the time is hands-on building. Not watching demos. Not following along. Building.

In Foundations, that means every participant leaves with a configured Claude project for their actual workflow. Not a hypothetical. Not an exercise. Their real work, with their real constraints, running through a system they built themselves.

In Transformation, teams map their actual workflows and automate several of them across the eight-week program. Real automations. Connected to their real tools. Producing real output they can use the next morning.

The theory matters — you need to understand why system prompts work, how context engineering shapes output, when to use projects versus styles. But the theory only lands when you're applying it to something you care about. Otherwise it's just information, and information without application has a half-life of about 72 hours.

The Impossible Task

Here's a technique I stole from my agency days and adapted for AI training: give people a task that's impossible without AI.

Not difficult. Impossible. As in, there is no human-powered way to accomplish this in the time allotted.

Produce a complete competitive analysis across 15 brands with strategic recommendations — in 45 minutes. Build a full content calendar for Q3 with copy, visual direction, and channel strategy — in an hour. Create an internal training document that synthesizes 200 pages of brand guidelines into actionable creative briefs — in 30 minutes.

When the task is impossible without AI, people stop dabbling and start depending. They can't copy-paste their way through it. They have to figure out how to actually leverage the tool — how to give it context, how to structure the workflow, how to evaluate and refine the output.

The "aha moment" lives in that gap between "there's no way" and "wait, I just did it." You can't create that moment with a slide deck.

You Don't Need to Be a Programmer

One of the most important things I've learned from training non-technical teams: the business leaders, not the engineers, find the most creative use cases for AI.

Engineers tend to think about AI as a coding tool. It's useful for that, sure. But the operations lead who realizes she can automate her entire weekly reporting pipeline? The sales director who builds a Claude project that prepares him for every client meeting with a synthesized brief? The HR manager who maps every onboarding workflow and eliminates 15 hours of repetitive documentation per new hire?

Those aren't engineering solutions. Those are people who understand their work deeply and learned to express that understanding to an AI system. That's context engineering — the most valuable AI skill that nobody teaches.

The idea that AI adoption requires technical skill is one of the most damaging myths in enterprise AI right now. It doesn't. It requires intention. It requires understanding your own workflows well enough to explain them. And it requires practice — the kind you only get by building.

Diverge, Converge, Systemize

Our methodology has three phases, and they map directly to how people actually learn:

Diverge. Explore. Try everything. Use AI for tasks you've never considered. This is the phase where people discover what's possible. It's messy, it's experimental, and it's where the most creative insights happen. Most AI training stops here.

Converge. Take what worked and refine it. Build repeatable workflows. Create system prompts that encode your methodology. Configure projects with the right context pre-loaded. This is the phase where experiments become tools.

Systemize. Deploy what you've built. Connect it to your actual tools. Automate the workflows that your team runs repeatedly. This is the phase where individual capability becomes organizational infrastructure.
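To make the Converge step concrete: "encoding your methodology in a system prompt" can be as simple as a small helper that assembles your team's working rules and pre-loaded context into one reusable prompt. This is a minimal sketch — the function name, structure, and example content are my own illustration, not a NotContent artifact:

```python
def build_system_prompt(role, methodology_steps, context_docs):
    """Assemble a reusable system prompt that encodes a team's methodology.

    role: one-line description of the assistant's job
    methodology_steps: ordered list of how the team wants work done
    context_docs: dict of {doc_name: doc_text} with context pre-loaded
    """
    sections = [f"You are {role}."]
    sections.append("Follow this methodology, in order:")
    # Number the steps so the model treats them as a sequence, not a menu.
    sections += [f"{i}. {step}" for i, step in enumerate(methodology_steps, 1)]
    # Append each context document under a labeled divider.
    for name, text in context_docs.items():
        sections.append(f"--- {name} ---\n{text}")
    return "\n".join(sections)

prompt = build_system_prompt(
    role="a senior brand strategist",
    methodology_steps=[
        "Diverge: list at least five directions before judging any",
        "Converge: pick one and justify it against the brief",
        "Systemize: write it up as a one-page creative brief",
    ],
    context_docs={"voice": "Plain, confident, no jargon."},
)
```

The resulting string is what you'd paste into a Claude project's instructions (or pass as the `system` parameter in an API call), so the experiment you refined in Converge becomes a tool anyone on the team can reuse.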

Each phase requires building. Not reading about building. Not watching someone else build. Hands on keyboard, solving your own problems, with guidance when you get stuck.

The Training Test

If you're evaluating AI training for your team, ask one question: what will my team have built by the end?

If the answer is "a better understanding of AI" — that's a webinar. If the answer is "three automated workflows connected to our actual tools" — that's training.

One changes how people think. The other changes how they work. Only one of them lasts.

Jeremy Somers

Founder, NotContent

15 years as a creative director (Spotify, Nike, Pepsi, Samsung, Mercedes-Benz). Built the first AI-assisted creative agency in 2022.

See where your team stands

Take the 2-minute Readiness Scorecard and get a personalized program recommendation.

Take the Readiness Scorecard →