The Cheapest Training Program You'll Never Budget For
The agencies that are pulling ahead on AI adoption are not the ones with the biggest training budgets.
They're the ones with the strongest habits.
Specifically, one habit. A recurring ten-minute slot where one person demos one AI workflow they built that week. That's it. That's the whole program.
No consultant. No curriculum. No tracking platform. Ten minutes in a Monday standup, or a Friday wrap, or a dedicated weekly ritual — whenever fits how your team already works. The format is always the same: one person, one workflow, what they built and what it does.
And it beats almost every formal training program I've seen. Not because the training programs are bad. Because this ritual does something formal training can't.
What a Demo Ritual Actually Produces
Formal training events produce short-term excitement and medium-term decay. The week after a training, people are energised. A month later, roughly 70% of what was taught is back on the shelf. That pattern of post-training decay is well-documented across corporate learning research, and it's why most agency AI training doesn't compound.
A weekly demo ritual produces three things training events don't.
Visibility. The single biggest reason the middle of your team isn't adopting AI is that they don't see what it makes possible. They've heard the hype, read the headlines, seen a colleague click around in a chat window. They have no visceral sense of what's actually being built. A weekly demo closes that gap. Every week, someone shows a real, working thing.
Social proof. When the team's most respected senior copywriter stands up and says "this is the prompt chain I built this week to speed up creative briefs," the people in the room update what they think AI is for. Not because of the tool. Because of who's showing them. Peer credibility beats leadership broadcasts every time.
Expectation-setting. A ritual that runs every week turns AI usage from a side project into a cultural norm. Not by announcing it. By making building visible as an everyday part of how the team works. Over three months, the message that lands isn't "we want you to do this." It's "this is what we do here."
Why This Works When Formal Training Doesn't
The Spark AI 2026 report surfaces the same principle from a different angle. The agencies making real progress are making AI progress visible — dedicated slots in team meetings, monthly challenges that turn workflow problems into team projects, recognition for people who build tools others adopt.
The underlying insight is about social conditions, not instruction. Formal training is content transfer. This ritual is behaviour change. The two do completely different work.
Content transfer gets you to capability on a specific tool. Behaviour change gets you to a culture where people keep building even when nobody's grading them. You need the second more than you need the first, for the simple reason that the tools keep changing and the behaviours compound.
The Rules That Make It Work
Four rules, based on what's worked across the teams I've trained.
Ten minutes is a hard limit, not a soft one. The moment the demo creeps to fifteen minutes, twenty minutes, half an hour, it stops being a ritual and starts being a mini-training session. People skip it. Participation collapses. Keep it tight. If someone has a more substantial thing to share, they get a second slot the following week.
One person per week, rotating. Every team member presents eventually. Not just the enthusiasts. When the quiet one in the corner demos something they built, the signal to the rest of the team is louder than any leadership announcement could be.
Real work only. No theoretical demos. No "this is what this tool can do in theory." The person presenting has to demo something they actually built and actually use. The demo finishes with "and here's what it saves me" or "and here's where I've deployed this in live work."
Record it. Ten minutes of screen-share is trivial to record. Post it in a shared channel. Now it's a searchable archive — the team's own internal AI training library, built by and for them, updated every week.
The Agencies Doing This Well
Two examples from the Spark report that illustrate this done right.
Not Actual Size, a creative and strategy agency, focused first on creating the right conditions for their team to engage with AI. Psychological safety came before skill development — that order mattered. Open sessions gave individuals the chance to talk honestly about what AI meant for their craft. From that foundation, their team moved quickly into building a proprietary AI copywriting tool for a long-term client.
mark-making*, a B2B brand and creative agency, introduced an internal training programme of experimental challenges — with the goal of embedding AI fully into everyday workflows. Same principle: make building visible, make experimentation normal, let the culture shift produce the capability.
Neither of those agencies did it through a big training event. Both of them did it through consistent, visible, ongoing practice.
What Not to Do
Three ways this ritual goes wrong in practice.
Letting leadership demo too often. When the head of the agency presents every other week, it stops being a team ritual and becomes a leadership broadcast. Rotate widely. The junior showing something they built matters more than the MD doing it.
Critiquing the demo. This is a show-and-tell, not a presentation review. The point isn't to make the work better. The point is to make the building visible. Critique kills participation fast. Save the craft feedback for a different venue.
Skipping when nobody has anything to show. Don't. Protect the slot even when it has to stretch. If nobody on your team built something worth demoing this week, that's a diagnostic signal worth discussing in the meeting itself. "Why did we not build anything this week?" is often the most productive ten minutes the ritual produces.
The Real Math
If you run this ritual every week for a year, you've generated roughly 50 demos: 52 weeks, minus a couple of off weeks. Fifty pieces of practical, team-specific, real-work AI capability, built by the people who work at your agency, recorded and searchable.
That's more content than most agency AI training programs produce. And it cost you nothing beyond ten minutes a week.
It works because it does something training can't. It makes AI adoption normal.

