Mar 10, 2026

Why Your AI Training Isn't Sticking (And What Actually Works)

Most AI training teaches concepts. But adoption is a behaviour change problem. Here is what actually moves the needle.

The Real Barrier Is Not What You Think

Most organisations assume that AI adoption is a knowledge problem. People do not know how to use the tools, so you train them. Run some workshops. Share some courses. Wait for the results.

PwC did exactly that. In April 2023, they announced a $1 billion investment over three years to expand and scale their AI capabilities, including upskilling their 75,000 US employees. They launched an initiative called My AI, rolled out internal tools like ChatPwC and Microsoft Copilot, built learning pathways and ran hackathons and game show-style competitions. Ninety-five per cent of employees voluntarily engaged with the programme, clocking more than 360,000 hours of AI learning.

And yet usage patterns barely shifted. People were using AI for basic tasks like document summarisation and drafting, but they were not integrating it into the work that actually fills their week. The tools were there. The training was there. Executive buy-in was there. But behaviour had not changed.

PwC's Chief Learning Officer, Leah Houde, identified the problem in an interview with UNLEASH: the cognitive load of trying something new in the middle of doing what you normally do is high. Your people are not resistant. They are busy. And busy professionals do not experiment on live client work.

That insight changed everything about how PwC approached AI adoption. And it should change how your organisation thinks about it too.

Why AI Training Fails Differently

AI is not like learning a new version of Excel. With Excel, you learn a formula, you apply it, and you get a predictable result. The skill transfers directly. You can practise on a sample spreadsheet and the technique works the same way on a real one.

AI does not work like that. There is no single right answer. Prompts are personal. The same question produces different quality results depending on how you frame it, what context you provide and what you are trying to achieve. Two people in the same role with the same tool will get completely different outcomes depending on how they think about the task.

This creates three problems that traditional training cannot solve:

You cannot practise on live work. When someone is preparing a client report or responding to a board query, they are not going to stop and experiment with a new AI approach. The stakes are too high. The deadline is too close. They default to whatever method they trust, which is the one they have been using for years.

Knowing what AI can do is not the same as knowing where it fits. Most training teaches capabilities. Here is what Copilot can do in Excel. Here is how to summarise a document. Here is how to draft an email. But the gap is not knowledge of features. It is understanding where AI fits in your specific workflow, with your specific data, for your specific tasks. That is something a generic workshop cannot teach.

The tool changes faster than the training. We covered this in detail in our post on why the train and leave approach does not work for AI. Microsoft releases new Copilot features monthly. The underlying models change quarterly. A workshop delivered in January is partially outdated by March. But more fundamentally, the format of traditional training assumes a stable subject. AI is not a stable subject.

What PwC Did Instead

PwC's breakthrough was not better training content. It was a different environment for learning.

In March 2024, they launched what they called prompting parties: low-stakes, collaborative sessions where employees bring real work problems, with zero client deliverables at risk. No performance pressure. No grading. Just a room (virtual or physical) full of colleagues experimenting together.

The format was simple. Teams would gather, share real use cases, watch how more experienced users prompted the tools and then try it themselves with their own work scenarios. The emphasis was on doing, not watching.

The results were immediate. Within the first couple of months, PwC received more than 400 requests for additional sessions. They have since completed over 500 prompting parties, roughly ten per week, with 800 more in the pipeline. One early session featuring a guest speaker drew 22,000 attendees. Participants reported 20-40% efficiency gains after attending.

Crucially, the prompting parties changed what people were using AI for. Before, usage was shallow: summarisation, basic drafting. After, people started using the tools as creative partners, brainstorming collaborators and analytical assistants. As Houde put it, the sessions opened their aperture around what was possible.

The activator model

PwC also built a peer network of around 350 volunteer GenAI Super Users, overseen by more experienced My+ Activators. These were not AI experts. They were enthusiastic peers from across the firm who could translate AI capabilities into their team's specific language and workflows.

This matters more than it sounds. Hearing about AI from leadership bounces off. Hearing about it from the person who does your same job and just cut two hours off their Thursday routine lands differently. Peer translation beats top-down mandates every time.

The parallel to GE's Six Sigma programme decades earlier is striking. That initiative discovered the same thing: certification meant nothing without doing a real project under guidance and presenting results to leadership. The formula is not new. We just keep forgetting it every time a new technology arrives and we default back to the classroom.

The Four Patterns That Actually Work

The PwC story is the most visible example, but the patterns it reveals are consistent across every organisation we have seen succeed with AI adoption. Here is what actually moves the needle:

1. Safe spaces to experiment

People need dedicated time to try AI on real work problems with zero performance pressure. Not a sandbox with dummy data. Real problems from their actual role, but in an environment where getting it wrong is fine.

The format does not have to be a prompting party. It could be a weekly AI hour, a monthly hackathon or an afternoon workshop built around a specific workflow. The critical ingredient is protected time where experimentation is the point, not a distraction from the real job.

This is why our Copilot training sessions are built around practical exercises using real work scenarios rather than generic demos. When someone uses Copilot to solve a problem they actually have, the learning sticks in a way that watching a demonstration never does.

2. Peer translation over top-down mandates

Identify the people in your organisation who are naturally curious about AI. They exist in every team. Give them structure, permission and peers. Let them become the people others come to with questions.

You do not need 350 volunteers like PwC. Even two or three champions per department can create a flywheel effect. The key is that they are practitioners, not IT staff or external consultants. When someone in Finance shows another person in Finance how they use Copilot to analyse WIP data, it lands because the context is shared. We wrote about this with specific examples in our post on how accountants are actually using Microsoft Copilot.

3. Real workflows, not generic demos

The biggest mistake in AI training is teaching features instead of workflows. People do not need to know everything Copilot can do. They need to know how it helps with the specific report they produce every week, the specific email thread they manage every day, the specific meeting prep they do every Monday morning.

When we deliver training, the first thing we do is understand how the team actually works. What tools do they use? What takes up their time? Where are the bottlenecks? Then we build sessions around those real scenarios. A session for an accounting team looks completely different from a session for a marketing team, even though the underlying tool is the same.

Generic AI awareness workshops rarely stick because they answer questions nobody is asking yet. Workflow-specific training answers the question that matters: how does this help me with what I am doing right now?

4. Learn it, see it, try it, do it

Most companies stop at "learn it" and wonder why adoption stalls. They deliver a workshop, tick the box and move on. But learning a concept rarely changes behaviour on its own.

The full cycle looks like this:

  • Learn it: Understand what the tools can do and the fundamentals of prompting
  • See it: Watch someone in your role use it on a problem you recognise
  • Try it: Experiment yourself in a low-stakes environment with real work problems
  • Do it: Apply it in your actual workflow with support available when you get stuck

The "try it" phase is where most organisations have a gap. It is also where behaviour actually changes. PwC's prompting parties were essentially a scaled mechanism for this phase. Without it, you get awareness without adoption.

It Is Not a Training Problem. It Is an Environment Problem.

The most important question for leadership is not "have you trained your team on AI?" It is "have you created the conditions for your team to experiment with AI?"

If your team has Copilot licences but no dedicated time to explore them, you have an environment problem. If they have been through a workshop but nobody in their immediate team is using AI regularly, you have a peer translation problem. If they know what Copilot can do in theory but cannot connect it to their Tuesday afternoon, you have a workflow problem.

None of these are solved by more courses.

PwC learned this the hard way, with a billion-dollar investment and 95% voluntary engagement that still did not move usage patterns until they changed the environment. In February 2026, they went even further, launching a Learning Collective that embeds AI development directly into client work rather than treating it as separate training. Their Chief People Officer said periodic training programmes simply cannot keep pace with how quickly work is changing.

That tracks with everything we see in the organisations we work with. The ones getting the most from AI are not the ones with the most training hours. They are the ones that built experimentation into the rhythm of work.

What This Means for Your Organisation

If you are planning AI training, or have already delivered it and are wondering why adoption is flat, here is what we would suggest:

Start with workflows, not features. Before any training, understand how your team actually works. Where is the time going? What is repetitive? What is frustrating? Build your training around those answers rather than a generic curriculum.

Build in the "try it" phase. Whether you call them prompting parties, AI workshops or hands-on labs, create regular, protected time for people to experiment. Even a monthly 90-minute session makes a meaningful difference.

Find your champions. Look for the people already curious about AI. Give them a bit of structure and let them pull their colleagues forward. Peer influence is the most powerful adoption lever you have.

Keep going after the workshop. AI adoption is not a project with an end date. It is an ongoing capability that develops over time. Our AI implementation partnership model exists because we have seen repeatedly that the organisations getting the most from AI are the ones with sustained support, not one-off sessions.

Measure behaviour, not attendance. The metric that matters is not how many people completed the course. It is how many people are using AI in their actual work a month later. If you are not tracking that, you are measuring the wrong thing.

Training alone does not create adoption. The right environment does. And the good news is that creating that environment is neither expensive nor complicated. It just requires a deliberate decision to make space for it.

See how we approach Copilot training for businesses across every department and workflow.

At IQIT, we build our training around real work scenarios rather than generic AI awareness. Whether your team needs Copilot training that connects to their actual workflows, an AI readiness assessment to understand where to focus, or an ongoing AI implementation partnership to keep momentum going after the workshop ends, we can help. Book a free consultation to talk through what would work for your team.

Ready to get more from Microsoft 365?

Book a free consultation to talk through where you are and where you want to be. No pressure, no hard sell. Just an honest conversation.