
Stop Building Two Curricula When You Need One

Your AI curriculum shouldn't collapse when the client switches from Microsoft to Google. How a shared-core, swappable-tool-layer architecture future-proofs training design.

A training team I know recently discovered, two days before delivering a workshop, that their client uses Google Workspace instead of Microsoft. The entire programme — demos, videos, hands-on exercises, screenshots — was built around Copilot and Power Automate. Panic ensued. Could the facilitator deliver in Zapier instead? Could they record new demo videos for Gemini by Thursday? Could someone find out exactly which Google tools the learners had access to?

This wasn't a failure of planning. It was a failure of architecture. The curriculum was coupled to a specific tool, and every new client on a different stack would trigger the same fire drill.

The proposed solution? Build Gemini-specific demos alongside the Copilot ones. Maintain parallel content tracks.

That scales terribly. The next client is on Zapier with no AI assistant. The one after that uses Make. Someone else has a Microsoft license but their IT department has blocked Copilot. Every variation means more content to create, more to maintain, and more to break.

The architecture problem

Most AI training curricula are structured around tools. Module 1: Introduction to Copilot. Module 2: Building workflows in Power Automate. Module 3: Advanced features.

This means the tool is load-bearing. Remove it and the curriculum collapses. Every concept is taught through the lens of a specific interface, every exercise assumes a specific platform, every screenshot shows a specific product.

But here's the thing: the valuable content in any good AI curriculum is almost entirely tool-agnostic. Deciding whether a process is worth automating. Mapping a workflow. Evaluating feasibility. Designing for reliability. Testing. Governance. Communicating ROI. None of that changes based on whether the learner is using Power Automate or Zapier or n8n or Make.

Shared core, swappable tool layer

The fix is an architectural decision, not a content decision. Separate the curriculum into two layers:

The shared core is everything that's transferable: frameworks, principles, decision-making, evaluation, communication. This is the majority of the curriculum's value and it doesn't change regardless of platform. Feasibility assessment works the same way whether you're evaluating a Power Automate workflow or a Zapier zap. Governance questions are identical. ROI calculations don't care which tool produced the automation.

The tool layer is the platform-specific content: demo videos, interface walkthroughs, screenshot-heavy guides, tool-specific build exercises. This is a thin layer that sits on top of the shared core. It's important — learners do need to know how to use their actual tools — but it's swappable.

When a new client arrives on Google Workspace, you swap the tool layer. The shared core stays exactly the same. Instead of rebuilding a programme, you're recording a few new demo videos and updating the platform-specific micro-build instructions.
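If you keep your curriculum map in a spreadsheet or a bit of code, the two-layer idea can be sketched in a few lines. This is a minimal illustration only — the module and layer names below are hypothetical examples, not a real catalogue:

```python
# The shared core never changes; each tool layer is a thin, swappable list.
SHARED_CORE = [
    "Deciding what to automate",
    "Mapping a workflow",
    "Feasibility and reliability",
    "Governance, testing, and ROI",
]

# Hypothetical tool layers keyed by client platform.
TOOL_LAYERS = {
    "Microsoft": ["Copilot demo videos", "Power Automate build exercises"],
    "Google": ["Gemini demo videos", "Google Workspace build exercises"],
}

def assemble_curriculum(platform: str) -> list[str]:
    """Combine the unchanging core with one swappable tool layer."""
    return SHARED_CORE + TOOL_LAYERS.get(platform, [])
```

Swapping clients means swapping one dictionary entry; everything in `SHARED_CORE` ships unchanged.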

The terminology bridge

One practical detail that makes this work: a terminology mapping table. Every automation platform uses different words for the same concepts. A "Zap" in Zapier is a "Flow" in Power Automate is a "Scenario" in Make. A "Filter" in Zapier is a "Condition" in Power Automate. A "Trigger" is universal but the configuration differs.

Include a simple translation table in the tool layer and suddenly learners can follow the shared core's conceptual content regardless of which platform they're using. They understand that a "trigger-filter-action" pattern is the same thing everywhere — only the button labels change.
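In practice the bridge can be as simple as a lookup table. Here is a minimal sketch using the labels mentioned above (the structure and function name are illustrative, not a prescribed format):

```python
# Terminology bridge: one shared-core concept mapped to each platform's label.
TERMINOLOGY = {
    "automation": {"Zapier": "Zap", "Power Automate": "Flow", "Make": "Scenario"},
    "condition":  {"Zapier": "Filter", "Power Automate": "Condition", "Make": "Filter"},
    "trigger":    {"Zapier": "Trigger", "Power Automate": "Trigger", "Make": "Trigger"},
}

def translate(concept: str, platform: str) -> str:
    """Return the platform-specific label for a shared-core concept."""
    return TERMINOLOGY[concept][platform]

print(translate("automation", "Make"))  # prints "Scenario"
```

Learners read down a column for their platform; facilitators teach across the row, in concepts.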

The workshop becomes platform-agnostic

This architecture has a powerful side effect: your workshops don't need to be platform-specific.

If the workshop is about discussing feasibility trade-offs, diagnosing workflow failures, or presenting a business case to peers — none of that requires everyone to be on the same platform. A Power Automate user and a Zapier user can have a productive debate about whether a workflow is worth automating, because the question has nothing to do with which tool they'd build it in.

The tool-specific building happens in self-study, where learners can follow along with platform-specific videos and guides at their own pace. The synchronous time is for the thinking that's the same regardless of the tool.

Future-proofing

This isn't just about today's tool diversity. It's about tomorrow's. The automation landscape is shifting rapidly. AI-native platforms are emerging that let you describe what you want and the tool builds it for you. Today's manual workflow builders may look very different in eighteen months.

A curriculum coupled to Power Automate is fragile. A curriculum built on transferable thinking with a swappable tool layer is resilient. When the next platform shift happens — and it will — you update the tool layer. The core curriculum, and everything the learners actually needed to learn, remains intact.

Want AI adoption that actually sticks?

Every engagement starts with a conversation — no pitch, no generic playbook. Let's talk about what your team is actually trying to change.

Book a Call with Javan →

Note: This article reflects the author's experience and perspective. For guidance specific to your organisation, book a call.