Slides & Decks pack
Claude Skill
Training Deck Builder
Builds an internal training or onboarding deck with learning objectives, examples, exercises, and a quiz.
What it does
Given the topic, audience, and time budget, produces a training deck outline + python-pptx scaffolding structured around real learning. Forces explicit learning objectives, builds in worked examples, includes hands-on exercises, and ends with a quiz that tests retention. Designed for the version of training where people actually learn the thing.
When to use
- ✓ New-hire onboarding deck on a specific system, process, or domain
- ✓ Cross-team enablement (e.g., product team training sales on a new feature)
- ✓ Replacing a 60-slide lecture deck with something that produces actual learning
When not to use
- ✗ Quick announcement or update — overkill
- ✗ You don't actually know the topic well — training requires expertise
- ✗ Audience already knows it — you're wasting their time
Install
Download the .zip, then unzip into your Claude skills folder.

```shell
mkdir -p ~/.claude/skills
unzip ~/Downloads/training-deck-builder.zip -d ~/.claude/skills/
# Restart your Claude Code session.
# The skill is now available — Claude will use it when relevant.
```
SKILL.md
---
name: training-deck-builder
description: Use when building an internal training, onboarding, or enablement deck. Triggers on "training deck", "onboarding deck", "enablement deck", "training session", "training material".
---
# Training Deck Builder
Most training decks fail because they're lectures. The presenter talks for 50 minutes, the audience zones out, and a week later nobody can do the thing they were "trained" on. A real training deck is built around what learners do, not what the trainer says.
## Required inputs
1. **Topic** — what's being taught, specifically
2. **Audience** — who, how many, what they already know
3. **Time budget** — total minutes (training >90 min should be split)
4. **End-state** — what learners can DO after this session that they couldn't before
5. **Format** — live in-room, live remote, async/recorded
If end-state is vague ("understand X"), push back: "What would they be able to do? 'Deploy a service to staging' is testable. 'Understand deployment' is not."
## Slide-by-slide structure (15-25 slides for a ~60-min session)
- **Slide 1 — Title**: Topic, audience, duration.
- **Slide 2 — Why this matters**: The specific moment in their work where this knowledge unblocks them. Not generic.
- **Slide 3 — Learning objectives**: 3-5 things they will be able to DO by the end. Verbs: "deploy", "diagnose", "explain to a customer", not "understand."
- **Slide 4 — Pre-check**: Quick questions to surface what they already know. Calibrates pace.
- **Slides 5-7 — Mental model**: The 1-2 concepts they need before the mechanics. Diagram beats prose.
- **Slides 8-10 — Worked example #1**: Walk through a concrete instance step-by-step. Their world, their data.
- **Slide 11 — Exercise #1**: Hands-on. They do it, you watch. Specific success criteria.
- **Slides 12-14 — Worked example #2 (more complex)**: Build on the first. Add the wrinkle that comes up in real work.
- **Slide 15 — Exercise #2**: Harder. Closer to real conditions.
- **Slides 16-17 — Common pitfalls**: The 3-5 mistakes new people make. Why they happen. How to spot them.
- **Slide 18 — Reference card**: One slide they'd actually save. Cheatsheet they can pin to their wall.
- **Slide 19 — Quiz**: 5-8 questions testing the learning objectives. Multiple choice + 1-2 short answer.
- **Slide 20 — Where to go next**: Docs, channels, who to ask. Specific names + links.
- **Slide 21 — Q&A**: Open floor.
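The slide plan above can be encoded as plain data and rendered as the markdown outline the skill emits first. A minimal sketch; the slide titles come from the list above, the one-line notes and function names are illustrative, not part of the skill:

```python
# Slide plan as (title, note) pairs, mirroring the structure above.
# Multi-slide blocks (e.g. "Slides 5-7") are collapsed to one entry here.
SLIDE_PLAN = [
    ("Title", "Topic, audience, duration"),
    ("Why this matters", "The specific unblocking moment in their work"),
    ("Learning objectives", "3-5 testable, verb-first objectives"),
    ("Pre-check", "Quick questions to calibrate pace"),
    ("Mental model", "The 1-2 core concepts; diagram beats prose"),
    ("Worked example #1", "Concrete, step-by-step, their data"),
    ("Exercise #1", "Hands-on, with specific success criteria"),
    ("Worked example #2", "Builds on the first, adds the real-world wrinkle"),
    ("Exercise #2", "Harder, closer to real conditions"),
    ("Common pitfalls", "The 3-5 mistakes new people make"),
    ("Reference card", "The one slide they'd actually save"),
    ("Quiz", "5-8 questions testing the objectives"),
    ("Where to go next", "Docs, channels, named people"),
    ("Q&A", "Open floor"),
]

def outline_markdown(plan):
    """Render the plan as the markdown outline (output #1)."""
    lines = []
    for i, (title, note) in enumerate(plan, start=1):
        lines.append(f"## Slide {i}: {title}")
        lines.append(note)
    return "\n".join(lines)
```

Keeping the plan as data means the same structure can later feed the python-pptx script instead of being retyped.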
## Narrative principles
- **Objectives are verbs, testable.** "Can configure X without help" beats "knows about X."
- **Show then do, don't tell then test.** Worked examples come before exercises.
- **Build complexity, don't dump it.** First example simple, second example real-world.
- **Cheatsheet beats memorization.** Real work happens with reference materials open.
## Exercise design
For every exercise, specify:
- **What they do** — concrete task
- **What success looks like** — observable
- **How long** — bounded
- **What to do if stuck** — peer help, ask trainer, skip and come back
If a 60-min session has zero exercises, it's a lecture. Push back.
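One way to keep exercises honest is to represent each one as structured data and check the four fields above before the deck is built. A small sketch under stated assumptions: the field names, the 20-minute upper bound, and the `validate` helper are illustrative choices, not part of the skill:

```python
from dataclasses import dataclass

@dataclass
class Exercise:
    task: str              # what they do (concrete)
    success_criteria: str  # what success looks like (observable)
    minutes: int           # how long (bounded)
    if_stuck: str          # fallback: peer help, ask trainer, skip

    def validate(self):
        """Return a list of problems; empty list means the spec is complete."""
        problems = []
        if not self.task.strip():
            problems.append("task is empty")
        if not self.success_criteria.strip():
            problems.append("no observable success criteria")
        if self.minutes <= 0 or self.minutes > 20:
            problems.append("time box should be 1-20 minutes")
        if not self.if_stuck.strip():
            problems.append("no fallback for stuck trainees")
        return problems
```

An exercise that fails validation is a signal the session is drifting back toward lecture.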
## Quiz design
5-8 questions covering each learning objective. Mix of:
- Multiple choice (fast, easy to grade)
- Short answer / "explain to a teammate" (forces real understanding)
- One scenario question ("here's a situation, what would you do?")
The quiz isn't gatekeeping — it's a check on whether the training worked. If most people miss question 4, the section on objective 3 needs work.
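The "if most people miss question 4" check can be automated from quiz results. A minimal sketch, assuming results are recorded per trainee as question-ID-to-correct booleans and each question maps to one learning objective; the 50% threshold is an arbitrary choice, not from the skill:

```python
from collections import defaultdict

def flag_weak_objectives(results, question_objective, threshold=0.5):
    """results: list of {question_id: bool}, one dict per trainee.
    question_objective: maps question_id -> learning objective label.
    Returns objectives whose questions were missed by more than
    `threshold` of trainees, i.e. the sections that need rework."""
    misses = defaultdict(int)
    totals = defaultdict(int)
    for trainee in results:
        for qid, correct in trainee.items():
            totals[qid] += 1
            if not correct:
                misses[qid] += 1
    weak = set()
    for qid, n in totals.items():
        if misses[qid] / n > threshold:
            weak.add(question_objective[qid])
    return sorted(weak)
```

The flagged objectives point the trainer back at the worked-example and exercise slides for that section.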
## Anti-patterns (strip these)
- 50 slides of lecture for a 60-min session — too dense, too passive
- "Any questions?" as the only interactivity — that's not engagement
- Vague objectives ("understand", "know about") — untestable
- Skipping the worked example to "save time" — the example IS the training
- One generic example for a diverse audience — make it their world
## Output
1. The slide-by-slide outline as markdown
2. A python-pptx script generating the .pptx in 16:9 with section dividers, exercise slides marked clearly, and a printable cheatsheet slide
3. A separate exercise handout doc (markdown) trainees can work from
4. A quiz doc with answer key for the trainer
5. A "what to send 24h before" pre-read so the live session can move faster
The trainer refines the deck in PowerPoint or Google Slides; the python-pptx output is the structured starting point.
Example prompts
Once installed, try these prompts in Claude:
- Training deck for new engineers on our deployment system. 1 hour, 8 attendees, mix of senior and junior. They need to be able to ship to staging by end of session.
- Build an enablement deck training the GTM team on our new pricing model. 45 minutes, 30 reps.
Related prompts
Don't want to install a skill? These prompts in /prompts cover similar ground for one-shot use: