Security training has a reputation for being dull. That is not a law of nature but a consequence of choices: modules too long, content too generic, tone too dry, no link to the employee's reality. Change those choices and you see measurably better completion, higher retention and, ultimately, safer behaviour. How do you build training that employees not only complete but actually learn from?
Why training becomes dull, and why that is a choice
Security training rarely becomes dull because of the topic; it becomes dull because of how it is built. The classic shape: one long module of forty minutes, the same for everyone, in a formal tone, ending with a multiple-choice quiz people click through on autopilot. Result: a tick, not a behaviour.
What causes this is not unwillingness but habit. Many awareness programmes started around a compliance requirement, and that requirement asks for completion, not effect. Ticks are easy to count; behaviour is harder to measure. If you steer only on completion, you get modules optimised for completion, not for learning.
Anyone wanting to do this differently starts with a simple shift: design for retention, not completion. A six-minute module that still lands three months later beats a forty-minute module forgotten next week. That shift touches every subsequent design decision: form, length, tone, audience.
Five ingredients of engaging training
Engaging training is much more achievable than often thought. Five ingredients do most of the work:
- Short modules. Four to eight minutes. People remember more from six short modules than from one long one. Microlearning is not a fad, it is how memory works.
- Story over bullet list. A module that opens with a concrete example ("a supplier calls to confirm a payment, and this happened at a peer organisation last month") lands differently than one that opens with learning objectives. People remember stories, not bullets.
- Role-based content. Finance gets a different example than IT, IT another than HR, HR another than operations. Generic "for everyone" content avoids upsetting anyone but reaches no-one either.
- Personally applicable examples. A module showing how to make your own Netflix, banking or WhatsApp safer is followed voluntarily. The knowledge travels to work automatically.
- A recognisable face. A visual identity or mascot gives the programme a face. Employees recognise it, refer to it, and associate security with something friendly rather than threatening. A Dutch example is Victor Veiligeit, a goat as the mascot of an awareness programme.
Gamification: when it works and when it backfires
Gamification can engage, but it does not always work and can even backfire. What does work: short challenges, badges for recurring good behaviour (such as reporting phishing), and team competitions where teams encourage each other to report more. What does not: a leaderboard of individual clicks (that is not a game, it is public shaming), or cosmetic points disconnected from real behaviour.
The rule is simple: gamification works for positive behaviour you want to see (reporting, noticing, being an ambassador). The moment you point it at negative behaviour (clicks, mistakes) you damage the reporting culture. A game that shows mistakes makes people hide them.
Test before you scale. What works in one organisation falls flat in another, and even within one organisation departments react differently. Pilot first, learn what works, scale only then.
Storytelling, ambassadors and familiar examples
People have learned from stories for thousands of years, and awareness is no exception. An anonymised story from your own organisation ('last month a phishing email arrived that looked exactly like our year-end close; these are the three signs Anne spotted') sticks longer than a generic explanation of phishing traits.
Use ambassadors from inside the organisation. A short video of a colleague describing how they reported a real attack lands stronger than a module with actors. Stories from your own people are credible, recognisable and free to produce.
Vary modes. A combination of short modules, a poster in the corridor, a physical quiz by the coffee machine and a Friday afternoon email covering one topic briefly works better than a single channel. Hybrid works: variation keeps the programme fresh and gives every employee a form that fits them.
What not to do
As important as what works is what demonstrably does not and therefore belongs on a "do not" list:
- One long annual training. Forty-minute modules get postponed and autopiloted. Break them up into microlearning blocks.
- Mandatory retraining after a click. Clicking on a simulation is a learning moment, not a punishment. Retraining as sanction undermines the reporting culture faster than any attack.
- Generic 'for everyone' modules. Efficient to produce, but they truly reach no-one. Make role-specific variants where possible.
- Sanctions for non-completion without context. A new joiner who misses a module in their first week is not a violator. Build in reasonableness.
- Content older than a year. Attack techniques change fast. A 2023 module still teaching 'mistakes in an email are a sign of phishing' teaches the wrong pattern: AI-generated phishing often contains no mistakes at all. Refresh at least every six months.
How to measure whether it is working
Three numbers tell you whether your training is truly engaging. Completion rate speaks to accessibility; voluntary follow-up modules speak to enthusiasm; report rate in phishing simulations speaks to what sticks.
Add two soft signals: feedback from short post-module surveys ('was this useful?'), and the number of colleagues signing up as ambassadors. A rising number of ambassadors is a strong sign the programme is resonating.
Make these numbers visible to the programme team and to the board. Not as KPI cosmetics but as steering input. A programme that honestly looks at itself every quarter and adjusts stays fresh. A programme that just ticks itself off inevitably falls asleep.
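As a concrete illustration, the three hard numbers above can be computed from a handful of counts each quarter. This is a minimal sketch with hypothetical figures and field names (no specific LMS or phishing platform is assumed):

```python
def training_metrics(assigned, completed, voluntary_started,
                     simulations_sent, simulations_reported):
    """Return the three hard numbers as percentages.

    assigned / completed        -> completion rate (accessibility)
    voluntary_started           -> voluntary follow-up rate (enthusiasm)
    simulations_reported / sent -> report rate (what sticks)
    """
    return {
        "completion_rate": round(100 * completed / assigned, 1),
        "voluntary_follow_up_rate": round(100 * voluntary_started / assigned, 1),
        "phishing_report_rate": round(100 * simulations_reported / simulations_sent, 1),
    }

# Example quarter (invented numbers): 400 staff assigned, 368 completed,
# 60 opened a voluntary follow-up module; 400 simulation mails, 148 reported.
print(training_metrics(400, 368, 60, 400, 148))
```

Tracking these as a per-quarter trend, rather than a single snapshot, is what turns them into steering input: a report rate climbing from one quarter to the next says more than any absolute value.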
See how 2LRN4 turns this topic into a workable programme with training, phishing simulation and management reporting.
View the training page
Related articles
- When gamification works in awareness
- Microlearning for employees with limited time
- How to build a security culture
- Getting employees to take security training
FAQ
How do I make security training less dull?
Start with short modules of four to eight minutes instead of one long one. Use stories and concrete examples instead of bullet lists. Make content role-based and personally applicable. Give the programme a recognisable face or mascot. Vary the channels.
Does gamification work in security training?
Sometimes, not always. It works for positive behaviour (reporting, noticing, being an ambassador). It backfires when used for negative behaviour (making clicks visible). Test in a small pilot before rolling out broadly.
How long should a module last?
Four to eight minutes. Beyond ten minutes, completion and retention demonstrably drop. Microlearning works better than one long session, even when total time is the same.
Do stories really help?
Yes. People remember stories far better than bullet lists. An anonymised example from your own organisation sticks longer than a generic explanation. Use real examples where it is safe to do so.
Should I use a mascot?
Not required, but effective. A recognisable visual identity gives the programme a face and makes references easier. It makes security less threatening and more approachable. A Dutch example is Victor Veiligeit, a goat as awareness mascot.
What is the most important thing not to do?
Never punish a click in a phishing simulation with mandatory retraining or visibility to a manager. It undermines reporting culture, and reporting culture is your main defence against real attacks.
How do I measure whether my training is engaging?
Combine three hard numbers (completion rate, voluntary follow-up modules, report rate in phishing simulations) with two soft signals (per-module feedback, number of registered ambassadors). A rising number of ambassadors is a strong sign of resonance.
External source: NCSC - Awareness resources