
What every regulation now requires from your cybersecurity training program — and why completion rates fail all of them

Modern cybersecurity regulations have shifted from "completion" to "competence," leaving organizations legally vulnerable when they prioritize annual check-boxes over actual behavioral change. Discover the five documentation gaps that fail regulatory scrutiny and how to build a training program that is truly defensible after an incident.
Written by Natalia Bochan

TL;DR

Every major cybersecurity regulation requires employee training. Most organizations deliver it and log the completion. And yet breaches that make headlines keep happening at regulated, audited, certified organizations — through exactly the channels training is supposed to close. The problem is not that organizations are skipping training. The problem is that regulations have moved toward an effectiveness standard, and most programs are still built around a completion standard. This post maps what each framework actually requires, where that gap appears, and what a defensible program looks like today.

The question neither the audit nor the certificate answers

In September 2023, attackers compromised MGM Resorts through a phone call. A 10-minute conversation with the IT helpdesk — classic social engineering — gave the Scattered Spider group everything they needed to initiate a ransomware attack that cost the company over $100 million and disrupted operations across its properties for days. (Source: Specops Software, 2023)

MGM is a PCI DSS-regulated organization. Payment card compliance explicitly requires security awareness training for all personnel.

In February 2024, Change Healthcare suffered the largest healthcare data breach in US history, affecting an estimated 190 million individuals. (Source: HHS OCR, hhs.gov) Change Healthcare held HITRUST certification and operated under HIPAA — a framework with explicit, documented security training requirements — at the time of the breach.

The question worth asking about both incidents is not whether training was completed. It almost certainly was; both organizations operated under frameworks that made it mandatory and auditable. The question is what the training was actually designed to measure — and whether that measurement aligns with what the regulations are increasingly asking for.

That gap is the subject of this post.

What regulators are actually measuring — and it is not completion

The dominant assumption in most compliance programs is that training requirements mean training delivery. Assign the module. Log the completion. Pass the audit.

That assumption is increasingly wrong. The regulatory language across NIS2, DORA, ISO 27001:2022, and updated enforcement guidance from HIPAA and NYDFS has shifted in a specific direction: from attendance to effectiveness, from completion to competence, from documentation of activity to evidence of behavioral change.

Here is what each framework actually requires.

NIS2: from awareness to effectiveness, with management accountability

The NIS2 Directive, which EU member states were required to transpose into national law by October 2024, contains two separate training obligations that most compliance programs treat as one.

Article 21(2)(g) requires essential and important entities to implement "cyber hygiene practices and cybersecurity training" as part of their risk management measures. This is the clause most programs point to when they show module completion logs.

Article 20 is the one that gets missed. It requires management body members — the board, the C-suite, senior leadership — to undergo cybersecurity training specifically. Not a general awareness module sent to all staff. A distinct program, targeted at leadership, covering the cybersecurity risks and practices relevant to how they govern the organization.

ENISA's NIS2 Technical Implementation Guidance describes the expected standard as training that demonstrates effectiveness, not just delivery. Frequency is not mandated to a specific number, but ENISA guidance recommends annual training at minimum, supplemented by phishing simulations and updated content in response to new threat patterns. (Source: ENISA NIS2 Technical Implementation Guidance, enisa.europa.eu)

An organization that has 100% completion on an annual module but no management-specific training record, no simulation data, and no evidence of content updates tied to threat intelligence has not met the full scope of NIS2's requirements.

DORA: adequacy as a legal standard, not a target

The Digital Operational Resilience Act became applicable on January 17, 2025. It applies to financial entities: banks, insurers, investment firms, payment processors, and their critical ICT third-party providers operating in the EU.

DORA requires organizations to provide ICT risk management training and awareness programs for all staff and management. The specific requirement is that training be "adequate" — not that it be completed.

Adequacy is a legal standard with implications. It means training must be proportionate to the actual ICT risk exposure of the role. It means a relationship manager who handles wire transfers faces different social engineering risk than a back-office analyst, and the training program must reflect that. Generic annual modules distributed uniformly across the organization are not adequate in the DORA sense. They may be compliant in a narrow, literal reading. They are not defensible under investigation.

When DORA's competent authorities review a financial entity following an ICT incident, inadequate training is explicitly one of the factors they can cite as evidence of deficient risk management — with consequences for enforcement and liability.

ISO 27001:2022: competence is not a certificate

ISO 27001:2022 Annex A.6.3 covers information security awareness, education, and training. The standard does not require annual training. It requires that organizations determine the necessary competence of persons whose work affects information security performance — and then take action to acquire or maintain that competence.

The word is competence. Not completion, not awareness, not attendance. Competence: the demonstrated ability to apply knowledge and skills to perform a task correctly.

An ISO 27001 auditor asking about Annex A.6.3 is not looking for a list of names and dates. They are looking for evidence of a process: how does the organization identify what competence is needed, how does it assess current competence levels, how does it close gaps, and how does it verify that gaps have been closed?

A completion log answers none of those questions.

NYDFS 23 NYCRR 500: training that keeps pace with threats

The New York Department of Financial Services Cybersecurity Regulation (23 NYCRR 500) requires annual cybersecurity awareness training for all personnel. This sounds like the most traditional of the frameworks — one training cycle per year, documented. But section 500.14(b) adds a specific requirement that most programs quietly fail: the training must include social engineering as a topic, and it must be updated based on the results of risk assessments.

That second clause is where compliance programs fall short. A static module purchased three years ago and re-assigned annually does not satisfy a requirement to update training based on current risk findings. If a risk assessment identifies that credential phishing via voice calls is an emerging exposure in the organization — as it demonstrably is for financial institutions — the training program must respond to that.

HIPAA: ongoing is not the same as annual

HIPAA's Security Rule requires covered entities to implement a security awareness and training program for all members of the workforce. Unlike some frameworks, HIPAA does not specify a frequency. What it does specify is ongoing: training must be updated and reinforced when working practices or technology change, when a risk assessment identifies a gap, and when HHS issues new guidance.

The practical implication is that a HIPAA-compliant training program is one that treats training as a continuous operational process rather than a calendar event. An organization that assigns training annually, runs no simulations, and does not update content when its technology environment changes is not meeting HIPAA's standard — even if every employee completed the module.

SOC 2: the auditor looks for evidence, not certificates

SOC 2's common criteria section CC1.4 requires organizations to demonstrate that personnel have the competence to fulfill their responsibilities, including those related to information security. SOC 2 auditors reviewing training programs examine records: who was trained, on what, when, and whether the training addressed the specific control environment. They also look for evidence of follow-up when risks are identified.

A completion log that shows 100% training assignment is a starting point, not a conclusion.

Summary: what each framework actually measures

| Regulation | Training requirement | Effectiveness language | Documentation auditors want |
|---|---|---|---|
| NIS2 (EU) | All staff + management separately (Art. 20) | Yes — cyber hygiene practices must be implemented effectively | Completion records, management training records, simulation results, content update log |
| DORA (EU, financial) | All staff and management, ICT risk-specific | Yes — training must be "adequate" | Role-based training records, evidence of risk-proportionate content |
| ISO 27001:2022 | Competence-based, role-differentiated | Yes — "competence" standard, not completion | Competence gap assessments, training records, evidence of gap closure |
| NYDFS 23 NYCRR 500 | Annual, must include social engineering | Partial — must be updated per risk assessment | Completion records, risk assessment integration, content update log |
| HIPAA | Ongoing, updated per change or assessment | Partial — "ongoing" implies more than annual | Workforce-wide records, evidence of updates tied to risk findings |
| SOC 2 (CC1.4) | Competency-based, aligned to control environment | Partial — auditors look for evidence, not just logs | Completion records, follow-up documentation, role-specific coverage |

Why completion rates satisfy auditors but not regulators

There is a difference between passing an audit and being defensible after an incident.

Audits are typically conducted prospectively — before something goes wrong. Auditors review the documentation, confirm the process exists, and issue an opinion on whether controls are in place. A well-maintained completion log, a signed policy acknowledgment, and a training calendar satisfy most audit procedures.

Regulatory investigations run differently. They are triggered by incidents. They are conducted retrospectively — after something has already gone wrong. And they are asking a different question: given that this breach occurred through a human behavior channel, was the organization's training program adequate to prevent it?

That question exposes the gap. "Adequate" in the regulatory sense means the training addressed known risks, was updated in response to new threats, was proportionate to the roles involved, and generated evidence of behavioral change — not just attendance.

Human behavior remains central to how most incidents unfold. The 2024 Verizon Data Breach Investigations Report found that 68% of breaches involved the human element — error, social engineering, or misuse of credentials. (Source: Verizon DBIR 2024, verizon.com) This figure has held roughly constant across several reporting years. Annual training completion rates have not moved it.

The frameworks that now use effectiveness language — NIS2, DORA, ISO 27001 — are a direct regulatory response to this pattern. Policymakers have observed that mandatory training, measured only by completion, has not produced the behavioral outcomes it was intended to produce. The regulatory language is shifting in response.

The five documentation requirements that most programs miss

When a regulatory investigation follows an incident, or when a thorough ISO 27001 audit is underway, five documentation gaps appear most consistently. These are not edge cases. They are the gaps between a program built to check a box and one built to withstand scrutiny.

1. A risk-based content update log

Every framework with an effectiveness or adequacy standard — DORA, NYDFS, HIPAA, NIS2 — implies that training content responds to known and evolving risks, not just to the calendar. This requires a documented process: what risk assessment input triggered a content update, when was it updated, and what changed.

Without this log, the program cannot demonstrate risk responsiveness. It can only demonstrate that training was assigned.
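As a rough illustration, a content update log can be as simple as a structured record per change. The field names below are a hypothetical sketch, not drawn from any framework's text; the point is that each entry links a risk signal to a dated content change.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of a risk-based content update record.
# Field names are illustrative assumptions, not regulatory terms.
@dataclass
class ContentUpdateRecord:
    updated_on: date            # when the training content changed
    trigger: str                # risk signal that prompted the update
    source: str                 # e.g. "Q1 risk assessment", "threat intel feed"
    modules_changed: list[str]  # which training modules were revised
    summary: str                # what changed, in one line

log: list[ContentUpdateRecord] = []
log.append(ContentUpdateRecord(
    updated_on=date(2025, 4, 2),
    trigger="rise in voice-phishing attempts against helpdesk",
    source="Q1 risk assessment",
    modules_changed=["social-engineering-basics", "helpdesk-verification"],
    summary="Added vishing scenarios and caller-verification procedure",
))
```

Even a spreadsheet with these five columns would answer the auditor's question; what matters is that the trigger column exists and is populated.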

2. Role-differentiated training records

NIS2 Article 20 requires management body training as a distinct obligation. ISO 27001:2022 requires competence to be determined by role. DORA requires adequacy relative to ICT risk exposure.

All three require the organization to track training at the role level, not just as an aggregate. A single log showing "all employees completed the annual module" does not demonstrate that the CFO received management-level cybersecurity training, that the system administrator received elevated-access training, or that the helpdesk team received social engineering simulation training proportionate to their exposure.

3. Phishing and simulation data tied to individual risk scoring

Simulation programs generate the behavioral data that effectiveness-oriented regulators are looking for. A phishing simulation that shows aggregate click rates tells you something. A program that tracks individual behavior over time, identifies persistent risk patterns by person or team, and feeds that data into adaptive training decisions tells you everything an auditor is looking for.

Aggregate simulation data passed the audit bar five years ago. The emerging standard asks for longitudinal behavioral measurement at the individual or role level.
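To make the difference concrete, here is a minimal sketch of longitudinal scoring: per-person simulation results over time, reduced to a score that weights recent behavior more heavily. The recency-weighting scheme and the sample data are assumptions for illustration, not a regulatory formula.

```python
from collections import defaultdict

# Illustrative simulation results: (person, round, clicked_phish).
events = [
    ("alice", 1, True), ("alice", 2, True), ("alice", 3, True),
    ("bob",   1, True), ("bob",   2, False), ("bob",  3, False),
]

# Build each person's click history in chronological order.
history = defaultdict(list)
for person, rnd, clicked in sorted(events, key=lambda e: e[1]):
    history[person].append(clicked)

def risk_score(results: list[bool]) -> float:
    # Recent failures count more than old ones (linear recency weights).
    weights = range(1, len(results) + 1)
    total = sum(weights)
    return sum(w for w, clicked in zip(weights, results) if clicked) / total

scores = {person: risk_score(results) for person, results in history.items()}
```

A flat click rate would score both employees at 2/3 and 1/3; the longitudinal view instead shows that one person's risk is persistent while the other's is declining — which is the distinction an effectiveness-oriented reviewer is asking about.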

4. A documented remediation path for repeat exposure

Regulatory frameworks do not only require training; they require the organization to respond to identified gaps. An employee who clicks every phishing simulation and completes the assigned remedial module without behavioral change is a documented risk. The program must show what the organization does next — whether that is targeted intervention, role reassignment, or escalation to the manager.

Without documented remediation logic, the program cannot demonstrate that it closes the loop between identified risk and organizational response.
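The remediation logic itself can be very simple, as long as it is explicit. The escalation ladder below is a hypothetical example — the thresholds and interventions would come from the organization's own risk function — but it shows what "documented remediation path" means in practice: a deterministic mapping from repeat exposure to organizational response.

```python
# Hypothetical escalation ladder for repeat simulation failures.
# Thresholds and interventions are illustrative assumptions only.
def remediation_step(consecutive_failures: int) -> str:
    if consecutive_failures <= 1:
        return "standard refresher module"
    if consecutive_failures == 2:
        return "targeted micro-training on the failed scenario"
    if consecutive_failures == 3:
        return "manager notification + coached walkthrough"
    # Persistent failure beyond three rounds is treated as an
    # organizational risk, not a training problem.
    return "escalation to security team for access review"
```

Writing the ladder down — even as a one-page policy — is what lets the program demonstrate that identified risk produces a response rather than another module assignment.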

5. Management training records with content specificity

For NIS2 Article 20, and its equivalents in DORA and ISO 27001, management-level training records must show what was covered — not just that a session occurred. The management training obligation exists because boards and executives make decisions that affect the security posture of the organization. Regulators want evidence that leadership understands the cybersecurity risks relevant to those decisions.

A calendar entry that says "cybersecurity awareness session, Q4" does not satisfy this. A record that documents participants, topics, governance-level risk scenarios covered, and follow-up actions does.

What a defensible training program looks like in 2026

A defensible training program is one that produces evidence of behavioral change, not just evidence of delivery.

That requires four structural shifts from how most programs are built today.

Continuous measurement replaces calendar events. A program that runs annual training and quarterly phishing simulations is better than one that runs annual training alone. A program that measures behavioral signals continuously — simulation performance, incident reporting rates, repeat failure patterns, role-based risk scores — can demonstrate ongoing effectiveness rather than point-in-time compliance. This is the direction regulators are signaling, and it is already explicit in DORA's "adequacy" standard and ISO 27001's "competence" standard.

Behavioral baselines make change visible. A compliance program that cannot show where an employee's behavior started and where it is now cannot demonstrate that training produced any effect. Establishing behavioral baselines by role, by team, and across the organization turns training from an input metric into an outcome metric. An auditor or investigator asking whether training was effective has an answer: here is where the risk score was, here is where it is now, here is what changed.
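In its simplest form, a baseline comparison is just a dated before-and-after measurement per team. The click rates below are made-up illustrations; the structure is what turns "training happened" into "behavior changed".

```python
# Sketch: baseline vs. current phishing-simulation click rates per
# team. All numbers are illustrative, not real measurements.
baseline = {"helpdesk": 0.34, "finance": 0.21, "engineering": 0.12}
current  = {"helpdesk": 0.18, "finance": 0.19, "engineering": 0.13}

# Negative delta = click rate fell, i.e. behavior improved.
change = {
    team: round(current[team] - baseline[team], 2)
    for team in baseline
}
```

This is the answer an investigator is asking for: here is where the risk was, here is where it is now, team by team.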

Risk integration keeps content relevant. Training content that does not respond to the organization's actual threat environment cannot meet the "adequacy" or "risk-based update" requirements that NIS2, DORA, HIPAA, and NYDFS impose. The documentation requirement — what risk signal triggered what content update — demands a live connection between the threat intelligence function and the training program. In practice, this means the security team and the learning program need to share data regularly, not communicate once a year during curriculum review.

Management accountability is documented separately. The NIS2 Article 20 obligation is a formal governance requirement, not an add-on to general awareness training. Treating it as one means the organization has a documented exposure in the event of an incident. Management training should have its own records, its own curriculum — focused on governance-level decisions, risk oversight, and regulatory accountability — and its own update cadence.

These are not aspirational characteristics of a high-maturity program. They are the characteristics a regulated organization needs to demonstrate when a regulator reviews its training program after an incident.

Completion was never the point

The frameworks that now govern cybersecurity training — NIS2, DORA, ISO 27001:2022, HIPAA, NYDFS — were not written to check whether modules were assigned and completed. They were written to require organizations to manage human behavior as a security control.

That means establishing competence, not logging attendance. It means updating programs in response to real risk, not in response to the calendar. It means documenting individual behavior over time, not aggregate completion rates.

Most organizations are running programs that answer the wrong question. Audits pass because auditors are reviewing process documentation, not behavioral outcomes. Incidents happen because human exposure — the gap between what people know and how they behave under pressure — was never actually closed.

Closing that gap is what regulation now requires. It is also what security outcomes require. Those two things have never been more aligned, and most programs have never been further from meeting either one.

Zepo Intelligence is building the closed-loop architecture that connects behavioral signals to adaptive learning — in real time, at the individual level. If your training program needs to move from completion records to behavioral evidence, let's discuss how Zepo measures what regulators are starting to require.
