
How to Evaluate the Long-Term Impact of a Security Awareness Program?

Learn how to measure, track, and improve the long-term impact of your security awareness program for sustained risk reduction.
Published on
September 10, 2025
Category
Cybersecurity Training

Beyond Check-the-Box: The Need for Lasting Security Awareness

Security awareness training has become a staple in organizational risk management. From mandatory annual phishing courses to compliance-driven workshops, companies invest time and resources into educating employees about cybersecurity. But after the quizzes are passed and certificates collected, a critical question remains: is the program actually making a long-term difference? Many organizations still gauge success by superficial metrics, for example, the percentage of employees who completed the training, which is merely a compliance checkbox. This approach reveals little about whether employees’ day-to-day security behaviors have truly improved. In fact, studies have found that while 84% of organizations aim to change employee behavior through awareness programs, far fewer consistently monitor if those behavior changes occur in practice. Simply put, checking the training box doesn’t guarantee a safer workforce when real threats emerge.

Evaluating the long-term impact of a security awareness program means looking beyond immediate test scores or attendance records. The real measure of success is whether employees internalize secure habits and sustain them over time, contributing to fewer security incidents and a stronger security culture. This is not an overnight task. Cultivating genuine security-minded behavior is a gradual process that requires ongoing reinforcement and measurement. Yet, many companies struggle with how to measure these human factors. As a National Institute of Standards and Technology (NIST) study noted, organizations often rely on easily collected metrics like training completion rates even though such compliance metrics “may not indicate whether employee security behaviors and attitudes have been positively changed”. Security leaders therefore face a stark choice: prove to stakeholders that awareness training reduces risk in the long run, or risk losing support for the program. Gartner experts warn that if you can’t demonstrate a security awareness program’s effectiveness in reducing incidents, executive support may dwindle, jeopardizing future funding and participation.

The stakes are high. Cyber threats exploiting human error (phishing, weak passwords, social engineering, etc.) remain among the top causes of breaches across industries. A long-term, measurable improvement in security awareness can be the difference between blunting an attack or suffering a costly incident. This article outlines how HR professionals, business owners, and enterprise leaders can evaluate the long-term impact of their security awareness initiatives. We’ll discuss what to measure, how to track progress over time, and ways to translate those metrics into business value, ensuring your awareness program is not just a yearly formality, but a sustained driver of security culture and risk reduction.

Why Long-Term Evaluation Matters

Focusing on Outcomes, Not Just Activities: The primary goal of a security awareness program is to reduce an organization’s human cyber risk over the long haul, not merely to have employees complete training modules. Effective Cybersecurity Training goes beyond checking boxes — it focuses on measurable behavior change, ensuring employees not only complete lessons but also apply secure habits in their daily work. If we stop evaluation at activity metrics (e.g. 95% of staff took the training), we miss the real point: are those employees now behaving more securely day-to-day? Traditional measures like course completion rates or one-off phishing test scores, while easy to obtain, “do nothing to prove that the program is shifting the behavior of the workforce in a way that reduces cyber risk”. An employee might ace a security quiz immediately after training, yet still fall for a phishing email six months later. Long-term evaluation forces us to examine whether knowledge is retained and translated into practice over time.

Sustaining Executive and Employee Buy-In: Evaluating impact isn’t just an academic exercise; it’s crucial for maintaining support. Security awareness programs often require ongoing budget, time allocation, and leadership advocacy to continue year after year. If you can’t demonstrate tangible improvements (like fewer incidents or higher reporting of threats), executives may question the value of the program. Front-line employees might also lose interest if they don’t see the relevance. By contrast, when you track meaningful outcomes, say, a steady decline in phishing click rates or a rise in incident reporting, you create a compelling narrative that the program is working. This evidence helps reinforce management support and justifies the resources devoted to training. In essence, long-term metrics provide the “proof in the pudding” that transforms security awareness from a perceived compliance cost into a business enabler that protects the company’s bottom line.

Preventing Complacency and Blind Spots: Another reason to evaluate over the long term is to avoid the trap of false confidence. It’s possible for an organization to meet all its training targets and still suffer a major breach due to human error. In one industry report, 61% of security teams invested in training mainly to meet regulations, yet real incidents continued unabated. Without ongoing measurement, organizations might wrongly assume their workforce is “aware” after training, while attackers continue to exploit unaddressed weaknesses. Long-term evaluation uncovers these gaps. For example, you might find that while initial phishing simulation failures dropped, they plateaued after a few months, signaling the need for new strategies to further improve. Or you might discover certain departments regressing in their security practices over time, indicating where to target refresher efforts. Continuous evaluation creates a feedback loop, ensuring the program adapts and remains effective against evolving threats and employee habits.

In summary, long-term evaluation matters because it shifts the focus from ticking a training box to truly reducing risk and building a security-resilient culture. It provides evidence to stakeholders that security awareness isn’t just about education; it’s about sustained behavior change that protects the organization. With the “why” established, let’s look at what exactly we should measure to gauge that change.

Key Metrics to Gauge Security Awareness Impact

Identifying the right metrics is the heart of evaluating a security awareness program. These metrics should extend beyond basic training statistics and instead capture changes in behavior, competency, and security outcomes. Below are key categories of metrics and indicators that, together, paint a picture of long-term impact:

  • Participation and Completion Rates: These are the most basic metrics, e.g. percentage of employees who finished mandatory training or attended security workshops. While they demonstrate reach and compliance, by themselves they say little about effectiveness. However, they are still a useful starting point. Low participation could indicate obstacles (e.g. scheduling, engagement issues) that need addressing. High completion is necessary to influence behavior, but not sufficient.
  • Knowledge Retention and Attitudes: Quizzes and surveys can measure how well employees retain key concepts and their attitudes toward security. For instance, comparing quiz results immediately after training and a few months later tests knowledge retention. Attitudinal surveys might ask employees how confident they feel about spotting threats or whether they find the training helpful. Positive responses can signal a cultural shift in how security is valued. Just be cautious: knowing what to do doesn’t always guarantee doing it under pressure. A sizable gap often exists between awareness and action, as illustrated by one finding that 70% of individuals recognize the risks of unknown email links, yet many still click them. Thus, pair knowledge metrics with behavioral metrics.
  • Behavioral and Performance Metrics: These are arguably the most critical indicators of long-term impact. They measure what employees actually do when faced with security decisions. Common examples include:
    • Phishing Simulation Results: Track the phishing email click rate over time (what percentage of employees fall for simulated phishing attempts) and the reporting rate (how many employees report the suspicious email to IT). A declining click rate and rising reporting rate over successive campaigns is strong evidence of improved vigilance. For example, organizations have found that after comprehensive training, users are about 30% less likely to click on phishing links compared to before.
    • Real Incident Reporting: Monitor how often employees report actual security incidents or near-misses (phishing emails, suspicious phone calls, lost devices, etc.). Initially, reporting rates tend to be very low; one analysis showed only 3% of employees report phishing emails to management in the absence of robust awareness efforts. If that number grows over time, it indicates a culture where employees take initiative in security (a very positive sign). In fact, higher reporting can correlate with catching incidents earlier, as one large-scale training program found: about half of employees had reported a real phishing threat within six months of ongoing training, and two-thirds within a year.
    • “Repeat Offender” Tracking: Identify individuals or departments who repeatedly make security mistakes (e.g. clicking multiple phishing emails, failing multiple quizzes). A reduction in the number of repeat offenders after targeted training is a meaningful metric. For instance, a case study at Qualcomm (a global tech company) used focused training for their most at-risk group and achieved a 63% reduction in repeat clickers, meaning far fewer people kept falling for phish after the intervention. Those high-risk users also improved their overall security performance by 46% in six months, a dramatic behavioral improvement that underscores the value of measuring and addressing the needs of the riskiest users.
    • Secure Behavior Assessments: Depending on your program’s scope, metrics can target various behaviors. If you train on physical security, you might measure how many employees tailgated through doors or left sensitive documents out (“clean desk” checks). If you emphasize data protection, track incidents of data mishandling. One security firm recommends aligning metrics to the organization’s top human risks: if data leakage is a concern, for example, monitor incidents of improper document disposal or unencrypted device use. The key is that these metrics directly tie to behaviors the training is intended to improve.
  • Security Incident Rates: Ultimately, the clearest long-term metric is a reduction in actual security incidents (or their severity) attributable to human error. While many factors influence incidents, you can specifically look at those where employee action/inaction played a role, e.g. phishing-related breaches, malware infections from unsafe downloads, data loss from negligence. Trending these incident rates over years is powerful. A well-run awareness program should contribute to lowering such incidents. Industry statistics are encouraging on this front: companies that consistently engage in security awareness training see on average a 70% reduction in security incidents compared to before. In other words, a strong program can potentially eliminate the majority of avoidable incidents, saving the business from costly breaches.
  • Culture and Engagement Indicators: These are more qualitative but still measurable. They include things like the level of employee engagement in optional security activities (participation in voluntary trainings, attendance in security events or competitions, questions asked to security teams). Another indicator is the presence of a security-conscious mindset in everyday work. Some organizations use culture surveys or focus groups periodically to gauge this. Signs of success might be employees proactively sharing security tips with peers, or routinely double-checking procedures without being told. While harder to quantify, these cultural shifts often manifest from the cumulative effect of training. For example, when employees start casually saying “Hey, this email looks phishy, I’ll report it” or reminding each other about protocols, it reflects a security-first culture taking root. You can capture some of this through surveys or by tracking anecdotes and feedback over time.
  • Connecting Metrics to Goals: It’s important to choose metrics aligned with your program’s goals and the risks you care most about. A common pitfall is measuring only what’s easy rather than what matters. For instance, if your goal is to reduce data breaches caused by phishing, then phishing click rates, reporting rates, and related incident counts are vital metrics. If building a “see something, say something” culture is a goal, then track how often employees speak up about security issues. As Gartner’s guidance suggests, define a clear security awareness vision (e.g. “We are a security-conscious workforce”) and then identify signature behaviors that would exemplify that vision (e.g. “Employees report suspicious emails promptly”). Your metrics should then measure those signature behaviors (like the number of phishing emails reported, in this example). By doing so, you ensure the metrics aren’t just numbers in isolation, but directly tied to desired outcomes for the business.
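Several of the rate metrics above reduce to simple ratios over campaign data. A minimal sketch of computing them, assuming a hypothetical per-campaign record (the field names and figures are illustrative, not from any particular platform):

```python
from dataclasses import dataclass

@dataclass
class CampaignResult:
    """One simulated phishing campaign (hypothetical record layout)."""
    recipients: int   # employees who received the simulated phish
    clicked: int      # employees who clicked the link
    reported: int     # employees who reported the email

def click_rate(r: CampaignResult) -> float:
    return r.clicked / r.recipients

def reporting_rate(r: CampaignResult) -> float:
    return r.reported / r.recipients

# Two campaigns six months apart: click rate falling, reporting rising
q1 = CampaignResult(recipients=400, clicked=60, reported=12)
q3 = CampaignResult(recipients=400, clicked=28, reported=48)
print(f"Q1: click {click_rate(q1):.1%}, report {reporting_rate(q1):.1%}")
print(f"Q3: click {click_rate(q3):.1%}, report {reporting_rate(q3):.1%}")
```

Tracking both ratios per campaign, rather than raw counts, keeps results comparable even when the number of recipients varies between campaigns.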

Tracking Progress Over Time

Once you have defined what to measure, the next step is to implement a system for tracking these metrics over an extended period. Long-term impact cannot be assessed from a single snapshot; it emerges in trends and patterns across months and years. Here are strategies for effectively tracking and analyzing progress over time:

Regular Data Collection Cadence: Establish how frequently you will collect each metric. Some data can be captured continuously or in real-time (e.g. automatic logging of phishing simulation results every time a campaign runs). Others might be periodic, for example, conduct quarterly phishing tests, or an annual security culture survey. It’s often useful to have a mix of frequent indicators (like monthly phishing click rates) and longer-term checkpoints (like yearly incident rate comparisons). The key is consistency. If metrics are gathered haphazardly or infrequently, it will be difficult to spot true trends. Consider aligning data collection with your organizational rhythm: many companies do phishing simulations monthly or quarterly, and a comprehensive awareness report annually.

Baselining and Benchmarks: Start by capturing a baseline before or at the onset of improvements. For instance, record your phishing click rate and reporting rate at the program’s launch (or use the first simulation’s results as baseline). Similarly, note the number of security incidents in the year prior to the training rollout. This baseline will be your point of comparison to quantify improvement. Additionally, you can benchmark against peers or industry standards. Resources like the SANS Institute’s Security Awareness Maturity Model or industry reports can provide reference points (e.g., average click rate in similar organizations). One benefit of benchmarking is that it can motivate leadership support: organizations that compare their awareness metrics with industry peers often gain more executive buy-in to improve and excel.
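The baseline comparison described above is simple arithmetic, but it is worth making the sign convention explicit so reports stay consistent. A sketch, with hypothetical rates:

```python
def improvement_vs_baseline(baseline: float, current: float) -> float:
    """Relative change from the baseline measurement.
    For click rates, a negative value means the metric dropped (an improvement)."""
    return (current - baseline) / baseline

# Hypothetical figures: 20% click rate at program launch, 5% a year later
change = improvement_vs_baseline(0.20, 0.05)
print(f"{change:+.0%} vs baseline")  # a 75% relative reduction
```

Reporting the relative change ("click rate down 75% from baseline") is usually more meaningful to stakeholders than the absolute difference in percentage points.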

Trend Analysis: As data points accumulate, analyze the trajectory. Are things moving in the desired direction? For example, if your quarterly phishing simulations show click rates dropping from 20% to 10% to 5% over a year, that’s a clear positive trend. On the other hand, if progress plateaus or reverses (say, click rates drop initially but then stall, or start rising again), that’s a flag to investigate. Always look at a sufficient time window: one bad month might be an outlier, but a three-quarter stagnation is significant. It’s helpful to visualize the trends using charts, which can make patterns readily apparent and are excellent for reporting to stakeholders. Many security awareness platforms provide dashboards for this; if not, even a simple spreadsheet chart can do the job.
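A plateau check of the kind described above can be automated in a few lines. The window and tolerance below are illustrative assumptions, not a standard:

```python
def has_plateaued(rates, window=3, min_drop=0.01):
    """True if the metric has not dropped by at least `min_drop` over the
    last `window` reporting periods (thresholds are illustrative)."""
    if len(rates) < window + 1:
        return False  # not enough history to judge
    return rates[-1] > rates[-1 - window] - min_drop

improving = [0.20, 0.10, 0.05, 0.04, 0.03]   # still trending down
stalled   = [0.20, 0.05, 0.05, 0.05, 0.05]   # early win, then flat
print(has_plateaued(improving), has_plateaued(stalled))
```

Both series show a big initial drop; only the check over the most recent periods distinguishes continued progress from stagnation that warrants a change of approach.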

Beyond “One Size Fits All” Metrics: Over time, you might also segment the data to get deeper insights. For instance, track metrics by department, role, or region. You may find that certain teams have drastically lower awareness (e.g. perhaps the Sales department consistently has higher phishing click rates than IT). This granularity allows targeted interventions. It also acknowledges that the impact of training may differ among groups, and you might need to adapt content or frequency accordingly. Tracking sub-trends helps ensure no particular segment of the workforce remains a weak link over the long term.
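Segmenting simulation results by department, as suggested above, needs nothing more than a group-by. A sketch over a hypothetical per-recipient event log (department names and outcomes are made up):

```python
from collections import defaultdict

# Hypothetical per-recipient outcomes from one simulation campaign
events = [
    {"dept": "Sales", "clicked": True},
    {"dept": "Sales", "clicked": True},
    {"dept": "Sales", "clicked": False},
    {"dept": "IT",    "clicked": False},
    {"dept": "IT",    "clicked": True},
    {"dept": "IT",    "clicked": False},
]

totals, clicks = defaultdict(int), defaultdict(int)
for e in events:
    totals[e["dept"]] += 1
    clicks[e["dept"]] += e["clicked"]          # True counts as 1

click_rate_by_dept = {d: clicks[d] / totals[d] for d in totals}
print(click_rate_by_dept)
```

The same grouping works for role or region; with small departments, be careful interpreting rates computed from only a handful of recipients.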

Tools and Automation: Leverage tools to automate and simplify data gathering where possible. Modern phishing simulation and training platforms automatically log who clicked, who reported, time to report, etc. Incident management systems can be a source for tracking reports made by employees. Learning management systems (LMS) provide training completion and quiz scores. Aggregating data from these sources into a central report (perhaps using business intelligence tools or even basic scripts) can save a lot of manual effort. Some organizations also use periodic security drills or tests, for example, unannounced clean desk inspections or USB drop tests, to generate data on physical security behaviors. Ensure any such tests are conducted ethically and with leadership knowledge. The aim is to simulate real conditions and see if good practices are holding up over time.

Patience and Long-Term Mindset: One practical point: meaningful long-term changes will take time to manifest. Avoid the temptation to declare victory or failure too early. For example, if after one quarter post-training you don’t see a huge change, that doesn’t necessarily mean the program failed; habits might just need more reinforcement. Look for improvement over multiple periods. Gartner recommends that if certain outcome metrics show no improvement over, say, two or more reporting periods, then you should indeed reassess your training approach. But give the program at least a few cycles to catch on. Remember that security habits, like any habits, require continuous reinforcement. The value of metrics comes from how they trend over time rather than any single data point. This long view will help distinguish true improvement from short-term fluctuations.

Adjusting the Course Mid-Stream: The beauty of ongoing tracking is that it enables mid-course corrections. You need not wait a full year to tweak the program if something’s not working. Suppose your data after six months shows that while most employees improved, a small core of “repeat clickers” remains. You can introduce a special coaching session or more personalized training for that group in month seven, then monitor the effect in subsequent months. Or if reporting rates aren’t increasing as hoped, you might launch a campaign to remind and incentivize reporting (perhaps even gamify it) and then see if the numbers budge. Continuous monitoring and agile adjustments go hand-in-hand to maximize long-term impact.

By diligently tracking these metrics and trends, you gain a clear window into the trajectory of your security awareness efforts. The data will tell a story, perhaps of significant improvement, or perhaps of areas needing more work. Either way, it arms you with knowledge to make evidence-based decisions. Next, we’ll discuss how to act on these insights to refine your program and to demonstrate its value in business terms.

From Data to Action: Refining Your Program

Gathering metrics is only half the battle; the ultimate goal is to translate those insights into actions that strengthen security and reduce risk. Long-term evaluation should be an iterative loop where data informs improvements to the program, and those improvements in turn drive better outcomes. Here’s how to make that happen and also how to communicate the value of your efforts:

Identify What’s Working (and Do More of It): Analyze your metrics to pinpoint success stories. Did the introduction of interactive phishing simulations coincide with a sharp drop in phishing clicks? Have departments with security champions (peer advocates) shown faster improvement than those without? Recognizing these wins lets you double down on effective strategies. For example, if quarterly refresher trainings are yielding continual knowledge gains, you might decide to keep that cadence. In one instance, an organization noticed that after adding role-based, personalized training content for high-risk users, those users’ security behavior metrics improved dramatically (as seen in the Qualcomm case). The lesson learned was that tailoring content to context significantly boosts effectiveness, a practice worth expanding to other groups. Use your data to celebrate milestones too: letting employees know, “Phishing clicks are down 50% this year, great job!” can boost morale and reinforce positive behaviors.

Address Gaps and High-Risk Areas: On the flip side, metrics will also uncover weak spots. Treat these as opportunities for enhancement. If only 3% of employees are reporting phishing attempts and the number barely moves quarter after quarter, that’s a clear indicator that more emphasis is needed on reporting procedures and the importance of speaking up. You might introduce easier reporting tools (like a one-click email reporting button) or run an internal campaign highlighting how reporting prevented a real incident. If certain topics show poor quiz results months later, consider adding engaging follow-up content on those topics. The data might show, for instance, that while phishing awareness is improving, awareness around safe data handling is not, perhaps because the training focused heavily on phishing and less on data security. In response, you could include new modules or workshops targeting data protection behaviors (like proper document disposal, use of encryption, etc.), then monitor those metrics thereafter. Continuous improvement means the program’s curriculum and tactics evolve based on evidence, making the training more dynamic and relevant than a static annual slideshow.

Expand Metrics Beyond Phishing: A common pitfall is concentrating solely on phishing metrics because they are readily measurable. However, long-term security resilience involves multiple human risk areas. As your program matures, consider broadening the scope of what you measure and train. For example, social engineering via phone (vishing) or text (smishing) might be rising threats; you could simulate those and track outcomes. Physical security compliance (like badge use, tailgating incidents) could be another dimension. A 2022 study advises going beyond phishing assessments when measuring a security awareness program and including metrics tied to the various key risks relevant to your business. If your organization handles sensitive customer data, perhaps track the number of data mishandling violations or unauthorized access attempts by employees. By diversifying metrics, you get a more comprehensive view of security behavior and can address vulnerabilities that a narrow focus might miss.

Link Metrics to Business Outcomes: To convincingly demonstrate the program’s value, translate security awareness metrics into the language of business risk and return. Executives respond to outcomes like reduced incidents, cost savings, risk reduction, and compliance fulfillment. Use your long-term data to make these connections explicit. For instance: “Over the past year, the rate of malware infections via phishing dropped from 5 incidents per month to 1 per month, likely avoiding an estimated $X in incident response and downtime costs.” Or: “Our employee reporting of potential breaches increased 4x, enabling IT to neutralize threats faster and preventing potential data loss.” If you can correlate your timeline of training improvements with a decline in security incidents, it provides a compelling story that awareness efforts contributed to that decline. In practice, it’s hard to assign a precise dollar value to an avoided breach, but even approximate metrics (e.g. comparing average breach costs to the incidents you believe were averted) can underscore financial impact.
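The back-of-envelope cost translation described above can be made explicit. Every figure below is hypothetical, and the calculation deliberately surfaces its own assumptions:

```python
def estimated_avoided_cost(incidents_before: int, incidents_after: int,
                           avg_cost_per_incident: float) -> float:
    """Rough savings estimate; assumes incidents are comparable in cost and
    that the drop is attributable to the program (both strong assumptions)."""
    return max(incidents_before - incidents_after, 0) * avg_cost_per_incident

# Hypothetical: 5 phishing-related incidents/month down to 1, at ~$15k each
print(f"${estimated_avoided_cost(5 * 12, 1 * 12, 15_000):,.0f} per year")
```

Presenting the estimate as a range (e.g. varying the per-incident cost) is more defensible with executives than a single point figure.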

Additionally, highlight any compliance and legal benefits from sustained training. Many industries require security training; by exceeding mere compliance and showing actual risk reduction, you position the organization as proactively lowering liability. This can be an angle in audits or in cyber insurance negotiations (some insurers may give better terms to organizations that demonstrate strong security culture metrics).

Report in a Meaningful Way: When communicating to senior management or the board, keep the focus on a few high-impact metrics and the trendlines. Leadership talks in metrics, but they care about the bottom-line implications. Rather than inundating them with dozens of stats, pick the ones that matter most: e.g., “phishing email click rate dropped from 15% to 3% in 18 months” and “security incidents with human error as a root cause fell by half year-over-year.” Use visuals like charts to make trends clear. Pair the metric with a short narrative: “This indicates a substantial decrease in our breach likelihood, as employees are now far less prone to fall for attacks.” By tying your awareness program results to risk reduction, cost avoidance, and regulatory peace of mind, you make it real for business leaders.

Don’t shy away from also reporting on what isn’t improving, but frame it with an action plan. For example: “Only 40% of employees passed the surprise USB-drop test, unchanged from last quarter. Action: We are introducing a new interactive module on device security and will re-test in two months.” This shows a mature program that is data-driven and solution-oriented, which management will appreciate.

Reinforce and Reward: Use what you’ve learned to reinforce good behavior among employees. Share anonymized results or general achievements with staff: people love to see progress and to know their efforts made a difference. Some organizations introduce rewards or recognition for teams with exemplary security behavior, for instance, praise the department with the most phishing reports or the one with 100% training completion and strong quiz performance. Positive reinforcement can boost engagement with the program, creating a virtuous cycle of improvement. Moreover, involving employees in the results (e.g., “We’ve improved as a company, but let’s aim even higher next year”) helps nurture the collective sense of ownership of security. Over time, security awareness stops feeling like something imposed on employees, and starts feeling like an integral part of their professional development and the company’s success.

Iterate the Program Design: Finally, long-term evaluation may lead you to evolve your program’s design entirely. Perhaps you discover that micro-learning (short, frequent trainings) works better for retention than a single annual session; your data on knowledge decay could support this. Or you may decide to implement a tiered training approach, where high-risk roles get more intensive training. Continual metrics give you the evidence to experiment and innovate. This adaptive approach ensures the program stays fresh and effective year after year, rather than stagnating. Security threats change, and so do workforces (with turnover, new generations of staff, etc.), so a program that continuously refines itself based on feedback will remain relevant and ahead of the curve.

In summary, closing the loop from data to action is what turns an average security awareness program into an excellent one. By acting on what you learn, amplifying strengths, fixing weaknesses, and communicating value, you create a self-improving cycle. Over the long term, this leads not only to metrics moving in the right direction, but to the deeper goal of ingraining security into the company culture.

Final Thoughts: Building a Security-First Culture

Evaluating the long-term impact of a security awareness program is ultimately about measuring culture change. The endgame is a workplace where secure behavior is second nature, where employees instinctively pause to verify a strange request, report anomalies, and make careful choices online. Reaching this state requires patience, commitment, and yes, metrics. By focusing on meaningful indicators and tracking them over time, organizations can ensure they are not just conducting training for training’s sake, but truly changing mindsets and habits.

Long-term measurement acts as a compass. It tells you if your awareness efforts are on course towards that security-first culture. It can reveal subtle shifts, the kind that don’t show up in a one-time test but become evident in trend lines and year-over-year comparisons. Perhaps you’ll notice that cybersecurity conversations become more common among staff, or that employees start taking initiative in security improvements (signs of a mature security culture). These qualitative changes often follow the quantitative improvements that your metrics capture. For example, steadily rising phishing reporting rates and declining incidents will eventually manifest as a workforce that approaches emails with caution and feels proud of catching threats. The metrics are the measurable proof of this cultural evolution.

For HR professionals and business leaders, the journey doesn’t end with presenting an annual report of numbers. The insights from evaluation should feed back into broader organizational strategies, from how you onboard new employees with security in mind, to how you integrate security objectives into performance goals. A strong security culture can even become a selling point in talent retention and client trust, as it demonstrates a company that cares about doing things right. By persistently evaluating and improving your security awareness program, you send a message that security is not a one-time drill, but an ongoing priority woven into the company’s fabric.

In closing, a security awareness program’s true success is measured over the long run. With diligent evaluation, you gain the power to steer human behavior in a safer direction, validate the impact to stakeholders, and adapt in the face of new challenges. Over time, the data will show, and your own experience will confirm, that an educated, vigilant workforce is one of the best defenses an organization can build. The process may be gradual, but the payoff is a resilient security culture that stands strong against threats, year after year.

FAQ

What is the long-term goal of a security awareness program?

The ultimate goal is to reduce human-related cyber risks by fostering lasting behavior changes, not just meeting compliance. This means employees consistently applying secure habits in daily work, resulting in fewer incidents and a stronger security culture over time.

Why are training completion rates not enough to measure success?

Completion rates show participation but don’t reveal if employees retained the knowledge or applied it in real situations. An employee might pass a quiz but still fall for phishing months later. Long-term evaluation focuses on sustained behavior change and reduced incidents.

What metrics best measure the long-term impact of security awareness?

Key metrics include phishing simulation click and reporting rates, repeat offender reduction, real incident reporting frequency, data mishandling incidents, and security culture survey results. These give a clearer picture of behavior change and risk reduction.

How often should organizations track awareness program metrics?

Metrics should be tracked consistently, such as monthly phishing simulations, quarterly behavior checks, and annual culture surveys. Establishing a baseline and monitoring trends over months or years helps identify genuine progress and areas needing improvement.

How can organizations use metrics to refine their security awareness program?

Metrics help identify what’s working and where gaps exist. For example, if phishing reporting rates remain low, organizations can introduce easier reporting tools or run targeted campaigns. Data-driven adjustments keep the program relevant, effective, and engaging.

References

  1. Jacobs JL, Haney JM, Furman SM. Measuring the Effectiveness of U.S. Government Security Awareness Programs: A Mixed-Methods Study. National Institute of Standards and Technology. https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=934952
  2. Haney J, Lutters W. Security Awareness Training for the Workforce: Moving Beyond “Check-the-Box” Compliance. IEEE Computer 2020;53(10). https://pmc.ncbi.nlm.nih.gov/articles/PMC8201414/
  3. Addiscott R. 3 Ways to Assess the Effectiveness of Security Awareness Training. Cybersecurity Dive (Gartner). https://www.cybersecuritydive.com/news/gartner-security-awareness-training/601735/
  4. Keepnet Labs. 2025 Security Awareness Training Statistics and Trends. Keepnet Labs Cybersecurity Blog. https://keepnetlabs.com/blog/security-awareness-training-statistics
  5. Briscoe J. We Trained 3 Million Employees: How Effective Is Security Awareness Training? Hoxhunt Blog. https://hoxhunt.com/blog/how-effective-is-security-awareness-training
  6. McWherter J. Measuring the Impact of a Security Awareness Program. TrustedSec Blog. https://trustedsec.com/blog/measuring-the-impact-of-a-security-awareness-program