Study concludes cybersecurity training doesn’t work (www.kpbs.org)
from cm0002@infosec.pub to cybersecurity@infosec.pub on 02 Nov 03:22
https://infosec.pub/post/37047749

#cybersecurity

threaded - newest

xxce2AAb@feddit.dk on 02 Nov 03:38 next collapse

That’s a shame, although unfortunately I have no problem believing that’s the case in general. I still personally benefit from the social engineering resistance training I’ve had over the years to this day though.

stinky@redlemmy.com on 02 Nov 04:47 next collapse

My toxic trait is believing that not answering the phone from unknown numbers is protecting myself from outside attackers

xxce2AAb@feddit.dk on 02 Nov 05:05 next collapse

It might be rudimentary, but I wouldn’t say you’re wrong.

Alternatively, pick up but answer the phone only with the word “Yes?”, “Speak” or “You may proceed” (preceded by ‘this line is now secure’).

Then, when they ask “who is this?” answer that “if you don’t know, you have the wrong number” and that “this call is currently being traced, pending review of a ‘military tribunal’.”

Do this with the flattest intonation you can manage.

That tends to get to them.

Goretantath@lemmy.world on 02 Nov 06:54 collapse

Nah, there’s AI that can clone your voice from a single word; not answering is the safest.

xxce2AAb@feddit.dk on 02 Nov 06:55 collapse

Point. Silence is good too.

WanderingThoughts@europe.pub on 02 Nov 09:22 next collapse

Recently there were recruiters on LinkedIn freaking out that when they called someone, they would answer with “Hello?” and the recruiter thinks they’re too good to be greeted with that.

qweertz@programming.dev on 02 Nov 13:18 collapse

My SIM provider has the option to not even route unknown callers to my device. Not that I get any, but just in case, even if it is not that common in Germany.

What some family of mine had to go through was social engineering harassment calls, with some BS pretext to get them to say “Yes”/“I agree” or something like that.

bamboo@lemmy.blahaj.zone on 02 Nov 05:07 collapse

I still personally benefit from the social engineering resistance training I’ve had over the years to this day though.

Me too, I use it to get out of situations I don’t want to deal with. “Ohh, you’re calling me asking for PII? Sorry, I can’t provide that information unless I initiate the conversation. I’ll call the number I have on file for you to provide that.” <hangs up and never follows up>

xxce2AAb@feddit.dk on 02 Nov 05:11 collapse

That’s the spirit! “I’m not at liberty to provide that information” is one of my favorite sentences.

TheAsianDonKnots@lemmy.zip on 02 Nov 03:48 next collapse

Isn’t any training better than no training?

kopasz7@sh.itjust.works on 02 Nov 03:53 next collapse

Maybe not if it only gives a false sense of security.

14th_cylon@lemmy.zip on 02 Nov 04:52 collapse

no. training costs time and money, so if it has zero effect, then no training is clearly better.

TheAsianDonKnots@lemmy.zip on 02 Nov 05:04 collapse

I guess I don’t understand the metric of success. My training at work has helped me recognize risks more than most of my family, who have no idea what a root domain URL scam is. Did most of my family fail? Yes. Did 20% learn something and avoid risk? Yes.

In large companies the training is for liability purposes: “see, they all passed their tests, we tried to warn them”. People are always going to be the attack vector; that’s unavoidable… but 20% success is better than 0% success. As an admin, if I saw a 20% spike in phishing reports, that’s statistically significant and should be looked into and stopped (proxy violation).

Cost of training is unavoidable and budgeted for.

CH3DD4R_G0BL1N@sh.itjust.works on 02 Nov 13:52 next collapse

Yeah, it’s been a few years and I don’t remember the specifics at this point, but my training has taught me a new scam or two before.

14th_cylon@lemmy.zip on 02 Nov 15:46 collapse

I guess I don’t understand the metric of success.

i guess you will find if you read the study mentioned in the article.

it is certainly possible that the study, or its interpretation in the article, is bs - i did not read either one of them. i am just stating in a vacuum that if something does not work (which is what the headline presents as the conclusion of the study), then wasting time and money on it is worse than doing nothing.

MajorHavoc@programming.dev on 02 Nov 03:52 next collapse

I would be more interested in a study of people entering credentials or taking other risky actions after clicking.

Yes, people whose job includes lots of link clicking are going to click links.

And one obvious but good conclusion: invest in mandating MFA for sensitive actions.

14th_cylon@lemmy.zip on 02 Nov 04:50 next collapse

mfa is not going to help when people will literally transfer their money to a scammer, because the scammers convinced them that said money is in danger and the only way to protect it is to transfer it to a “secure account”. you can’t fix stupid with technical limitations.

bamboo@lemmy.blahaj.zone on 02 Nov 05:25 collapse

Totally agreed. I get that it’s easier to count it as a fail if you open the link, and that simply opening a random link has some inherent risk, but there should at least be a fake page for entering credentials to evaluate how many people actually go through with it, and that should be broken out as CRITICAL where the other clicks are HIGH or MEDIUM, to classify the risk.

Also, this is just an anecdote, but in a similar phishing simulation I helped with, we had to bypass filters that reject emails containing links to websites registered in the last 60 days. Obviously this isn’t a foolproof way to prevent phishing attempts, but it does cut out a lot of junk, and we’ve indirectly been training employees not to deal with that.

qjkxbmwvz@startrek.website on 02 Nov 05:15 next collapse

When the son of the deposed King of Nigeria emails you directly asking for help, you help. His father ran the freaking country, okay?

bamboo@lemmy.blahaj.zone on 02 Nov 05:16 next collapse

Abstract from the paper itself:

This paper empirically evaluates the efficacy of two ubiquitous forms of enterprise security training: annual cybersecurity awareness training and embedded anti-phishing training exercises. Specifically, our work analyzes the results of an 8-month randomized controlled experiment involving ten simulated phishing campaigns sent to over 19,500 employees at a large healthcare organization. Our results suggest that these efforts offer limited value. First, we find no significant relationship between whether users have recently completed cybersecurity awareness training and their likelihood of failing a phishing simulation. Second, when evaluating recipients of embedded phishing training, we find that the absolute difference in failure rates between trained and untrained users is extremely low across a variety of training content. Third, we observe that most users spend minimal time interacting with embedded phishing training material in-the-wild; and that for specific types of training content, users who receive and complete more instances of the training can have an increased likelihood of failing subsequent phishing simulations. Taken together, our results suggest that anti-phishing training programs, in their current and commonly deployed forms, are unlikely to offer significant practical value in reducing phishing risks.

And the methodology:

Our study analyzes the performance of nearly 20,000 full-time employees at UCSD Health across eight months of simulated phishing campaigns sent between January 2023 and October 2023. UCSD Health is a major medical center that is part of a large research university, whose employees span a variety of medical roles (e.g., doctors and nurses) as well as a diverse array of “traditional” enterprise jobs such as financial, HR, IT, and administrative staff. For their email infrastructure, UCSD Health exclusively uses Microsoft Office 365 with mail forwarding disabled. On roughly one day per month, UCSD Health sent out a simulated phishing campaign, where each campaign contained one to four distinct phishing email messages depending on the month. Each user received only one of the campaign’s phishing messages per month, where the exact message depended on the group the user was randomly assigned to at the beginning of the study (§ 3.1). In total these campaigns involved ten unique phishing email messages spanning a variety of deceptive narratives (“lures”) described in Section 3.2. All of the phishing lures focused on drive-by-download or credential phishing attacks, where a user failed the phishing simulation if they clicked on the embedded phishing link.

TORFdot0@lemmy.world on 02 Nov 16:52 collapse

I guess the point is that users who take training are not more likely to pass the phishing simulations, but I think that’s missing the point. In competently run organizations, the point of these trainings isn’t explicitly to teach people not to fall for tests, but to identify which users are your greatest risks and either give them more support or can them, if the risk is too high to be outweighed by their productivity.

Of course the people who take more training are failing tests. It’s because they lack the computer skills or cognitive ability to understand what they’re doing. Taking a five minute training that says “don’t click the link” isn’t going to magically make people not get phished, but it has usefulness in basic awareness (which is why we have the super basic cybersecurity awareness training as well).

The reality is that all human beings can be socially engineered if the attacker is motivated enough. You can’t stop it by training, only by planning and being proactively prepared.

Horsecook@sh.itjust.works on 02 Nov 08:35 next collapse

I wonder if the efficacy of training could be improved if employees were fired for failing phishing tests.

No1@aussie.zone on 04 Nov 00:13 collapse

I have replied to the sender of an email asking them to call me because I couldn’t tell whether it was a phishing email or not.

In some companies, half of their processes link to external services. It’s impossible to tell what’s legit or not.

furrowsofar@beehaw.org on 02 Nov 12:28 next collapse

Ironic thing: a company I used to work for would send out emails you needed to click links in to do your job, then run training telling you not to click links or even open the same kind of email. Then they’d even test that by seeding in very realistic test emails. Total stupidity. You’re expected to tell the difference when there is no way to do so. The training was more CYA than anything: just blame the employee for shit company processes and security.

CompactFlax@discuss.tchncs.de on 02 Nov 15:34 next collapse

I got some emails about required training from outside the company. I needed to download and complete a PDF, which had links to other forms to complete, all offsite. I do know with certainty that the email was legit, but I reported it as phishing. Still haven’t heard back about this critical training attestation, so I assume their tracking is as awful as the process.

It’s not my ass on the audit finding. Fix your shit.

Pulptastic@midwest.social on 02 Nov 17:23 next collapse

I report emails that I know are legit when they fail the phishing rules. Best example: unprompted emails from third party services that I know my company is using. If I don’t get a real email from a real employee either including the link or warning me that a valid third party link is coming, I’m not going to click it.

Make your shit legit or I’m not gonna do it.

furrowsofar@beehaw.org on 03 Nov 00:40 collapse

This is exactly it. Outsourced stuff that there is no way to verify. I stopped clicking on this stuff too unless I had to, and was still never sure.

bamboo@lemmy.blahaj.zone on 02 Nov 18:29 next collapse

It’s also such a dumb metric, because much of people’s jobs is clicking on links elsewhere on the internet, yet when it’s in an email, it’s bad? Unless you’re running an old browser or there’s a 0-day, simply opening a link isn’t going to hack your system; the user would need to take further actions to be compromised. These simulations don’t account for that.

furrowsofar@beehaw.org on 03 Nov 00:43 next collapse

The real idiotic thing is a network where one client system compromise compromises the whole company. Bad network design.

sirblastalot@ttrpg.network on 03 Nov 22:33 collapse

Clicking the link hypothetically confirms to the spammer that yours is a valid and monitored email address, and that you’re a sucker suitable for more targeted phishing.

Of course, it seems like every random user will also happily type their password into any text box that asks for it, too.

bamboo@lemmy.blahaj.zone on 03 Nov 22:41 collapse

Unless the email client is blocking external images, a tracking pixel in the email would be enough to see that the email was rendered, and that the address is valid. The trainings specifically instruct you to review the contents of the email and check the email headers before clicking links, so that alone would confirm to a spammer that the email is valid.
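
The tracking-pixel idea is simple enough to sketch in a few lines (a hypothetical example; the tracker URL and function names are made up, and real campaigns use dedicated tracking services that log the image requests):

```python
import uuid

# Hypothetical endpoint; the sender would log every HTTP request it receives.
TRACKER_BASE = "https://tracker.example/open"

def add_tracking_pixel(html_body: str, recipient: str) -> tuple[str, str]:
    """Append a 1x1 invisible image whose URL is unique per recipient.

    When the mail client fetches the image, the sender learns the address
    is live and monitored -- no link click required.
    """
    token = uuid.uuid4().hex  # unique per message, so opens can be correlated
    pixel = (f'<img src="{TRACKER_BASE}?t={token}" '
             'width="1" height="1" alt="" style="display:none">')
    return html_body + pixel, token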

sirblastalot@ttrpg.network on 03 Nov 22:31 next collapse

One time I failed a phishing test because I did a message trace and confirmed that it originated from our own internal servers.

KairuByte@lemmy.dbzer0.com on 04 Nov 00:48 collapse

I’ve “failed” phishing tests because my email client loads images by default. That’s just the way they set it up.

I reported the “you failed a phishing test” email as a phishing attempt, and funnily enough they backed off on the “mandatory training”.

Bottom line: don’t set your employees up for failure. Even the tech literate are going to fail if that’s how you set shit up.

shalafi@lemmy.world on 02 Nov 20:54 next collapse

Perhaps because corporate security training is boring as hell?

I worked up a training class over the course of a year. Ridiculous to take so long, but I wanted to nail it. I figured there were three key things.

  1. The things I talked about had to be relevant to the employees. I pared the stories down to items they could actually encounter. This is how an attack can affect you, how it can affect us. Here are things I’ve seen right here at our business.

  2. Anything I wanted to talk about had to come with actionable prevention techniques. Here’s the problem, here’s what you can do about it. They had to feel empowered, not helpless.

  3. The class had to be entertaining and interesting, start to finish, no fumble fucking around, no baffling them with jargon. I rehearsed that entire year until I could do it in my sleep. Plenty of humor threaded throughout the talk.

Nervous as hell when the day finally came. I have no problem speaking to a group, love it in fact. But talking cybersecurity to non-technical people is about as boring as it gets. Business owners bought everyone lunch and we met in the conference room.

Timed it to run for 40 minutes, left space at the end for questions. Talk about a resounding success! Everyone in the room was engaged and had questions, some even staying beyond the allotted hour. Fuck me, I actually got applause! (Yes, and everyone clapped. Really.)

Phishing tests went from 25% failure to 4% failure overnight. I left a USB drive on the floor by the printer. No one touched it for three days, and then only to place it on the table.

My next job was at a software dev shop. Security training involved cutesy animated characters and multiple choice questions. Yeah, a live puppet show would have been more effective.

Jumi@lemmy.world on 02 Nov 21:23 next collapse

A good teacher builds their lessons around their pupils.

shalafi@lemmy.world on 04 Nov 02:05 collapse

This was before I watched Paul Harrell (RIP) on YouTube. Gun content, take that as you will. But the man was a masterclass in how to present information.

Tell 'em what you’re going to tell 'em. Tell 'em. Tell 'em what you just told them.

Never once talked down to anyone, except for “so-called experts”. Never assumed the audience knew specific things. Always showed examples and tests, with controls. Always spelled out any inexact differences in testing, no matter how small. Sprinkled in some dry humor, often unexpectedly. Anyone who teaches could learn from the man.

Jayb151@lemmy.world on 04 Nov 01:43 next collapse

Hell ya. I’m glad you feel really proud about that. I’ve led so many garbage trainings; it makes the great ones really stand out!

shalafi@lemmy.world on 04 Nov 02:01 collapse

Thank you! I AM proud! It’s one of the finest things I’ve accomplished in the corporate world, and actually useful.

driftWood@infosec.pub on 04 Nov 02:25 collapse

The dedication to your task is commendable 👏. This is becoming rarer by the day.

Baggie@lemmy.zip on 03 Nov 22:46 next collapse

Fond memories of my last company, where every email had its links obscured by the email client, so you couldn’t even tell where they led before you clicked on them.

Jayb151@lemmy.world on 04 Nov 01:39 collapse

I never understood this

No1@aussie.zone on 04 Nov 00:08 next collapse

Stupid people are gonna stupid.

HubertManne@piefed.social on 04 Nov 01:01 next collapse

Who’d have thunk you should maybe pay better and invest in quality employees.

Rooster326@programming.dev on 04 Nov 02:02 collapse

Is it the quality of an employee?

My boss makes double what I do. His boss - triple. The CEO speaks of record profits, and HR says we can’t afford raises.

I literally could not care less if my company gets hacked.

corsicanguppy@lemmy.ca on 04 Nov 01:14 collapse

It’s weird how private email and verified senders were problems solved like 20 years ago, and we still can’t figure it out?