Everyone else is doing it! Social Proof in Security Awareness

https://elevatesecurity.com/social-proof-superpower/

I came across this super interesting article about applying peer pressure to your security awareness program. I think if it's done correctly, it could be extremely effective! I mean, I sure wouldn't want to know that everyone else is more educated than I am! What do you think about comparing your users to the rest of the company?
Community Manager at Infosec!
Who we are | What we do
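
A side note on the "compare your users to the rest of the company" idea: none of the tooling below comes from the linked article; it's just a minimal Python sketch of how a social-proof comparison could be generated, assuming you can export each user's assigned and completed training modules (all names and fields are hypothetical).

```python
from dataclasses import dataclass

@dataclass
class UserProgress:
    name: str
    completed: int   # training modules finished
    assigned: int    # training modules assigned

def completion_rate(u: UserProgress) -> float:
    return u.completed / u.assigned if u.assigned else 0.0

def social_proof_message(user: UserProgress, everyone: list[UserProgress]) -> str:
    """Compare one user's completion rate against the rest of the company."""
    others = [u for u in everyone if u.name != user.name]
    company_avg = sum(completion_rate(u) for u in others) / len(others)
    mine = completion_rate(user)
    if mine >= company_avg:
        return f"Nice work - you're at {mine:.0%}, ahead of the company average of {company_avg:.0%}."
    return f"You're at {mine:.0%}; the rest of the company is already at {company_avg:.0%}."

# Hypothetical data - in practice this would be an export from your LMS or awareness platform.
staff = [UserProgress("alice", 8, 10), UserProgress("bob", 3, 10), UserProgress("carol", 10, 10)]
print(social_proof_message(staff[1], staff))
```

Whether you actually send that second message is the whole "done correctly" question from the article: the comparison is easy to compute, the tone is the hard part.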

Comments

  • EANx Member Posts: 1,077 ■■■■■■■■□□
    My current employer uses management proof to great effect. We'll have mid-level managers push back hard on any sort of IT security briefing, so we get someone a level or two higher to go through it and publicize that they did. We can then leverage that into "If he has time, you have time."
  • Infosec_Sam Admin Posts: 527 Admin
    That sounds like it would be pretty effective! It might work better than just an email blast that says "90% of the company completed this training." I'll definitely have to steal that if I ever find myself in that kind of role in the future!
    Community Manager at Infosec!
    Who we are | What we do
  • SteveLavoie Member Posts: 1,133 ■■■■■■■■■□
    edited April 2019
    Well, peer pressure can be very effective if done right. Otherwise it could lead to a toxic work culture. In one project, to monitor employee web usage, I displayed each person's top 10 web pages, with their hit counts, in the lunch room... It helped keep personal web surfing at a normal level. It was an HR disaster, but very effective and very low cost :) I was much younger, and I learnt a lot lol.
  • LisaPlaggemier Member Posts: 17 ■■■□□□□□□□
    "fun" peer pressure done right = gamification.  I know it's a buzzword right now, but people do like a little friendly competition. It doesn't work with every group, but if the culture is right, it can definitely drive participation.  I've used it with dev groups with good results.
  • Infosec_Jordan Member Posts: 1 ■■□□□□□□□□
    I like to try to figure out what incentives we could give our "champions" that would stir some emotion in other users. Other users see the gratitude and positive reinforcement when learners do the right thing, and if you keep on that path you can pull them into your "champion" group. Do you need to be mindful? Yes, because employees who feel neglected can become problems, so sometimes doing it by department can bring some teamwork into it, like Lisa mentioned above.
  • Danielm7 Member Posts: 2,310 ■■■■■■■■□□
    SteveLavoie said:
    Well, peer pressure can be very effective if done right. Otherwise it could lead to a toxic work culture. In one project, to monitor employee web usage, I displayed each person's top 10 web pages, with their hit counts, in the lunch room... It helped keep personal web surfing at a normal level. It was an HR disaster, but very effective and very low cost :) I was much younger, and I learnt a lot lol.
    Oh yeah I'd get burned alive for even suggesting such a thing. 

    We have security awareness weeks where we make it a fun competition, so the different groups can compete. But really only from a positive standpoint; there's no shaming of the people who don't do as well. Now, within each team I think they can see points, so if someone gets a bunch of negatives for failing phishing tests during the game window, someone on their team might see that and say something.
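
That points system is easy to picture as a tiny scoreboard. This isn't Danielm7's actual tooling, just a hypothetical Python sketch of how points and phishing penalties might roll up per team during the game window (event data and scoring values are made up):

```python
from collections import Counter

# Hypothetical events during the awareness-week game window:
# positive points for reporting or quiz wins, negatives for failing a phishing test.
events = [
    ("alice", "finance", +5),   # reported a simulated phish
    ("bob",   "finance", -3),   # clicked a simulated phish
    ("carol", "dev",     +5),   # reported a simulated phish
    ("carol", "dev",     +2),   # completed a bonus module
]

team_scores = Counter()
person_scores = Counter()
for person, team, points in events:
    team_scores[team] += points
    person_scores[person] += points

# Everyone sees team totals; individual scores stay within the team.
print("Team standings:", team_scores.most_common())
print("Finance members only:", {p: s for p, s in person_scores.items() if p in ("alice", "bob")})
```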
  • Infosec_Sam Admin Posts: 527 Admin
    Danielm7 said:
    SteveLavoie said:
    Well, peer pressure can be very effective if done right. Otherwise it could lead to a toxic work culture. In one project, to monitor employee web usage, I displayed each person's top 10 web pages, with their hit counts, in the lunch room... It helped keep personal web surfing at a normal level. It was an HR disaster, but very effective and very low cost :) I was much younger, and I learnt a lot lol.
    Oh yeah I'd get burned alive for even suggesting such a thing. 

    We have security awareness weeks where we make it a fun competition, so the different groups can compete. But really only from a positive standpoint; there's no shaming of the people who don't do as well. Now, within each team I think they can see points, so if someone gets a bunch of negatives for failing phishing tests during the game window, someone on their team might see that and say something.
    I definitely agree that a fun competition is the right way to go for this, and I'm not really a huge fan of shaming the losers either. What I'm wondering is how we motivate the people who refuse to take their training. I'm thinking that's where the peer pressure may be a bit more effective.
    Community Manager at Infosec!
    Who we are | What we do
  • Danielm7 Member Posts: 2,310 ■■■■■■■■□□
    I ran a big tabletop exercise last year involving HR and all the other business groups. I based it off phishing, ransomware, etc. When we got to the "but people just don't complete their training!" part, I asked HR what they could do. Silence. How about tying it to their annual raise / bonus? It's not a done deal yet, but it's being tossed around. Like, want to do your final annual review? Then you need to complete all required training for the year, or it's locked.
  • LisaPlaggemier Member Posts: 17 ■■■□□□□□□□
    @Danielm7 Do you also run phishing simulations?  Sometimes the best place to start is with the folks that are your biggest risk.  I've had more success partnering with HR on security training when we focused together on training for repeat offenders.  HR proposed an escalation process that we partnered on.  We never actually had to get all the way to the end of the process with anyone, so that was a win. 

    Of course we still did annual compliance training, but taking a risk-based approach is easier for the business to get their heads around.
  • cyberguypr Mod Posts: 6,928 Mod
    In my $dayjob we've made excellent progress with a dual approach: positive and negative. On the positive side, we do public recognition of a selection of those who pass all tests (not clicking + reporting). We do certificates and even "human firewall" plaques. That was the initial approach, but we realized we needed a punitive component to achieve the rates we intended. We then established two programs: Phishing University and a progressive discipline approach.

    Phishing University is a small-group, hands-on, interactive phishing training that we make mandatory for those who click twice in a specified period. First they have to go through a recap of the "how to identify phishing" basics. Then we give them iPads and they go on the web, research a company, and build a custom phish message based on their findings. You can see people's faces light up when they realize how easy it is to find info on random top executives and craft hooks based on that. This has been wildly successful, and some people even go through the program voluntarily.

    On the "negative" side we also implemented a repeat offenders program. We have left the door open to serious consequences up to and including termination, but like Lisa said, so far most people who end up in it get the message after HR has a friendly talk with them.
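
cyberguypr didn't describe any tooling, so this is only a guess at how the "clicked twice in a specified period" trigger for Phishing University enrollment could be automated. A minimal Python sketch, assuming you can export click events as (user, timestamp) pairs from your phishing platform (all names are hypothetical):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical click-event export: (username, time the user clicked a simulated phish)
clicks = [
    ("dave", datetime(2019, 3, 2, 9, 15)),
    ("dave", datetime(2019, 4, 20, 14, 5)),
    ("erin", datetime(2019, 4, 25, 11, 30)),
]

def repeat_clickers(events, window=timedelta(days=90)):
    """Return users with two or more phish clicks inside the rolling window."""
    by_user = defaultdict(list)
    for user, ts in events:
        by_user[user].append(ts)
    flagged = set()
    for user, times in by_user.items():
        times.sort()
        for earlier, later in zip(times, times[1:]):
            if later - earlier <= window:
                flagged.add(user)   # candidate for the mandatory hands-on session
                break
    return flagged

print(repeat_clickers(clicks))  # {'dave'}
```

The 90-day window is an arbitrary placeholder; the "specified period" in the actual program could be anything.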
  • LisaPlaggemier Member Posts: 17 ■■■□□□□□□□
    @cyberguypr   I LOVE the Phishing University and having them build their own phish.  What a fantastic idea! <3 
  • Danielm7 Member Posts: 2,310 ■■■■■■■■□□
    LisaPlaggemier said:
    @Danielm7 Do you also run phishing simulations?  Sometimes the best place to start is with the folks that are your biggest risk.  I've had more success partnering with HR on security training when we focused together on training for repeat offenders.  HR proposed an escalation process that we partnered on.  We never actually had to get all the way to the end of the process with anyone, so that was a win. 

    Of course we still did annual compliance training, but taking a risk-based approach is easier for the business to get their heads around.
    Yes, we do. The last person who ran them did them a few times a year. I've changed the program pretty significantly: now we start with a bunch of different templates every month, and the users get random phishing messages dripped out throughout the month. This is only the first month of doing it this way, but I've noticed a few interesting things. First, a huge reduction in the gopher effect, where someone clicks and then pops out of their cube telling everyone else not to click it. We have a reporting button in Outlook that will generate a ticket, but if it's an internal test it'll tell them that and not create the ticket. Historically we've had a lot of people click the phish and then go back and click the reporting button thinking that'll cover them; that seems to be down. Managers emailing their teams telling them to avoid the specific email is also not as effective now. Also, between the timing being spread out and the users getting better at reporting, the helpdesk isn't slammed with questions about the tests anymore. 

    The only thing we really haven't ironed out yet is the HR side; they like to be very hands-off, so tying almost anything to any sort of consequence is really difficult here. 
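
Danielm7's phishing platform presumably handles the drip scheduling for him, but the core idea, spreading randomized template sends across the month per user, is simple enough to sketch. A hypothetical Python illustration (template names, user list, and the weekday rule are all made up):

```python
import random
from datetime import date, timedelta

def drip_schedule(users, templates, year, month, days_in_month=30, seed=None):
    """Give each user a random template on a random business day of the month."""
    rng = random.Random(seed)
    schedule = []
    for user in users:
        send_date = date(year, month, rng.randint(1, days_in_month))
        # Nudge weekend picks to the next weekday so tests blend into normal mail flow.
        while send_date.weekday() >= 5:
            send_date += timedelta(days=1)
        schedule.append((send_date, user, rng.choice(templates)))
    return sorted(schedule)

for when, who, template in drip_schedule(
    ["alice", "bob", "carol"],
    ["invoice", "password-reset", "shipping-notice"],
    2019, 4, seed=1,
):
    print(when, who, template)
```

Spreading the sends out like this is what kills the gopher effect: no two cube neighbors are likely to get the same template on the same day.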
  • LonerVamp Member Posts: 518 ■■■■■■■■□□
    Nice!

    Of note, that metric of people who click the phish and then report it is a great one. In fact, that's a key behavior you want to encourage. If someone gets fooled by a real phish, you'd like your security team to know about it ASAP, which will only happen with good detection or if the user reports it.

    Your whole story is otherwise almost point for point like mine (I also manage the phishing tests).

    We, unfortunately, have yet to tie failures, or even the metric itself, to anything beyond the most basic annual compliance training from HR/compliance.

    Security Engineer/Analyst/Geek, Red & Blue Teams
    OSCP, GCFA, GWAPT, CISSP, OSWP, AWS SA-A, AWS Security, Sec+, Linux+, CCNA Cyber Ops, CCSK
    2021 goals: maybe AWAE or SLAE, bunch o' courses and red team labs?
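
To LonerVamp's point that click-then-report is a metric worth tracking on its own: pulling it out of the raw campaign data is straightforward once you have the events. A hypothetical Python sketch (field names are illustrative, not any particular platform's export format):

```python
# Hypothetical per-user results from one simulated phishing campaign.
events = [
    {"user": "alice", "clicked": True,  "reported": True},   # fell for it, then raised their hand
    {"user": "bob",   "clicked": True,  "reported": False},  # fell for it and stayed quiet
    {"user": "carol", "clicked": False, "reported": True},   # spotted it and reported it
]

clicked = [e for e in events if e["clicked"]]
click_then_report = [e for e in clicked if e["reported"]]

# Of the people who fell for the phish, how many still told the security team?
rate = len(click_then_report) / len(clicked) if clicked else 0.0
print(f"Click-then-report rate: {rate:.0%}")   # 50% in this toy data
```

That second number trending up is arguably a better sign of a healthy reporting culture than the raw click rate trending down.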
  • Infosec_Sam Admin Posts: 527 Admin
    LonerVamp said:
    Nice!

    Of note, that metric of people who click the phish and then report it is a great one. In fact, that's a key behavior you want to encourage. If someone gets fooled by a real phish, you'd like your security team to know about it ASAP, which will only happen with good detection or if the user reports it.

    Your whole story is otherwise almost point for point like mine (I also manage the phishing tests).

    We, unfortunately, have yet to tie failures, or even the metric itself, to anything beyond the most basic annual compliance training from HR/compliance.
    That's pretty much the same boat I was in back when I managed our phishing sim. We hadn't been doing it for very long, so we were just starting to see meaningful trends appear. However, we were really only doing it so we could say we did it - it wasn't tied to any consequences or anything. I'm pretty sure it was just so we could check off another box with our auditors.

    I will say, it was very refreshing to see the number of tickets that would come in when we would send out a phish, though! It felt like the talk of the town for a few days, which was obviously a great sign.
    Community Manager at Infosec!
    Who we are | What we do