
How to patch a human: our cyber security influence explained


Posted on March 22, 2019


Cyber security preparedness is built on three pillars: people, processes, and technology. While technology is a critical element of an effective cyber security program, alone it is not enough to protect against modern cyber threats.

It’s not only hackers, corporate spies, or disaffected staff who present a threat to organisations; in many cases, breaches are the unintended consequence of mistakes made by non-malicious, uninformed employees.

The Office of the Australian Information Commissioner’s quarterly reports for 1 July to 30 September 2018 and 1 October to 31 December 2018 listed human error as a major source of reported breaches (37 and 33 percent respectively).

While the largest source of reported breaches (57 and 64 percent respectively) was attributed to “malicious or criminal attack”, a significant proportion of these attacks exploited a human factor, such as tricking employees into clicking on a phishing email or disclosing their passwords.

These figures illustrate the fundamental role security awareness can play in an organisation’s cyber security defences, and how a strong security culture can act as a ‘force multiplier’.


Awareness doesn’t work

There is a view among some in the security community that awareness programs don’t work, and that technical solutions should therefore be pursued over human ones.

But the answer is not to stop investing in these programs. Instead, we need to recast them based on an understanding of why they may not be working.

A traditional approach to awareness might be to force-feed information annually to busy staff and then have them brute-force their way through a multiple-choice questionnaire, or to publish long, dry policy documents and advisories on an intranet site.

The flaws in these approaches should hopefully be obvious to most.

The real problem with awareness, though, is that it is often a means for the security team to project its priorities onto staff – what staff should know, and how staff should consume that information – instead of first asking what drives staff behaviour and what is the most effective way to engage them.

From awareness to influence

What is needed is a shift in mindset: from awareness to influence. Influence – which is ultimately the desired outcome of an awareness program – requires a deep understanding of your customer. Effectively influencing behaviour takes more than subject matter expertise; it also takes an understanding of the audience’s needs and motivations. For a security team, this requires domain expertise in human behaviour.

The skills required to configure and effectively manage a security system are not the same as the skills required to understand and influence human behaviour. While technical skills are of course critical to building any good security team – and the effectiveness of Telstra’s Cyber Influence team is entirely dependent on the expert technical skills and knowledge we have available to us – having technical experts lead human programs will likely not yield the desired results.

It’s not enough to simply know all about security, we also need specialists who understand humans and what makes them tick.

The same can be said of information versus intelligence. Any expert worth their salt can provide information. But for it to be considered intelligence – noting intelligence must inform decision-making – it needs to be relevant to the audience and it must be actionable.

Cyber security influence explained

A working paper published by the Global Cyber Security Capacity Centre, titled Cyber Security Awareness Campaigns: Why do they fail to change behaviour?, argues that “end users [already] know about the dangers. Security experts have warned them, confused them, and filled them with fear, uncertainty and doubt. People base their conscious decisions on whether they have the ability to do what is required and whether the effort will be worth it.” Achieving lasting behavioural change therefore requires us to provide advice that is understandable, relatable and – importantly – engaging.

At Telstra, this has meant working with creative agencies, for example, to develop story-driven video content more akin to a Netflix series than a traditional talking-head training video – complete with its very own trailer. Staff who viewed this video miniseries have shared it with friends, family, and colleagues, amplifying our security message not because they were asked to, but because the content is engaging and relatable.

This is influence. Our story has been memorable, shared, and impactful. Staff have responded to content that avoids information overload by retaining and sharing the key security messages.

Human problems require human solutions

Security education needs to be more than simply providing information; it must be targeted, actionable, and achievable, with simple, consistent rules of behaviour that people can follow. It must be relevant to people’s roles and interests and free of jargon, using plain language, case studies, metaphors, and allegories to explain the why (not just the what), provide context, and build a conceptual understanding for the audience.

The one form of communication that has stood the test of time is storytelling. Good stories are shared and retold, while mnemonics and slogans – like Telstra’s Five Knows of Cyber Security or the Australian Cyber Security Centre’s Essential Eight – help make information memorable. Long lists of instructions or facts, however, are discarded almost as quickly as they were relayed.

Stories told by Telstra’s Cyber Influence team often demonstrate how an empowered mindset of scepticism and gut feel trumps rote recall of static security facts (such as ‘phishing emails have typos’ or ‘look for sites with SSL’). Relying solely on static facts is particularly perilous because the threat landscape is in constant flux: a fact can be true one day and no longer applicable the next. For example, over 50 percent of phishing sites now have a valid SSL certificate.
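
To illustrate why that advice has dated, here is a minimal Python sketch (the hostname is only an example): a standard TLS check confirms that a certificate chains to a trusted authority and matches the hostname, and nothing more, so a phishing site holding any valid certificate passes it just as easily as a legitimate one.

```python
# Minimal sketch: a TLS handshake only proves the certificate is valid
# for the hostname -- it says nothing about whether the site is trustworthy.
import socket
import ssl

def has_valid_certificate(hostname: str, port: int = 443) -> bool:
    """True if the host presents a CA-trusted certificate matching `hostname`."""
    context = ssl.create_default_context()  # verifies chain and hostname
    try:
        with socket.create_connection((hostname, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=hostname):
                return True  # handshake succeeded, so the cert is "valid"
    except OSError:  # covers socket errors and ssl.SSLError subclasses
        return False

# A lookalike phishing domain with a free certificate would return True
# here just like the real site -- validity is not legitimacy.
print(has_valid_certificate("example.com"))
```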

Stories can be enhanced by actions that simplify the complexity of ‘doing’ security. For example, rather than ask staff to create 12-character passwords containing uppercase and lowercase letters, numbers, and special characters – advice that is confusing and difficult to follow – a simpler approach is to explain the concept of a passphrase, which is easy to remember but hard for criminals to guess, or to recommend a password manager to maintain strong, unique passwords across all accounts.
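
As a rough illustration of why passphrases work, here is a short Python sketch comparing guessing difficulty under the assumption of uniformly random selection (the 7,776-word list size follows the common Diceware convention; both figures are approximations): a six-word passphrase is about as hard to guess as a truly random 12-character password, yet far easier to remember.

```python
# Rough entropy comparison, assuming every character or word is chosen
# uniformly at random (human-chosen "complex" passwords rarely are).
import math

def entropy_bits(pool_size: int, picks: int) -> float:
    """Bits of entropy for `picks` independent draws from `pool_size` options."""
    return picks * math.log2(pool_size)

# A random 12-character password over ~94 printable ASCII characters.
print(f"12-char complex password: {entropy_bits(94, 12):.1f} bits")   # ~78.7

# A six-word passphrase drawn from a 7,776-word (Diceware-style) list:
# comparable strength, far easier to remember.
print(f"6-word passphrase:        {entropy_bits(7776, 6):.1f} bits")  # ~77.5
```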

Simply telling people to do something isn’t enough. Sustained culture change takes time but, with the right expertise and focus, it can lay the groundwork to ultimately weaponise what is simultaneously the biggest strength and weakness in most companies’ cyber security posture: their people.

To learn more about cyber influence versus awareness, and to learn how we’re building a model for success, you can read Blair’s full article on Medium.