How to patch a human: our cyber security influence explained
Cyber security preparedness is built on three pillars: people, processes, and technology. While technology is a critical element of an effective cyber security program, alone it is not enough to protect against modern cyber threats.
It’s not only hackers, corporate spies, or disaffected staff who present a threat to organisations; breaches are often the unintended consequence of mistakes made by non-malicious, uninformed employees.
In its 1 July – 30 September 2018 and 1 October – 31 December 2018 reports, the Office of the Australian Information Commissioner listed human error as a major source of reported breaches (37 and 33 percent respectively).
While the largest source of reported breaches (57 and 64 percent) was attributed to “malicious or criminal attack”, a significant proportion of these exploited a human factor, such as tricking employees into clicking on a phishing email or disclosing their passwords.
These figures illustrate the fundamental role security awareness can play in an organisation’s cyber security defences, and how a strong security culture can act as a ‘force multiplier’.
Awareness doesn’t work
There is a view from some in the security community that awareness programs don’t work, and therefore technical solutions should be pursued over human ones.
But the answer is not to stop investing in these programs. Instead, we need to recast them based on an understanding of why they may not be working.
A traditional approach to awareness might be to force-feed information annually to busy staff and then have them brute-force their way through a multiple-choice questionnaire, or to publish long, dry policy documents and advisories on an intranet site.
The flaws in these approaches should hopefully be obvious to most.
The real problem with awareness, though, is that it is often a means for the security team to project its priorities onto staff – what staff should know, and how staff should consume that information. Instead, we should first consider the needs of staff: what is driving their behaviour, and what is the most effective way to engage them?
From awareness to influence
What is needed is a shift in mindset: from awareness to influence. Influence – which is ultimately the desired outcome of an awareness program – requires a deep understanding of your customer. I need more than subject matter expertise to effectively influence behaviour: I also need to understand your needs and motivations. For a security team, this requires domain expertise in human behaviour.
The skills required to configure and effectively manage a security system are not the same as the skills required to understand and influence human behaviour. While technical skills are of course critical to building any good security team – and the effectiveness of Telstra’s Cyber Influence team is entirely dependent on the expert technical skills and knowledge we have available to us – having technical experts lead human programs will likely not yield the desired results.
It’s not enough to simply know all about security, we also need specialists who understand humans and what makes them tick.
The same can be said of information versus intelligence. Any expert worth their salt can provide information. But for it to be considered intelligence – noting intelligence must inform decision-making – it needs to be relevant to the audience and it must be actionable.