We all probably have behaviours that we could look to change; whether it be eating more healthily, walking rather than driving or making our passwords impenetrable. So why don’t we? Should we take on the full burden of our failures or promise that we’ll try harder next week when it’ll all be different? Or rather than considering what will make us do something next week that we’ve failed to do for countless previous weeks, should we maybe wonder what prevented us from doing these things in the first place?

Psychologist Kurt Lewin developed the Force Field Analysis Model, which sees human behaviour as, broadly speaking, subject to brakes and accelerators (1), focusing on the barriers to completion. The basic principle is that certain ‘forces’ prevent us from doing things while others drive us to do them. Professor Daniel Kahneman, whose work underpins the now well-known nudge theory, described it as ‘the best idea I ever heard in psychology’ (2); with that considerable billing, we hope we can do the idea justice.

Before you think you’ve stumbled upon the advice column in a lifestyle magazine offering premature tips about how to keep New Year’s resolutions, we will pivot to the context of cyber security. The rapid rise in cyber security breaches over the last few years has been stark and we know that a large proportion of these are likely down to human mistakes.

Stanford University Professor Jeff Hancock and security firm Tessian produced a report in 2020 looking at ‘The Psychology of Human Error’, which stated that 85% of breaches were caused by human error (3). The statistics are hard to deny, but the cause may be less clear cut.

Historically, companies have relied heavily on a constant drone of email reminders, or on linking compliance to remuneration, when dealing with cyber security issues and training. In the terms of Kurt Lewin’s theory, they tend to focus on ‘accelerators’. It’s a tempting idea that this alone can make things better, but people were aware of these measures before, so why did they still act the way they did?

Professor Kahneman advises that “When you want to influence someone’s behaviour, you can either push them, or you can ask the question ‘Why aren’t they doing it already?’” (4)

To illustrate this concept, a study was carried out in a bid to encourage people to install thermal loft insulation. People were offered either free supply and installation of loft insulation, or, for several hundred dollars, the same installation plus a service to clear the loft’s contents and put them back afterwards. Uptake for the second service was much higher, despite the considerable cost.

The ‘brake’ in this scenario was not the cost, but rather the dread of clearing out a cluttered loft.

Likewise, the reason people don’t change their password or pay due attention to a phishing e-mail may not be for want of ‘accelerators’ in the shape of training and reminders, but that there is a ‘brake’ preventing them from doing this. The reasons for each individual in each company will differ.

Professor Kahneman elaborates further on this, laying the blame on a concept known as ‘fundamental attribution error’: people underestimate the impact of an environment on the behaviour of an individual (5). It is key to understanding why people don’t carry out a certain action, even though they know they should. In most workplaces, an employee’s priorities once sat at a desk with a computer open will inevitably revolve around the day job. Busy schedules and online meetings don’t leave much room for a focus on cyber security compliance. Viewed through this lens, omissions become much more understandable. It is typically not a desire to be slack on security compliance, but rather an environment that doesn’t lend itself to a focus on it.

Perhaps, rather than thinking about how to enforce greater compliance, IT security professionals should consider what in the current environment is preventing employees from complying already. In other words, release the brake rather than press the accelerator.

We are all aware that remembering numerous secure passwords is almost impossible, so people fall back on one heavily reused password, or a familiar word with a random number tacked on the end. Rather than peppering employees with more training around secure passwords, or requiring increasingly complex combinations, some organisations are moving to a ‘passwordless world’. This relies more on One Time Passcodes (OTPs), as favoured by banks with their customers, among other measures, in place of passwords. In the spirit of Kurt Lewin’s work, they are removing the brake on employees having secure login credentials.

Inevitably some organisations will remain solely focused on ‘accelerators’ and battle on in the hope that another round of e-mails or warnings will finally change behaviour. A tempting sentiment, but not one that reflects the general nature of human behaviour.


(1) https://www.ifm.eng.cam.ac.uk/research/dstools/force-field-analysis/

(2) https://notes.misentropy.com/post/621443284303314944/accelerators-vs-brakes-kurt-lewin

(3) https://www.tessian.com/research/the-psychology-of-human-error/

(4) https://notes.misentropy.com/post/621443284303314944/accelerators-vs-brakes-kurt-lewin

(5) See: https://www.techtello.com/fundamental-attribution-error/ and https://www.simplypsychology.org/fundamental-attribution.html