A report from the Enigma conference
The third USENIX Enigma conference was held in January 2018. Among the many interesting talks, three presentations dealing with human security behavior stood out. This article covers the key message of those talks: humans are social in their security behavior, and their decision to adopt a good security practice is hardly ever an isolated one.
Security conferences tend to be dominated by security researchers demonstrating their latest exploits. The talks are attack-oriented, narrowly focused, and usually close with a dark outlook. The security industry has been running conferences like this for twenty years and seems to prefer the format. Yet, if you are tired of this style, the annual USENIX Enigma conference is a welcome change of pace. Most of the talks are defense-oriented, they look far beyond technology alone, and they generally focus on successful solutions.
The talks are brief: only twenty minutes long, with ten additional minutes of Q&A; the conference is clearly trying to emulate TED talks in terms of quality and focus. All speakers go through three separate training rounds where they present to each other remotely in order to polish their talks. Gender and racial diversity plays a big role at Enigma and the conference appeared to have reached something close to gender parity this year; the regular queues in front of the women's restrooms were a constant source of comments.
I was attracted to Enigma when I saw the video [YouTube] of Rob Joyce's talk from the first edition of the conference in 2016. Joyce was the director of the US National Security Agency's (NSA) Tailored Access Operations department; his presentation was about how to protect against nation-state attackers like him. The talk impressed me with his forceful insistence on basic security practices. So I attended in 2017 and returned for this year's Enigma conference in Santa Clara.
Social influence
It was the research into user behavior that stuck with me this time. A perfect example of a high-quality Enigma talk on this topic was the presentation by Sauvik Das from Georgia Tech. He talked about his research into user behavior, based on several user-interface tests at Facebook, and the adoption rates for various security practices. It is an important topic: the industry builds many security systems and companies introduce them into their services, but the majority of users ignore them, which leaves their accounts and their data at risk.
First, Das tried to measure social influence on security behavior. He measured the adoption of three Facebook security features: (1) login notifications that inform a user when a login has been performed, perhaps with stolen credentials; (2) login approval, which is Facebook's flavor of two-factor authentication via mobile-phone text messages; and (3) trusted contacts, who can help with account recovery by identifying a user who has been locked out of their account.
He looked at data from 750,000 Facebook users who had adopted one of these features and 750,000 users who had not started to use any of them. Measuring the social influence was fairly simple: how big was the exposure level among Facebook friends for a given feature? Did 1% of friends use it? 5%? 10%? And so on. At each exposure level, he compared the adoption rate and came to the conclusion that there is a strong correlation and thus social influence. But there was also a surprising finding: early adopters can have a negative effect on the adoption rate of a security feature. If they come across as paranoid or nutty geeks, then their use of a feature will stigmatize it and hinder the further adoption among their friends. But this does not negate the overall conclusion: the more exposure users get to a feature via their friends, the more likely they are to adopt it.
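To make the shape of that comparison concrete, here is a minimal sketch of how adoption rates could be bucketed by friend exposure. The data layout, bucket boundaries, and sample numbers are assumptions made purely for illustration; they are not Das's actual code or data.

```python
from collections import defaultdict

# Hypothetical per-user records: the fraction of a user's friends already
# using a security feature, and whether the user adopted it themselves.
users = [
    {"friend_exposure": 0.002, "adopted": False},
    {"friend_exposure": 0.03,  "adopted": False},
    {"friend_exposure": 0.07,  "adopted": True},
    {"friend_exposure": 0.12,  "adopted": True},
    # ... the study looked at roughly 1.5 million users
]

def bucket(exposure):
    """Map an exposure fraction to a coarse exposure level."""
    if exposure < 0.01:
        return "<1%"
    if exposure < 0.05:
        return "1-5%"
    if exposure < 0.10:
        return "5-10%"
    return ">=10%"

counts = defaultdict(lambda: [0, 0])   # level -> [adopters, total]
for user in users:
    level = bucket(user["friend_exposure"])
    counts[level][0] += int(user["adopted"])
    counts[level][1] += 1

# A rising adoption rate across exposure levels is the correlation the
# talk described as evidence of social influence.
for level in ["<1%", "1-5%", "5-10%", ">=10%"]:
    adopters, total = counts[level]
    rate = adopters / total if total else 0.0
    print(f"exposure {level:>6}: adoption rate {rate:.0%} ({total} users)")
```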
Having laid that groundwork, Das moved on to a series of experiments where he tried to influence the adoption rate among 50,000 Facebook users by varying their exposure to social influence. He presented them with an announcement introducing the security features, varied its wording, and looked at the resulting changes in the adoption rate. One group of people got the plain announcement; other groups were informed about the absolute number of friends using a feature; others got a percentage of friends.
There was a positive effect with all of the announcements; any form of social information improved the adoption rate, even if the increase was not dramatic. The best-performing variant cited the raw number of friends using a particular feature: it raised the click-through rate of the announcement by around 30% and the adoption rate by about 10%. Ten percent is not all that dramatic, but put it in perspective with the engineering cost of a feature and the relative simplicity of sharing a bit of information about adoption among a user's peers: 10% of additional adoption from a single announcement is a reasonable return on investment in terms of improved security.
Taking a step back, Das named some characteristics that help users with their security practices. Security systems need to be observable so that people can actually see and emulate the good security practices of their friends. In addition, security should allow people to act as stewards. This is based on the interesting observation that people do not care too much about their own security but they are very concerned with the security of their loved ones. Designing security systems that allow people to act on these concerns is therefore desirable.
Staff security
Das's presentation was part of a series of talks that covered security behavior at the conference. Another example was Masha Sedova's talk that looked into the ways companies can improve the practices of their staff. It would seem that awareness training sessions don't go very far and their effect is quickly lost. Otherwise, the last smoker would have quit by now. But the motivation to quit smoking — or use strong passwords — just is not there, no matter the awareness. Sedova had the numbers to back this up. She came to the conclusion that knowledge is not enough. It also takes motivation, the ability to adopt a better behavior, and then a trigger to actually start.
The harder it is to do something, the higher the motivation needs to be to change a behavior, start a new habit, and so on. In practice, this means that simplifying the steps to adopt a secure behavior is essential. The secure way to perform a task should always be the easiest way to perform it. And adoption will definitely follow.
Where this is not possible, you need motivation. Yet motivation does not always come by itself. She named two typical situations that tend to build motivation. One is a scheduled audit of a team or a project, where people grow more and more motivated as the effective date approaches; after the audit, the motivation disappears rapidly. The second is a security incident, which immediately brings a strong motivation to change security practices. Yet this motivation does not stick either; it slowly degrades as time goes on.
How can you influence this motivation as a company? It definitely pays to work with these events; use them to your advantage when trying to introduce new practices. Sedova quickly ruled out money and punishments as effective means of motivating staff, with few exceptions. For the majority of people, you need positive reinforcement to create intrinsic motivation. Here she named public recognition of good performers, competition in company events, altruism and feedback, and access to special events or exclusive swag as useful elements of a motivation program within a company. Obviously, this is all much more complicated than a simple technical solution that helps people use a system securely.
Account recovery
Grzegorz Milka illustrated the problem of offering technical solutions that are not adopted, using observations of account takeovers and recovery at Google. Accounts at Google are constantly under attack; the company goes to great lengths to protect account holders and to simplify recovery. His talk immediately made headlines in various tech outlets with the observation that 10% of Google accounts have two-factor authentication enabled. Some were stunned that the number was so low, while others were astonished that it was so high. Whatever your perspective, what I found more interesting was the general impression that Google has put a wide variety of methods into practice to prevent an account from being hijacked.
It's a risk-based, defense-in-depth strategy that is complex and tries to keep up with the evolution of the account-hijacking business. Devices are fingerprinted, the location of a login is taken into consideration, and the known usage patterns of an account are part of the equation too. These observations do not stop once the account is recovered; user behavior is constantly monitored and abnormal behavior can trigger a lockout to prevent data loss or the attack spreading to the user's contacts via phishing scams.
The example he noted was a login from a new device, then a look at the address book, and then an attempt to erase all contacts. This resembles an account takeover where the attacker copies the address book for future attacks and immediately tries to make sure the account owner can no longer warn their contacts. Google monitors the behavior of the user to try to detect this kind of malicious activity. This is necessarily far more complicated than simply providing two-factor authentication.
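Neither of these observations comes with implementation details, but as a rough illustration, a risk-based login check of this kind can be thought of as a handful of weighted signals combined into a score that decides whether to allow a login, challenge it, or block it. The signal names, weights, and thresholds below are invented for this sketch and do not describe Google's actual system.

```python
from dataclasses import dataclass

@dataclass
class LoginAttempt:
    known_device: bool   # device fingerprint seen on this account before
    usual_country: bool  # login location matches the account's history
    usual_hours: bool    # time of day fits the account's usage pattern

def risk_score(attempt: LoginAttempt) -> float:
    """Combine signals into a rough risk score between 0 and 1.

    The weights are made up for illustration; a real system would learn
    them from labeled hijacking data and use far more signals.
    """
    score = 0.0
    if not attempt.known_device:
        score += 0.5
    if not attempt.usual_country:
        score += 0.3
    if not attempt.usual_hours:
        score += 0.2
    return score

def decide(attempt: LoginAttempt) -> str:
    score = risk_score(attempt)
    if score >= 0.7:
        return "block"        # too risky: deny and alert the account owner
    if score >= 0.4:
        return "challenge"    # ask for a second factor or recovery information
    return "allow"

# A login from an unknown device in an unusual country gets blocked,
# while a familiar pattern sails through.
print(decide(LoginAttempt(known_device=False, usual_country=False, usual_hours=True)))  # block
print(decide(LoginAttempt(known_device=True, usual_country=True, usual_hours=True)))    # allow
```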
If you have a simple technical solution to a security problem, that is great. But if humans are not adopting it, then things get complex quickly. Simple technical solutions will no longer do the job and you need insight into social factors in order to understand human behavior and build your systems accordingly.
[I would like to thank the Swiss Cyber Storm conference for paying for my trip to USENIX Enigma.]