If your organization is like most, you are probably required to change your computer passwords every six months as a precautionary measure. It turns out, however, that changing your passwords every 180 days actually weakens your security.
That counter-intuitive advice is from Dr. Jennifer Golbeck, a computer scientist, director of the Social Intelligence Lab (Human-Computer Interaction Lab) and an associate professor in the College of Information Studies at the University of Maryland, College Park. Golbeck is a featured speaker at the AFP Annual Conference in October. Learn more about this session here and be sure to register for the AFP Annual Conference by September 16 to save $200.
Golbeck recently spoke with AFP:
AFP: You hold a doctorate in computer science and your focus is on understanding how people use social media to improve the way they interact with information. So how did you get into cybersecurity? How does that relate to human interaction?
Jen Golbeck: That’s a good question, because it was kind of a crazy route. I do a lot of work on social media, where I use artificial intelligence to analyze the data people post online to help them find more of what they want, whether that’s recommending movies or music or filtering things out. And along that route, privacy regularly reared its head as an important issue, because it turns out there are so many things we can find out about people from these seemingly anemic digital traces they leave behind.
And so, as I continued with my work, I started talking more and more about the importance of privacy and how you protect it, and that led me into the space of security tools: here is the tool you can use to protect yourself, and some of these tools are really terrible for people to use. They’re just hard for people to figure out. So it launched this second, parallel research area for me, focused on privacy and security and on making security systems easier to use. If you spend a little time thinking about it, it makes sense how that neatly overlaps with studying how people use social media.
AFP: You argue that with cybersecurity, the problem is not people behaving insecurely but security systems that are designed with no concern for their users. Aren’t people responsible for their own actions, whether it’s acting lawfully, or protecting their information from hackers?
Jen Golbeck: I think people absolutely are responsible for that, but the fact is that we have a certain set of abilities and limitations, and we’re often pushed well beyond those by the systems we have to use. Think about passwords, for example; that’s the most common security task we all face, and the standard rules call for eight characters with a mix of upper case, lower case and digits. Cognitive science tells us that people can hold only about seven items in short-term memory on average, and some people can hold even fewer. So if you come up with a truly random, properly complex eight-character password, it’s already too hard to keep in your short-term memory.
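The rough arithmetic behind that tension can be sketched quickly (an illustration, not from the interview): the same eight random characters that overload short-term memory represent a large search space for an attacker, which is exactly why the rules demand them.

```python
import math

# Illustrative arithmetic: a "properly complex" 8-character password
# drawn from upper case, lower case, and digits is one of 62**8
# possibilities, yet as a random string it is 8 unrelated symbols,
# already past the roughly 7-item short-term memory limit.

alphabet = 26 + 26 + 10          # upper + lower + digits = 62 symbols
combinations = alphabet ** 8     # size of the attacker's search space
entropy_bits = 8 * math.log2(alphabet)

print(f"{combinations:,} possible passwords")   # 218,340,105,584,896
print(f"~{entropy_bits:.1f} bits of entropy")   # ~47.6 bits
```

The mismatch is the point: security strength and memorability pull in opposite directions, which is what pushes people toward weaker workarounds.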
So, then what do you do? Well, you either make it less secure by picking something that’s a little easier for a hacker to guess, or maybe you write it down; it taxes what human brains are capable of to make people memorize passwords that way. And there are a lot of little things like that. Human-computer interaction as a research field spends a lot of time looking at psychology and cognitive science and then asking, okay, how do we minimize the cognitive load, the work people have to do, in order to accomplish a task. Security systems don’t really do that. In fact, if you get a degree in cybersecurity, you never have to take a single class about humans and how they use systems. And so we have systems designed with no understanding of human abilities or of how to design systems that are easy to use, and as a result they’re really hard to use.
Because of that, they can also get in the way of people doing their jobs. One example I give a lot is a hospital that added an auto log-out feature to the patient data terminals because doctors were walking away and leaving them logged in. The doctors shouldn’t have left them logged in, but the auto log-out was kicking in in the middle of patient exams, which is really disruptive to the job doctors actually have to do, so they started circumventing the feature. So, is that doctors not taking responsibility for their security? Maybe. I’d say it’s really them trying to do their job while the security got in the way. And so I think the right solution is security systems designed around what people can do, with knowledge of and attention to what their actual job is, which is not security. Then it’s much easier for people to behave in a secure way, because they don’t have to spend a lot of effort on the security itself.
AFP: What would a human-centered cybersecurity focus look like?
Jen Golbeck: There are a couple of different types of people who come in to work on cybersecurity, right? You have people dealing with encryption standards, really working on the back end, on your database security, stuff that no regular user ever interacts with. I never touch the databases at the university, for example. Those sorts of people don’t really have to deal with this issue; they can work on the algorithms and the encryption standards and all of that, and no user-facing design work has to happen there. But a lot of these systems are user-facing. Some of that comes through IT support, some comes through policy within the organization, and some comes through the design of the interfaces your users see.
And so, I would say an organization should, one, make sure it has someone trained in human factors or human-computer interaction on staff. That could mean someone who has taken classes in it; there are also people with degrees in that field who are computer scientists well trained in security. Those people are going to have a background in design and in all these issues of cognition, understanding tasks, and building things that face users. And I think ultimately you first want to design the system so it’s good for the people, and then figure out how to make the security work in that context. So that’s thing number one: get someone trained in this to help you build the systems people are interacting with.
Thing number two is that organizations are very resistant to changing policies, which makes sense; it’s risky to change policies. But there’s a lot of evidence out there about widely deployed security practices that shouldn’t exist and that organizations don’t seem to want to change, and the biggest of these is the policy where you have to change your password frequently.
So, at the University of Maryland we have to do it every six months. Lots of organizations have a version of it: you’re required to change your password every certain number of months. And the evidence says that actually makes your systems less secure, because of the people. You can say, well, theoretically it’s harder to hack a password, because even if someone gets your old one, they can’t get in once you’ve changed it. But in practice people do the same kind of thing I do: I put a number at the end and just increment it, or drop it down by one, every time I change it, because it’s too hard to remember a totally different password every six months. And that’s what people do. So you actually make your system less secure; that policy makes people pick less secure passwords.
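To make that failure mode concrete, here is a small illustrative sketch (not from the interview; the function and sample password are hypothetical) of how cheaply an attacker who obtains one old password can guess its "rotated" successors when users just bump a trailing number:

```python
import re

def rotation_candidates(leaked, steps=5):
    """Guess passwords a user likely rotated to from a leaked one by
    bumping (or dropping) a trailing number, the pattern described above."""
    match = re.search(r"^(.*?)(\d+)$", leaked)
    if not match:
        # No trailing digits: try appending small counters instead.
        return [f"{leaked}{i}" for i in range(1, steps + 1)]
    stem, digits = match.groups()
    n, width = int(digits), len(digits)
    candidates = []
    for delta in range(1, steps + 1):
        candidates.append(f"{stem}{n + delta:0{width}d}")      # incremented
        if n - delta >= 0:
            candidates.append(f"{stem}{n - delta:0{width}d}")  # decremented
    return candidates

# Hypothetical leaked password; the forced "new" password is among
# the first handful of guesses, so the rotation bought very little.
print(rotation_candidates("Tr0ub4dor3", steps=2))
```

The point of the sketch is that forced rotation only helps if the new password is independent of the old one, and in practice it rarely is.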
All the evidence says you shouldn’t make people do this. And yet I go around and talk about this, I tell people, but organizations don’t want to make that change; they insist that people keep changing their passwords. And that’s the sort of thing I think organizations really need to take a hard look at. There are a lot of people doing good, well-founded scientific research in this space, and when they come out with results showing that a policy actually makes things less secure, and also makes things worse for your users, you really want to strongly consider changing that policy. That scares people, and a lot of security people don’t like it, but it’s the right move. I think it’s an organizational culture decision: if you start paying attention to the things that make the humans using the system more secure, you end up making it better for your people and, as a result, more secure overall.
AFP: It’s interesting that you said you also fall into the habit of just incrementing the number at the end of your password, because it’s hard for you as well. So I’m curious, does the University of Maryland, your employer, listen to you? Or do you sit there saying, “Hey, I’m the expert. You’re paying me to be an expert and yet you’re not listening to me”?
Jen Golbeck: It’s the latter. I’ve e-mailed our IT people, I’ve sent them these studies, I said, “By the way, I’m going to talk to the media about the response that you give me, so please give me your response,” and they just totally ignored me. So, we still are changing our passwords every six months despite my concerted effort to at least open a dialogue about it with them. They don’t want to hear it apparently.
AFP: That’s kind of sad to hear.
Jen Golbeck: I know. It’s a little frustrating but I think they probably see me as one of these kooky faculty members maybe who’s on a mission. Hopefully we’ll eventually win them over on this. But, yeah, for now I’m not even getting a response when I ask to talk about it.
AFP: I am wondering if there’s one piece of advice you could tell a treasurer, a CFO, to help make his or her organization’s information more secure—particularly payments information?
Jen Golbeck: I would say the one piece of advice is: don’t assume that the way you’re doing it now is the best way. I think a lot of people would agree with that, but it comes with the implication that you may have to change, perhaps dramatically, the way you’re doing things now to make them more secure. And it’s really scary to say you may need to throw out all the ways you’ve been doing user-facing security, whoever your users are, and potentially change them.
But it might be necessary. And I think it should be an evidence-based decision. There’s a ton of good evidence out there, and hopefully it will be built on some best practices, so you may be taking some new steps. But one of the big impediments to making systems more secure where humans are involved is that there are changes that, by all evidence, can and should be made, and organizations just won’t make them, because they resist changing what they feel comfortable doing. And so I would say: be willing to consider that you’re not doing it the best way now, and that there could be good evidence you should change the way you’re doing it, which will make it more secure all around. If you make it more secure for the people, you’ll increase the security of the whole system.