DeepSec 2011 Focus: Usable Security

René Pfeiffer / June 13, 2011 / Administrivia, Conference

A few days ago we uploaded the keynote speech given by Matt Watchinski at DeepSec 2009, titled "Technology Won’t Save You, Only People Will". This statement can be turned around: technology won’t threaten you, people will. We’re not talking about insiders turned rogue. We are talking about holes in your defence caused by badly configured or mishandled security devices and software. This has nothing to do with playing the Bastard Operator from Hell and putting the blame on users or colleagues. A modern company infrastructure has to deal with a lot of complexity all by itself, and adding security won’t reduce it. Adding badly designed user interfaces (for security devices and options), confusing status and error messages, and hardly comprehensible settings will most certainly increase the risk of security incidents. Let’s face it, we all make mistakes, so let’s think about how to avoid them and how to deal with them when they happen.

We’d like to address the topic of usable security at DeepSec 2011. This aspect not only concerns the IT staff, it affects the users as well. It’s fine to talk about raising awareness of security risks, but how do you convey the message? And what should users actually do once they know enough to identify risks? If you’re going to educate your users, you have to talk about risks and remedies alike. There’s no point in leaving the more important half of the lesson in the dark.

So if you

  • have ideas about (complete) security awareness tutorials/programmes,
  • develop software and design UIs (user interfaces),
  • have made a lot of configuration mistakes and know about the implications,
  • understand the psychology of IT staff/users,
  • and you want to discuss your ideas/findings on stage,

then let us know! Submit your ideas to our Call for Papers!


About René Pfeiffer

System administrator, lecturer, hacker, security consultant, technical writer and DeepSec organisation team member. Has done some particle physics, too. Prefers encrypted messages for the sake of admiring the mathematical algorithms at work.

2 Comments

  1. One practical point to consider: even when security technology is designed according to standard usability design guidelines (which, sadly enough, is rarely the case), end users can still hardly accomplish their security tasks, which are by definition secondary to whatever they actually set out to do.

  2. Why is it so hard to design secure and usable systems?

    Usability is often the weakest link in the development of secure applications, mostly because usability is still a relatively poorly understood element of IT security. One of the main reasons for this lack of understanding is that usability and security frequently pursue different goals. For example, security standards stipulate that passwords should be long and should be changed on a regular basis, which contradicts usability criteria according to which it is better to use passwords that can be remembered easily, thus reducing the cognitive load of users (Nielsen, 2000). Conflicts between usability and security can be resolved by examining and understanding mental models of security. Our research, as well as other studies cited in the literature, shows that users still have poor mental models of security and are rarely able to distinguish between security and privacy. Nonetheless, an understanding of current mental models can lead to improvements in both security and usability in two ways. First, systems could be designed in a way that takes the incorrect or incomplete mental models of users into account. The main benefit of this approach is that it can be implemented promptly, since it is easier to adjust the system to users’ mental models than the other way around. Second, a system could help users to develop correct mental models by giving them constant feedback about its status and the meaning of the actions performed (see the sketch below).
    Nielsen, J. (2000). Security and Human Factors. 2000-11-26. http://www.useit.com/alertbox/20001126.html
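A minimal, hypothetical sketch of that feedback idea in Python (the policy rules and messages are invented for illustration, not taken from any real product): instead of rejecting a password with a bare error, each failed rule returns a plain-language explanation the user can act on, which is exactly the kind of feedback that helps build a correct mental model.

import re

# Each rule pairs a check with the message shown to the user when it fails.
# The messages are the point: they explain the policy instead of hiding it.
RULES = [
    (lambda pw: len(pw) >= 12,
     "Use at least 12 characters; length matters more than exotic symbols."),
    (lambda pw: bool(re.search(r"[A-Za-z]", pw)) and bool(re.search(r"\d", pw)),
     "Mix letters and digits."),
    (lambda pw: pw.lower() not in {"password", "letmein", "qwerty123456"},
     "This password appears on lists of commonly guessed passwords."),
]

def check_password(pw):
    """Return a list of plain-language problems; an empty list means the password passes."""
    return [message for rule, message in RULES if not rule(pw)]

if __name__ == "__main__":
    for candidate in ("qwerty123456", "correct horse battery staple 7"):
        problems = check_password(candidate)
        verdict = "accepted" if not problems else "rejected: " + "; ".join(problems)
        print(f"{candidate!r} -> {verdict}")

The same pattern applies beyond passwords: a firewall, mail filter or disk encryption tool that states the reason for a decision, not just the decision, gives users a chance to correct their mental model instead of guessing.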
