Better security by design
Published: November 24, 2017
Human error is one of the toughest things to guard against when planning digital security. It’s the single biggest attack surface in digital systems. And yet, security and user-experience (UX) design are generally not considered in tandem — in fact, security and usability are sometimes seen as enemies. That needs to change.
The emergence of cross-functional development teams, in particular, demands that security and UX sit together. Neither design nor security should be add-ons or afterthoughts to the development process.
The release of the updated OWASP Top 10 presents a good moment to consider how design and security can work together to reduce risk. OWASP formed as an independent, open space to raise awareness about digital security threats and help improve everyone’s defenses. Its Top 10 is a list of the most critical current web application security risks.
When your organization addresses the security vulnerabilities identified by OWASP, it's a good moment to involve your designers and usability experts in the conversation, as well as your security experts. OWASP recommends finding "natural opportunities to gather security information and feed it back into your process." The same goes for design.
Not all of the OWASP recommendations have usability implications, but a few key ones do. Here are a few ways in which software development teams can involve designers when addressing security concerns, as well as things designers should know to help keep their users and their data safe.
A1 – Injection and A7 – Cross-Site Scripting (XSS)
Injection is when an attacker gains the ability to run commands on a site using an otherwise innocent place to enter text — for example, entering malicious code into a form on a website. Cross-site scripting is another kind of injection attack. It refers to attacks which exploit vulnerabilities via user-entered content that is displayed back to a user on a page. OWASP rates both of these attacks as very easy to accomplish, and they’re made common by software that does not restrict potentially harmful input.
Designers can minimize the risk of injection attacks through understanding how the entry fields they create can be misused.
Limiting the length and kind of text that can be entered into a field helps protect against attack. Knowing which characters your entry fields need to refuse or escape also helps.
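As a minimal illustration, a server-side validator in TypeScript might look like the sketch below. The field name, length limit, and allowed character set are assumptions for the example, and validation like this complements rather than replaces defenses such as parameterized queries.

```typescript
// Illustrative sketch only: validating a hypothetical "display name" field.
// Adjust the limit and allow-list to your own requirements.
const MAX_DISPLAY_NAME_LENGTH = 50;
// Allow-list: letters, digits, spaces, and a few punctuation marks.
const DISPLAY_NAME_PATTERN = /^[\p{L}\p{N} .,'-]+$/u;

type ValidationResult = { ok: true } | { ok: false; reason: string };

function validateDisplayName(input: string): ValidationResult {
  if (input.length === 0 || input.length > MAX_DISPLAY_NAME_LENGTH) {
    return { ok: false, reason: `Must be 1-${MAX_DISPLAY_NAME_LENGTH} characters long.` };
  }
  if (!DISPLAY_NAME_PATTERN.test(input)) {
    return { ok: false, reason: "Contains characters that are not allowed." };
  }
  return { ok: true };
}

console.log(validateDisplayName("Ada Lovelace"));              // { ok: true }
console.log(validateDisplayName("<script>alert(1)</script>")); // rejected
```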
To prevent XSS, consider what precautions need to be taken when rendering user input in the browser. Freeform fields like comment fields, forums, and search fields, along with uses of JavaScript and calls to databases or other services, should be given particular attention.
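As a sketch of these precautions, the TypeScript below shows two common ways to keep user-supplied text from being interpreted as markup. It is illustrative only; most teams should lean on the automatic escaping built into their framework or a vetted sanitization library rather than hand-rolled code.

```typescript
// Illustrative sketch: escape user-supplied text before it is placed into HTML.
function escapeHtml(untrusted: string): string {
  return untrusted
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

// Safer still in the browser: assign user text via textContent so it is never
// parsed as HTML at all.
function renderComment(container: HTMLElement, commentText: string): void {
  const paragraph = document.createElement("p");
  paragraph.textContent = commentText; // not innerHTML
  container.appendChild(paragraph);
}
```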
A2 – Broken authentication and session management
OWASP's #2 listed threat is user-centric — and not in a good way. Authentication, or ensuring that users are who they say they are, is a perennial security challenge. This is mostly because of the limits of the human brain. Password systems end up encouraging users to reuse, write down, or choose weak passwords, lest they forget them. Encryption keys are so long that they're impossible for a human being to remember.

Because authentication rests so heavily on human factors, it’s important to include user experience designers in developing user flows for password creation and management, two-factor authentication, and login/logout. Designers can apply well-established design heuristics to reduce user errors, ensure interfaces express clearly to users what the system is doing, and limit the burden on users' memory.
Designers should keep in mind that login and password management are not the pages where they should get innovative. Be familiar with well-tested best practices in authentication design; OWASP publishes guidance on authentication design and on supporting users through the related flows.
The Security, Privacy, and Abuse team at Google has also established a number of best practices, which they’ve validated through their research; some of these findings have been published. And the CyLab Usable Privacy and Security (CUPS) group at Carnegie Mellon has produced a large body of research on how to support users in choosing strong passwords.
There are also some features you can include in your authentication screens to support better security practices. Paste and drag-and-drop should be enabled so that users can enter passwords from their password safes. Password safes enable users to create unique, strong passwords that would otherwise be impossible for them to remember.
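On the implementation side, supporting password safes is mostly a matter of not getting in their way. The sketch below (with a hypothetical element ID) shows the anti-pattern to avoid and how to add behavior without breaking pasted or dropped input.

```typescript
// Illustrative sketch: let password managers fill the password field.
const passwordInput = document.querySelector<HTMLInputElement>("#password");

if (passwordInput) {
  // Anti-pattern to avoid: blocking paste "for security" pushes users toward
  // weaker, memorable passwords.
  // passwordInput.addEventListener("paste", (event) => event.preventDefault());

  // If you do attach listeners (e.g. for a strength meter), leave the default
  // behavior intact so pasted and dropped values still arrive normally.
  passwordInput.addEventListener("input", () => {
    // update a strength indicator here without interfering with entry
  });
}
```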
A3 – Sensitive data exposure
OWASP recommends that companies identify the data they store, transmit, and process that requires extra protection. Credit card numbers, passwords, personally identifiable information, and health records are among the categories that need special attention. Encrypting such data, or not storing it in the first place, can help protect users.

By contrast, advocates for user privacy and security suggest casting a somewhat broader protective net when considering what is "sensitive." Location (including GPS coordinates and IP addresses), contact or friend network information, and browsing history are other categories of information that need special consideration in order to protect users, particularly vulnerable ones. In the EU, IP addresses are treated as personal data because they could be used to identify users' online activity, so they generally cannot be logged without user permission. Because of this, it is important to think about the potential risks whenever you log IP addresses in apps or metrics.
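If IP addresses do need to be recorded, one common mitigation is to truncate them before they reach logs or analytics, as in the sketch below. This is illustrative only; whether truncation is sufficient for your legal obligations is a question for your privacy and legal advisors.

```typescript
// Illustrative sketch: reduce the precision of IP addresses before logging them.
function truncateIp(ip: string): string {
  if (ip.includes(".")) {
    // IPv4: zero out the final octet, e.g. 203.0.113.42 -> 203.0.113.0
    const parts = ip.split(".");
    parts[parts.length - 1] = "0";
    return parts.join(".");
  }
  // IPv6: keep roughly the routing prefix (first four groups).
  return ip.split(":").slice(0, 4).join(":") + "::";
}

console.log(truncateIp("203.0.113.42")); // "203.0.113.0"
console.log(truncateIp("2001:db8:85a3:8d3:1319:8a2e:370:7348")); // "2001:db8:85a3:8d3::"
```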
When considering user needs, you may want to make use of persona decks that run through how best to protect sensitive data; journalists, survivors of domestic abuse, and LGBTQ individuals are among the use cases covered. Consider:
- Will anyone be seeking to physically harm users of my software? How can I help protect against that?
- Am I safely designing for multi-user use cases? To what extent will threats to these users come from those who are closest to them, maybe even in their own households?
- How might the data I collect have unexpected consequences? Consider, for example, that one husband found out his wife was pregnant by looking at the blood pressure measurements on her activity tracker.
- How might the settings of my app have unexpected consequences? For example, someone locking down an IoT thermostat so their spouse cannot control it.
- What are the possible risks of exposing who someone knows? What are the risks of exposing where they currently are?
- What information would these users want to selectively protect or display about themselves, in which situations?
A5 – Broken access control
Broken access control allows users to access parts of a system they shouldn’t be able to — for example, an employee from outside the human resources department being able to view employee records.

OWASP points to old, defunct pages as a major source of this vulnerability. To get a clear picture of where inappropriate access might happen, it is important to involve UX staff to fully sketch out user flows related to access permissions, and to get rid of pages which are no longer needed. Be sure to include edge cases and both happy and unhappy paths when you outline which screens users will and won’t be able to access.
Designers should also be aware of the role of design patterns in limiting user access and be sure to use them when designing security-sensitive interfaces. For example, drop-down menus can be limited to only display the options a given user is allowed to access.
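A sketch of that pattern, with hypothetical permission names, is shown below: the menu is built only from the options the current user is entitled to see. (The server must still enforce the same checks; the menu is a usability aid, not the access control itself.)

```typescript
// Illustrative sketch: derive the navigation menu from the user's permissions.
interface MenuItem {
  label: string;
  href: string;
  requiredPermission: string;
}

const ALL_ITEMS: MenuItem[] = [
  { label: "My profile", href: "/profile", requiredPermission: "profile:read" },
  { label: "Employee records", href: "/hr/records", requiredPermission: "hr:records:read" },
  { label: "Billing", href: "/billing", requiredPermission: "billing:read" },
];

function visibleMenu(userPermissions: Set<string>): MenuItem[] {
  return ALL_ITEMS.filter((item) => userPermissions.has(item.requiredPermission));
}

// A non-HR employee never sees "Employee records" in the first place.
const menu = visibleMenu(new Set(["profile:read", "billing:read"]));
console.log(menu.map((item) => item.label)); // ["My profile", "Billing"]
```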
Writing matters, too. An overly explicit error message when someone tries to access a page can give away details about authentication (for example, "Your username is your email address") and make it easier for criminals to target users of your system. If you're working on information architecture or SEO, think about weaknesses in your URL structure: for example, ways in which someone might manipulate a URL to increment a pattern or otherwise guess their way into access they shouldn't have.
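On the server side, the same thinking applies. The sketch below (an Express-style handler with hypothetical helpers and permission names) checks authorization on every request and returns the same generic response whether a record is forbidden or simply absent, so the reply neither grants access nor confirms that the record exists.

```typescript
// Illustrative sketch: enforce access control server-side and keep responses generic.
import express from "express";

// Hypothetical stand-ins for your real auth middleware and data layer.
function getAuthenticatedUser(req: express.Request): { permissions: Set<string> } | null {
  return (req as any).user ?? null; // assume earlier middleware attached the user
}
async function loadEmployeeRecord(id: string): Promise<Record<string, unknown> | null> {
  return null; // placeholder; query your datastore here
}

const app = express();

app.get("/hr/records/:id", async (req, res) => {
  const user = getAuthenticatedUser(req);
  if (!user || !user.permissions.has("hr:records:read")) {
    // Generic response: no hints about whether the record exists or why access failed.
    res.status(404).send("Not found");
    return;
  }
  const record = await loadEmployeeRecord(req.params.id);
  if (!record) {
    res.status(404).send("Not found");
    return;
  }
  res.json(record);
});
```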
A6 – Security misconfiguration
As with authentication, designers can help ensure users make the best decisions for security configuration. Many of these considerations overlap with best-practice design heuristics. For example, providing users with good, safe defaults is important. This includes not shipping with easily guessable default passwords, a practice which has been a major source of security problems, particularly in IoT.

Wording on error messages and warnings is also critical to ensuring that users understand what is going on, and it can ensure users don't just dismiss a warning because it is too wordy or technical.
See, for example, Google's work on making browser security warnings more understandable and less avoidable for users. Their A/B testing of how warnings were designed and worded led to a huge decrease in the number of users who allowed their browsers to access malicious sites, from 70% in 2013 down to 33% (Windows) and 17% (Android) in 2017. The EU has also published a study on this topic.
OWASP notes that it may be useful to automate scanning for security misconfiguration or unused/unnecessary services. Designers and developers may want to work together to develop flows which periodically present options for users to update their security configurations and service permissions. Google, Facebook, and Twitter have all developed good examples of how to periodically encourage users to review and update their permissions, as well as their backup authentication credentials.
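One way such a prompt might be triggered is sketched below; the six-month interval and the profile fields are assumptions for illustration, not a recommendation from OWASP or any of the companies mentioned.

```typescript
// Illustrative sketch: decide when to show a "review your security settings" prompt.
const CHECKUP_INTERVAL_MS = 1000 * 60 * 60 * 24 * 180; // roughly six months

interface SecurityProfile {
  lastSecurityReviewAt: Date | null;
  hasBackupAuthMethod: boolean;
}

function shouldPromptSecurityCheckup(profile: SecurityProfile, now: Date = new Date()): boolean {
  if (!profile.hasBackupAuthMethod) {
    return true; // missing recovery options are worth surfacing right away
  }
  if (profile.lastSecurityReviewAt === null) {
    return true; // never reviewed
  }
  return now.getTime() - profile.lastSecurityReviewAt.getTime() > CHECKUP_INTERVAL_MS;
}
```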
Older OWASP-identified risks: Insufficient attack protection
Insufficient attack protection didn't make the Top 10 in this round of OWASP recommendations, but it's still a risk UX professionals should educate themselves about. Repeated attempts to access an application are an indicator that someone is trying to attack a system. That’s why many systems limit the number of login attempts a user is allowed to make. You should learn best practices for messaging securely around login attempts. As previously mentioned, error messages should not give attackers too much detail on why their attempt failed. Work with developers and security staff to ensure that user flows around login account for attacker patterns.
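As a rough sketch of how throttling and messaging can work together, the TypeScript below limits failed attempts per account and keeps the user-facing error deliberately generic. The thresholds and the in-memory store are assumptions; production systems typically use a shared store and may add CAPTCHAs or alerting on top.

```typescript
// Illustrative sketch: throttle failed logins and avoid leaking why a login failed.
const MAX_FAILED_ATTEMPTS = 5;
const LOCKOUT_WINDOW_MS = 15 * 60 * 1000; // 15 minutes

const failedAttempts = new Map<string, { count: number; firstFailureAt: number }>();

function recordFailedLogin(username: string, now: number = Date.now()): string {
  const entry = failedAttempts.get(username);
  if (!entry || now - entry.firstFailureAt > LOCKOUT_WINDOW_MS) {
    failedAttempts.set(username, { count: 1, firstFailureAt: now });
  } else {
    entry.count += 1;
  }
  // Same wording whether the username exists or not, and whether the password
  // or the account was wrong: no extra detail for attackers.
  return "Incorrect username or password. Please try again.";
}

function isLockedOut(username: string, now: number = Date.now()): boolean {
  const entry = failedAttempts.get(username);
  return !!entry
    && entry.count >= MAX_FAILED_ATTEMPTS
    && now - entry.firstFailureAt <= LOCKOUT_WINDOW_MS;
}
```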
Include UX professionals in security conversations, too

Interestingly, OWASP itself recommends a series of design reviews to check the security of systems but doesn’t suggest that designers be included in that process. Given that interfaces and user flows may need to change to support better security, experience designers should certainly be included in any security reviews touching these elements.

User testing, as well as design review, is critical to ensure that interfaces, instructions, and other messages aren’t confusing or frightening to users. Including UX personnel and end users in your security reviews will help ensure that the recommendations from those reviews are implemented in ways that don’t frustrate users, cause them to stop using the tool, or lead them to make bad security decisions in spite of developers' best efforts.
Disclaimer: The statements and opinions expressed in this article are those of the author(s) and do not necessarily reflect the positions of ºÚÁÏÃÅ.