Note: This page is part of the US-CERT website archive. These documents are no longer updated and may contain outdated information, and links may no longer function. Please contact us if you have any questions about the US-CERT website archive.

Psychological Acceptability

Author(s): Michael Gegick and Sean Barnum
Maturity Levels and Audience Indicators: L4 / D/P L
SDLC Life Cycles: Design
Copyright: © Cigital, Inc. 2005-2007. Cigital retains copyrights to this material.


Security mechanisms should not inhibit access to resources. If they hinder the usability or accessibility of resources, users may opt to turn them off. Where possible, security mechanisms should be transparent to the users of the system, or at most introduce minimal obstruction, and they should be user-friendly so that they are easy to apply and understand in a software application.

Detailed Description Excerpts

According to Saltzer and Schroeder [Saltzer 75] in "Basic Principles of Information Protection" from page 10:

Psychological acceptability: It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly. Also, to the extent that the user's mental image of his protection goals matches the mechanisms he must use, mistakes will be minimized. If he must translate his image of his protection needs into a radically different specification language, he will make errors.

According to Bishop [Bishop 03] in Chapter 13, "Design Principles," in "Principle of Psychological Acceptability," from pages 348-349 [1]:

This principle recognizes the human element in computer security.

Definition 13-8. The principle of psychological acceptability states that security mechanisms should not make the resource more difficult to access than if the security mechanisms were not present.

Configuring and executing a program should be as easy and as intuitive as possible, and any output should be clear, direct, and useful. If security-related software is too complicated to configure, system administrators may unintentionally set up the software in a non-secure manner. Similarly, security-related user programs must be easy to use and output understandable messages. If a password is rejected, the password changing program should state why it is rejected rather than giving a cryptic error message. If a configuration file has an incorrect parameter, the error message should describe the proper parameter.
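Bishop's point about rejected passwords can be sketched in code. The following is an illustrative example, not from the source: a hypothetical password-change routine whose policy rules (minimum length, character classes) are assumptions chosen only to show the pattern of reporting *why* a password was rejected instead of emitting a cryptic error.

```python
import re

def check_password(password: str) -> list[str]:
    """Return human-readable reasons the password was rejected.

    An empty list means the password is acceptable. The policy rules
    below are illustrative assumptions, not a recommended standard.
    """
    reasons = []
    if len(password) < 12:
        reasons.append("must be at least 12 characters long")
    if not re.search(r"[A-Z]", password):
        reasons.append("must contain an uppercase letter")
    if not re.search(r"[0-9]", password):
        reasons.append("must contain a digit")
    return reasons

def change_password(password: str) -> str:
    reasons = check_password(password)
    if reasons:
        # Clear, actionable feedback rather than a cryptic error code.
        return "Password rejected: " + "; ".join(reasons)
    return "Password changed."
```

Note that such messages should explain the *policy* that was violated, not echo back secrets or reveal internal state; that tension is exactly the caveat Bishop raises next.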

On the other hand, security requires that the messages impart no unnecessary information.

In practice, the principle is interpreted to mean that the security mechanism may add some extra burden, but that burden must be both minimal and reasonable.

Example 1

The ssh program [2] allows a user to set up a public key mechanism for enciphering communications between systems. The installation and configuration mechanisms for the UNIX version allow one to arrange that the public key be stored locally without any password protection. In this case, one need not supply a password to connect to the remote system, but still obtains the enciphered connection.

This mechanism satisfies the principle of psychological acceptability.

Example 2

When a user supplies the wrong password during login, the system should reject the attempt with a message stating that the login failed. If it were to say that the password was incorrect, the user would know that the account name was legitimate. If the "user" were really an unauthorized attacker, she now knows an account for which she can try to guess a password.
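The uniform-failure-message behavior described above can be sketched as follows. This is a toy illustration (the user table and function names are assumptions, and real systems store salted password hashes, not plaintext): the login routine reports only that the attempt failed, never which part was wrong.

```python
import hmac

# Toy credential store for illustration only; real systems keep
# salted hashes, never plaintext passwords.
USERS = {"alice": "s3cret"}

def login(username: str, password: str) -> str:
    stored = USERS.get(username)
    # compare_digest performs a constant-time comparison, avoiding a
    # timing side channel on the password check itself.
    if stored is not None and hmac.compare_digest(
        stored.encode(), password.encode()
    ):
        return "Login successful."
    # Identical message for an unknown account and a wrong password,
    # so an attacker cannot confirm that an account name exists.
    return "Login failed."
```

A probing attacker who tries `login("mallory", ...)` and `login("alice", "guess")` sees the same response either way, which is the point of the example.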

Example 3

A mainframe system allows users to place passwords on files. Accessing the files requires that the program supply the password. Although this mechanism violates the principle as stated, it is considered sufficiently minimal to be acceptable. On an interactive system, where the pattern of file accesses is more frequent and more transient, this requirement would be too great a burden to be acceptable.


[Bishop 03]

Bishop, Matt. Computer Security: Art and Science. Boston, MA: Addison-Wesley, 2003.

[Saltzer 75]

Saltzer, Jerome H. & Schroeder, Michael D. "The Protection of Information in Computer Systems." Proceedings of the IEEE 63, 9 (September 1975): 1278-1308.

  • 1. All rights reserved. It is reprinted with permission from Addison-Wesley Professional.
  • 2. Wulf, W.; Cohen, E.; Corwin, W.; Jones, A.; Levin, R.; Pierson, C.; & Pollack, F. "HYDRA: The Kernel of a Multiprocessor System." Communications of the ACM 17, 6 (June 1974): 337-345.