Saltzer and Schroeder Principles of Secure System Design


Tags: Cyber Security, Cybersecurity, System Design, Computer Security

When it comes to building secure systems, especially ones used in sensitive settings like government or the military, certain rules or principles help guide the process. Back in 1975, two researchers, Jerome Saltzer and Michael Schroeder, published a set of these security design principles.

1. Economy of Mechanism (Keep It Simple)

The simpler a security design is, the better. Simple systems are easier to understand, review, and test, so flaws have fewer places to hide.

This principle supports the idea of having a Trusted Computing Base (TCB) - a small, trusted part of the system that is responsible for keeping it secure. A smaller TCB means fewer chances of something going wrong.

2. Fail-Safe Defaults (Default Deny)

By default, users or programs should not be allowed to access things unless they are explicitly given permission. This means the system denies any request that no rule specifically allows, so an omission in the rules fails closed rather than open.

This is safer than trying to detect and block bad behavior, because attackers are constantly changing their tactics. Some modern tools, like antivirus software, break this rule by trying to detect and block known threats instead of allowing only known-safe things.
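
To make the idea concrete, here is a minimal sketch of a default-deny check in Python (the ACL contents and names are made up for illustration): any user/resource pair without an explicit entry falls through to an empty permission set, which means denial.

```python
# Minimal fail-safe-defaults sketch: access is denied unless an
# explicit rule grants it. The ACL below is illustrative, not a real API.
ACL = {
    ("alice", "report.txt"): {"read"},
    ("bob", "report.txt"): {"read", "write"},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    # No entry means an empty grant set, so the default outcome is "deny".
    return action in ACL.get((user, resource), set())

print(is_allowed("alice", "report.txt", "read"))    # True: explicitly granted
print(is_allowed("alice", "report.txt", "write"))   # False: never granted
print(is_allowed("mallory", "report.txt", "read"))  # False: unknown user, default deny
```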

3. Complete Mediation (Check Everything, Every Time)

Every time someone or something tries to access a resource (like a file or system), the system should check if it is allowed based on the security rules. No shortcuts!

Even if the user was allowed last time, the system should check again. This avoids giving unauthorized access due to reused permissions.
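
A small sketch of what this looks like in code (the permission table is hypothetical): the policy is re-evaluated on every call, so revoking a permission takes effect on the very next request instead of lingering in a cached decision.

```python
# Complete-mediation sketch: the policy is consulted on EVERY access,
# never cached. The permission table is a stand-in for a real policy store.
permissions = {("alice", "payroll.db"): {"read"}}

def access(user: str, resource: str, action: str) -> str:
    # Re-check the rules each time; no shortcut from a previous success.
    if action in permissions.get((user, resource), set()):
        return f"{user} may {action} {resource}"
    raise PermissionError(f"{user} may not {action} {resource}")

print(access("alice", "payroll.db", "read"))          # allowed right now
permissions[("alice", "payroll.db")].discard("read")  # permission revoked
# The next call fails immediately, because the check is repeated:
# access("alice", "payroll.db", "read")  -> PermissionError
```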

4. Open Design (No Security Secrets)

Security should not depend on hiding how the system works. Instead, the only secrets should be keys or passwords, which are easy to change if exposed.

This way, engineers, auditors, and others can check the system for errors or flaws. Hiding how something works (known as "security by obscurity") is risky — especially if someone can figure it out or is an insider.

5. Separation of Privilege (Two or More People Must Approve)

Some actions should require more than one person, or more than one credential, before they can go through.

For example, in banking, large transactions often require approval from more than one person. This adds extra safety but can also slow things down. Still, it's useful for critical operations where mistakes or misuse would be costly.
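
As a sketch of the banking example (the threshold and names are assumptions for illustration), a transfer only executes once two distinct approvers have signed off:

```python
# Separation-of-privilege sketch: a sensitive action needs sign-off from
# at least two DIFFERENT people. Threshold and names are illustrative.
REQUIRED_APPROVERS = 2

def execute_transfer(amount: int, approvers: set[str]) -> str:
    # A set deduplicates, so one person approving twice still counts once.
    if len(approvers) < REQUIRED_APPROVERS:
        raise PermissionError(
            f"need {REQUIRED_APPROVERS} distinct approvers, got {len(approvers)}"
        )
    return f"transferred ${amount:,}"

print(execute_transfer(50_000, {"alice", "bob"}))  # two approvers: goes through
# execute_transfer(50_000, {"alice"})  -> PermissionError: only one approver
```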

6. Least Privilege (Only What's Needed)

People or programs should only have the minimum permissions they need to do their jobs. If a user just needs to read a file, they shouldn't be allowed to delete or edit it. This limits the damage if something goes wrong, such as a virus infecting a program. Designers often split big systems into small parts, each with limited powers, to follow this rule.
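
One common way to apply this in code is the capability style sketched below (the ReadOnlyFile wrapper is hypothetical): a component that only needs to read a file is handed a read-only handle rather than the path, so it has no way to write to or delete the file.

```python
# Least-privilege sketch: hand out a read-only capability instead of the
# file path. The ReadOnlyFile class is a hypothetical illustration.
class ReadOnlyFile:
    """Exposes only the read operation on an underlying file."""

    def __init__(self, path: str):
        self._handle = open(path, "r")  # opened read-only at the OS level

    def read(self) -> str:
        return self._handle.read()

    def close(self) -> None:
        self._handle.close()

def summarize(doc: ReadOnlyFile) -> str:
    # This function can read the document, but it holds no permission
    # (and no interface) to modify or delete it.
    return doc.read()[:80]
```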

7. Least Common Mechanism (Don't Share Too Much)

Sharing system parts (like memory or processors) between users or programs can be risky. A shared part could leak information from one user to another, or give one user's program a way to interfere with another's.

So, it’s better to avoid shared mechanisms unless they are really necessary. In extreme cases, completely separate systems (“air-gapped”) are used to keep things safe.

8. Psychological Acceptability (Easy to Use)

Security should be easy and natural for users. If people find a system too hard or confusing, they’ll make mistakes or try to avoid using it. When the protection tools match how users think, security improves. This idea led to a whole field called Human Factors in Security.

9. Two Additional Principles

Saltzer and Schroeder also mentioned two more principles. These are useful, but they noted that the pair apply less directly to computer systems than the first eight.


9.1 Work Factor (Make Hacking Too Expensive)

Good security means it’s too expensive or time-consuming for attackers to break in. For example, if cracking a password would take 10 years, most hackers won’t bother. However, this is harder to measure for things like insider threats or software bugs.
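
The password example is easy to put numbers on. The sketch below estimates brute-force time for two password policies at an assumed rate of 10^10 guesses per second (the rate is an illustrative assumption, not a benchmark).

```python
# Work-factor sketch: worst-case brute-force time for a password space.
# The guess rate below is an assumed figure for illustration only.
def brute_force_years(alphabet_size: int, length: int,
                      guesses_per_second: float) -> float:
    keyspace = alphabet_size ** length       # total candidate passwords
    seconds = keyspace / guesses_per_second  # time to try them all
    return seconds / (365 * 24 * 3600)

# 8 lowercase letters vs. 12 mixed-case letters and digits, at 1e10 guesses/s:
print(f"{brute_force_years(26, 8, 1e10):.6f} years")    # about 21 seconds
print(f"{brute_force_years(62, 12, 1e10):,.0f} years")  # roughly 10,000 years
```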


9.2 Compromise Recording (Log Everything)

Sometimes, instead of stopping an attack, it helps to keep good records so you know when and how something went wrong. Security logs help detect and respond to problems quickly. But relying only on logs instead of prevention can be risky, depending on the situation.
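
A minimal sketch using Python's standard logging module (the file name and event fields are illustrative): every access attempt is recorded with a timestamp, and denied attempts are logged at a higher severity because they are often the first visible sign of a probe.

```python
# Compromise-recording sketch: log every access attempt, allowed or not.
# Uses the standard logging module; the file name is an assumption.
import logging

logging.basicConfig(
    filename="audit.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

def audited_access(user: str, resource: str, allowed: bool) -> None:
    # Record denials at WARNING so they stand out during review.
    if allowed:
        logging.info("ALLOW user=%s resource=%s", user, resource)
    else:
        logging.warning("DENY user=%s resource=%s", user, resource)

audited_access("alice", "payroll.db", True)     # normal use, still recorded
audited_access("mallory", "payroll.db", False)  # denied attempt, flagged
```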


10. Final Thoughts

Saltzer and Schroeder’s principles are like the golden rules of secure system design. While technology has changed a lot since 1975, these ideas still help guide engineers, developers, and security professionals in building safer systems. Whether you're designing software, managing networks, or just curious about how digital security works — these principles are a great place to start.

Author: Mikhail
