Saltzer and Schroeder Principles of Secure System Design


Tags: Cyber Security, Cybersecurity, System Design, Computer Security

When it comes to building secure systems, especially ones used in sensitive environments like government or the military, certain rules or principles help guide the process. Back in 1975, two researchers, Saltzer and Schroeder, published a set of these security design principles.

1. Economy of Mechanism (Keep It Simple)

The simpler a security design is, the better. Simple systems are:

- Easier to inspect, review, and verify line by line
- Less likely to hide bugs or unintended access paths
- Easier to test thoroughly

This principle supports the idea of having a Trusted Computing Base (TCB) - a small, trusted part of the system that is responsible for keeping it secure. A smaller TCB means fewer chances of something going wrong.

2. Fail-Safe Defaults (Default Deny)

By default, users or programs should not be allowed to access things unless they are explicitly given permission. This means:

- Access decisions are based on explicit permission, not on the absence of a known threat
- When a check fails or errs, the safe outcome is to refuse access
- Anything not on the allow list is denied

This is safer than trying to detect and block bad behavior because hackers are constantly changing their tactics. Some modern tools like antivirus software break this rule by trying to detect and block threats instead of just allowing known safe things.
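Here is a minimal Python sketch of default deny. The allow list and names are hypothetical example data; the point is that anything not explicitly permitted falls through to a refusal.

```python
# Default-deny sketch: access is granted only if the (user, resource,
# action) triple appears on an explicit allow list.
ALLOWED = {
    ("alice", "report.pdf", "read"),
    ("bob", "report.pdf", "read"),
    ("alice", "report.pdf", "write"),
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    # Anything not explicitly permitted falls through to "deny".
    return (user, resource, action) in ALLOWED

print(is_allowed("alice", "report.pdf", "read"))    # True
print(is_allowed("mallory", "report.pdf", "read"))  # False: default deny
```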

3. Complete Mediation (Check Everything, Every Time)

Every time someone or something tries to access a resource (like a file or system), the system should check if it is allowed based on the security rules. No shortcuts!

Even if the user was allowed last time, the system should check again. This prevents unauthorized access through cached or since-revoked permissions.
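A small Python sketch (with illustrative names) of what complete mediation looks like in code: the guard consults the policy on every call, so a revocation takes effect immediately instead of being masked by a cached decision.

```python
# Complete-mediation sketch: the permission table is consulted on
# every access, never cached from an earlier call.
permissions = {"alice": {"report.pdf"}}

def read_file(user: str, resource: str) -> str:
    # Checked on EVERY call, so a revocation takes effect immediately.
    if resource not in permissions.get(user, set()):
        raise PermissionError(f"{user} may not read {resource}")
    return f"contents of {resource}"

print(read_file("alice", "report.pdf"))      # allowed
permissions["alice"].discard("report.pdf")   # permission revoked
# read_file("alice", "report.pdf") would now raise PermissionError,
# because the earlier "yes" is never reused.
```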

4. Open Design (No Security Secrets)

Security should not depend on hiding how the system works. Instead, it should depend only on the secrecy of keys or passwords, which are easy to change if they are ever compromised.

This way, engineers, auditors, and others can check the system for errors or flaws. Hiding how something works (known as "security by obscurity") is risky — especially if someone can figure it out or is an insider.
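Python's standard library makes this concrete. In the sketch below, the algorithms (HMAC and SHA-256) are completely public and open to audit; the only secret in the whole system is the key.

```python
import hashlib
import hmac
import secrets

# Open design in practice: HMAC and SHA-256 are fully public,
# well-studied algorithms. Anyone can inspect the mechanism.
key = secrets.token_bytes(32)            # the ONLY secret in the system
message = b"transfer 100 to account 42"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()
print(tag)  # an attacker who knows the algorithm still cannot forge this
```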

5. Separation of Privilege (Two or More People Must Approve)

Some actions should require multiple people, or multiple independent checks, before they go through.

For example, in banking, large transactions often require approval from more than one person. This adds extra safety but can also slow things down. Still, it's useful for critical operations where mistakes or misuse would be costly.
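A toy Python sketch of the banking example above, with made-up names and thresholds: a large transfer only executes once two distinct approvers have signed off.

```python
# Separation-of-privilege sketch: large transfers need sign-off from
# two DIFFERENT approvers. The threshold and names are illustrative.
REQUIRED_APPROVALS = 2

def execute_transfer(amount: int, approvers: set[str]) -> None:
    # A set guarantees the approvers are distinct people.
    if amount > 10_000 and len(approvers) < REQUIRED_APPROVALS:
        raise PermissionError("large transfers need two distinct approvers")
    print(f"transferred {amount} (approved by {sorted(approvers)})")

execute_transfer(500, {"alice"})             # small: one approval suffices
execute_transfer(50_000, {"alice", "bob"})   # large: needs two people
```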

6. Least Privilege (Only What's Needed)

People or programs should only have the minimum permissions they need to do their jobs. If a user just needs to read a file, they shouldn't be allowed to delete or edit it. This limits damage if something goes wrong, like a virus infecting a program. Designers often split big systems into small parts, each with limited powers, to follow this rule.
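The same idea shows up even at the level of file handles. A small, self-contained Python sketch (using a hypothetical config.txt created just for the example): request only the access the task needs, and the operating system blocks anything beyond it.

```python
import io

# Setup only: create the hypothetical example file.
with open("config.txt", "w") as f:
    f.write("debug = false\n")

# Least privilege: the task only needs to read, so request a
# read-only handle ("r", not "r+" or "w"). A later bug in this
# code then CANNOT corrupt the file.
with open("config.txt", "r") as f:
    data = f.read()
    try:
        f.write("oops")  # the handle was never granted write access
    except io.UnsupportedOperation:
        print("write blocked: handle is read-only")
```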

7. Least Common Mechanism (Don't Share Too Much)

Sharing system parts (like memory or processors) between users or programs can be risky. A shared part could:

- Leak one user's data to another (an unintended information path)
- Let one user's misbehavior disrupt everyone who depends on it
- Become a single target that, once compromised, affects all its users

So, it’s better to avoid shared tools unless really necessary. In extreme cases, completely separate systems (“air-gapped”) are used to keep things safe.
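As a minimal Python sketch (names are illustrative): giving each session its own private state, instead of one shared buffer, removes the channel through which data could leak between users.

```python
# Least-common-mechanism sketch: instead of one shared scratch buffer
# that could leak data between users, each session gets its own.
class Session:
    def __init__(self, user: str) -> None:
        self.user = user
        self.scratch: list[str] = []   # private to this session only

alice = Session("alice")
bob = Session("bob")
alice.scratch.append("alice's draft")
print(bob.scratch)  # []: nothing is shared, so nothing can leak across
```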

8. Psychological Acceptability (Easy to Use)

Security should be easy and natural for users. If people find a system too hard or confusing, they’ll make mistakes or try to avoid using it. When the protection tools match how users think, security improves. This idea led to a whole field called Human Factors in Security.

9. Two Additional Principles

Saltzer and Schroeder also mentioned two more principles. These are useful, but not always perfectly reliable.


9.1 Work Factor (Make Hacking Too Expensive)

Good security means it’s too expensive or time-consuming for attackers to break in. For example, if cracking a password would take 10 years, most hackers won’t bother. However, this is harder to measure for things like insider threats or software bugs.
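A quick back-of-the-envelope calculation in Python shows how this kind of work factor is estimated for password cracking. The guess rate below is an assumed figure for illustration, not a measured benchmark.

```python
# Work-factor estimate: how long would an exhaustive search of a
# password space take at an assumed guessing rate?
GUESSES_PER_SECOND = 1e10   # assumption: a well-resourced attacker

def years_to_crack(alphabet_size: int, length: int) -> float:
    keyspace = alphabet_size ** length          # total passwords to try
    seconds = keyspace / GUESSES_PER_SECOND
    return seconds / (60 * 60 * 24 * 365)

print(f"{years_to_crack(26, 8):,.6f} years")    # 8 lowercase letters: trivial
print(f"{years_to_crack(94, 12):,.0f} years")   # 12 printable chars: infeasible
```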


9.2 Compromise Recording (Log Everything)

Sometimes, instead of stopping an attack, it helps to keep good records so you know when and how something went wrong. Security logs help detect and respond to problems quickly. But relying only on logs instead of prevention can be risky, depending on the situation.
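A minimal sketch with Python's standard logging module: denied accesses are written to an audit file rather than silently dropped, so an incident can be reconstructed later. Filenames and event fields are illustrative.

```python
import logging

# Compromise-recording sketch: every access decision leaves a trace
# in an audit log that can be reviewed after an incident.
logging.basicConfig(
    filename="audit.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

def guarded_read(user: str, resource: str, allowed: bool) -> None:
    if allowed:
        logging.info("ACCESS user=%s resource=%s", user, resource)
    else:
        logging.warning("DENIED user=%s resource=%s", user, resource)

guarded_read("alice", "report.pdf", allowed=True)
guarded_read("mallory", "report.pdf", allowed=False)  # recorded, not silent
```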


10. Final Thoughts

Saltzer and Schroeder’s principles are like the golden rules of secure system design. While technology has changed a lot since 1975, these ideas still help guide engineers, developers, and security professionals in building safer systems. Whether you're designing software, managing networks, or just curious about how digital security works — these principles are a great place to start.

Author: Mikhail
