Designing and Building Secure Software
Slide deck courtesy of Prof. Michael Hicks, University of Maryland, College Park (UMD)
Making secure software
• Flawed approach: Design and build software, and ignore security at first
  – Add security once the functional requirements are satisfied
• Better approach: Build security in from the start
  – Incorporate security-minded thinking into all phases of the development process
Development process
• Many development processes; four common phases:
  – Requirements
  – Design
  – Implementation
  – Testing/assurance
• Phases of development apply to the whole project, its individual components, and its refinements/iterations
• Where does security engineering fit in? All phases!
Security engineering
Phases, and the security activities that fit each:
• Requirements: Security Requirements, Abuse Cases
• Design: Architectural Risk Analysis, Security-oriented Design
• Implementation: Code Review (with tools)
• Testing/assurance: Risk-based Security Tests, Penetration Testing
Note that different SD processes have different phases and artifacts, but all involve the basics above. We’ll keep it simple and refer to these.
Software vs. Hardware
• System design contains software and hardware; mostly, we are focusing on the software
• Software is malleable and easily changed
  – Advantageous to core functionality
  – Harmful to security (and performance)
• Hardware is fast, but hard to change
  – Disadvantageous to evolution
  – Advantageous to security: can’t be exploited easily, or changed by an attack
Secure Hardware
• Security functionality in hardware
  – Intel’s AES-NI implements cryptography instructions
  – Intel SGX supports “encrypted computation”
    • For cloud computing applications
• Hardware primitives for security
  – Physically unclonable functions (PUFs): a source of unpredictable, but repeatable, randomness, useful for authentication
  – Intel MPX: primitives for fast memory-safety enforcement
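The PUF bullet above can be sketched as a challenge-response protocol. This is a toy model, not a real PUF: the device-unique physical variation is simulated here by an HMAC key, and all names are illustrative.

```python
import hmac
import hashlib
import secrets

# A real PUF derives its response from physical device variation; we
# simulate that device-unique function with an HMAC key (an assumption).
DEVICE_SECRET = secrets.token_bytes(32)  # stands in for physical variation

def puf_response(challenge: bytes) -> bytes:
    # Unpredictable to outsiders, but repeatable for the same challenge
    return hmac.new(DEVICE_SECRET, challenge, hashlib.sha256).digest()

def authenticate(stored_pairs: dict, challenge: bytes, response: bytes) -> bool:
    # Verifier checks the response against a previously enrolled pair
    expected = stored_pairs.get(challenge)
    return expected is not None and hmac.compare_digest(expected, response)

# Enrollment: record challenge-response pairs while the device is trusted
c = secrets.token_bytes(16)
enrolled = {c: puf_response(c)}

# Later: only the genuine device can answer the recorded challenge
assert authenticate(enrolled, c, puf_response(c))
assert not authenticate(enrolled, c, secrets.token_bytes(32))
```

Because responses are repeatable, the verifier needs no shared secret at authentication time, only the enrolled pairs.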
Running Example: On-line banking
[Diagram: users Alice and Bob each connect to the bank and access their own account (“Alice’s”, “Bob’s”)]
Threat Modeling (Architectural Risk Analysis)
Threat Model
• The threat model makes explicit the adversary’s assumed powers
  – Consequence: The threat model must match reality, otherwise the risk analysis of the system will be wrong
• The threat model is critically important
  – If you are not explicit about what the attacker can do, how can you assess whether your design will repel that attacker?
• This is part of architectural risk analysis
Example: Network User
• An (anonymous) user that can connect to a service via the network
• Can:
  – measure the size and timing of requests and responses
  – run parallel sessions
  – provide malformed inputs, malformed messages
  – drop or send extra messages
• Example attacks: SQL injection, XSS, CSRF, buffer overrun/ROP payloads, …
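The SQL injection attack above, and its standard defense, can be sketched with Python’s built-in sqlite3 module. The table and the attacker’s input are illustrative.

```python
import sqlite3

# A network user supplies malformed input; a string-built query parses it
# as SQL, while a parameterized query treats it purely as data.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (owner TEXT, balance INTEGER)")
db.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")

attacker_input = "bob' OR '1'='1"  # malformed input from a network user

# Vulnerable: the input is spliced into the query text
unsafe = db.execute(
    "SELECT * FROM accounts WHERE owner = '%s'" % attacker_input).fetchall()
assert len(unsafe) == 2   # the always-true OR clause matched every row

# Safe: the input is passed as a parameter, never parsed as SQL
safe = db.execute(
    "SELECT * FROM accounts WHERE owner = ?", (attacker_input,)).fetchall()
assert len(safe) == 0     # no account owner has that literal name
```

The defense works because the database driver binds the parameter after parsing, so attacker-controlled text can never change the query’s structure.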
Example: Snooping User
• Internet user on the same network as other users of some service
  – For example, someone connected to an unencrypted Wi-Fi network at a coffee shop
• Thus, can additionally:
  – Read/measure others’ messages
  – Intercept, duplicate, and modify messages
• Example attacks: Session hijacking (and other data theft), privacy-violating side-channel attacks, denial of service
Example: Co-located User
• Internet user on the same machine as other users of some service
  – E.g., malware installed on a user’s laptop
• Thus, can additionally:
  – Read/write user’s files (e.g., cookies) and memory
  – Snoop keypresses and other events
  – Read/write the user’s display (e.g., to spoof)
• Example attacks: Password theft (and other credentials/secrets)
Threat-driven Design
• Different threat models will elicit different responses
• Network-only attackers implies message traffic is safe
  – No need to encrypt communications
  – This is what telnet remote login software assumed
• Snooping attackers means message traffic is visible
  – So use encrypted Wi-Fi (link layer), encrypted network layer (IPsec), or encrypted application layer (SSL)
  – Which is most appropriate for your system?
• Co-located attacker can access local files, memory
  – Cannot store unencrypted secrets, like passwords
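The application-layer option above can be sketched with Python’s standard ssl module; the connection helper and hostname handling are a sketch, not a complete client.

```python
import ssl
import socket

# Countering a snooping attacker at the application layer: wrap a TCP
# socket in TLS (the slide's "SSL"). The context carries the policy.
context = ssl.create_default_context()  # verifies certificates by default

def open_encrypted(host: str, port: int = 443) -> ssl.SSLSocket:
    raw = socket.create_connection((host, port))
    # After the handshake, a snooper sees only ciphertext on the wire
    return context.wrap_socket(raw, server_hostname=host)

# The default context refuses unauthenticated peers, which also blocks
# a snooper who escalates to man-in-the-middle interception:
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname
```

Note that encryption hides message contents but not their size or timing, which matters for the side-channel assumptions discussed next.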
Bad Model = Bad Security
• Any assumptions you make in your model are potential holes that the adversary can exploit
• E.g.: Assuming no snooping users is no longer valid
  – Prevalence of Wi-Fi networks in most deployments
• Other mistaken assumptions
  – Assumption: Encrypted traffic carries no information
    • Not true! By analyzing the size and distribution of messages, you can infer application state
  – Assumption: Timing channels carry little information
    • Not true! Timing measurements of previous RSA implementations could be used to eventually reveal a remote SSL secret key
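The timing-channel point can be made concrete with a toy example: an early-exit comparison does less work the sooner a guess diverges from the secret, and that difference is remotely measurable. Here work is counted in steps rather than measured in time, to keep the demonstration deterministic; the secret is illustrative.

```python
import hmac

SECRET = "hunter2hunter2"

def leaky_equal(guess: str, secret: str = SECRET):
    # Returns the result plus how many characters were examined
    steps = 0
    for g, s in zip(guess, secret):
        steps += 1
        if g != s:
            return False, steps
    return len(guess) == len(secret), steps

_, near_miss = leaky_equal("hunter2hunterX")  # wrong only at the last char
_, far_miss = leaky_equal("Xunter2hunter2")   # wrong at the first char
assert near_miss == 14 and far_miss == 1      # work done leaks prefix length

# Fix: a constant-time comparison does the same work regardless of where
# (or whether) the strings differ
assert not hmac.compare_digest("Xunter2hunter2", SECRET)
```

An attacker who can time many guesses recovers the secret one prefix character at a time, which is why comparisons of secrets should use `hmac.compare_digest`.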
Finding a good model
• Compare against similar systems
  – What attacks does their design contend with?
• Understand past attacks and attack patterns
  – How do they apply to your system?
• Challenge assumptions in your design
  – What happens if an assumption is untrue?
    • What would a breach potentially cost you?
  – How hard would it be to get rid of an assumption, allowing for a stronger adversary?
    • What would that development cost?
Security Requirements
• Software requirements are typically about what the software should do
• We also want to have security requirements
  – Security-related goals (or policies)
    • Example: One user’s bank account balance should not be learned by, or modified by, another user, unless authorized
  – Required mechanisms for enforcing them
    • Example:
      1. Users identify themselves using passwords,
      2. Passwords must be “strong,” and
      3. The password database is only accessible to the login program.
Typical Kinds of Requirements
• Policies
  – Confidentiality (and Privacy and Anonymity)
  – Integrity
  – Availability
• Supporting mechanisms
  – Authentication
  – Authorization
  – Auditability
Privacy and Confidentiality
• Definition: Sensitive information is not leaked to unauthorized parties
  – Called privacy for individuals, confidentiality for data
• Example policy: bank account status (including balance) known only to the account owner
• Leaking directly or via side channels
  – Example: manipulating the system to directly display Bob’s bank balance to Alice
  – Example: determining Bob has an account at Bank A according to a shorter delay on login failure
Secrecy vs. Privacy? https://www.youtube.com/watch?v=Nlf7YM71k5U
Anonymity
• A specific kind of privacy
• Example: Non-account holders should be able to browse the bank informational site without being tracked
  – Here the adversary is the bank
  – The previous examples considered other account holders as possible adversaries
Integrity
• Definition: Sensitive information is not damaged by (computations acting on behalf of) unauthorized parties
• Example: Only the account owner can authorize withdrawals from her account
• Violations of integrity can also be direct or indirect
  – Example: Being able to specifically withdraw from the account vs. confusing the system into doing it
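One common way to enforce message integrity is a message authentication code (MAC): a tampered or forged withdrawal request fails verification. This is a sketch; the shared key and message format are assumptions.

```python
import hmac
import hashlib

# Key assumed to be established out of band between client and bank
KEY = b"shared-key-established-out-of-band"

def tag(message: bytes) -> bytes:
    # MAC binds the message contents to the shared key
    return hmac.new(KEY, message, hashlib.sha256).digest()

def accept(message: bytes, mac: bytes) -> bool:
    # Constant-time check avoids the timing leak discussed earlier
    return hmac.compare_digest(tag(message), mac)

request = b"withdraw 100 from alice"
mac = tag(request)
assert accept(request, mac)                         # intact request accepted
assert not accept(b"withdraw 900 from alice", mac)  # tampered amount rejected
```

A MAC protects integrity but not confidentiality: a snooper can still read the request, so it is usually layered with encryption.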
Availability
• Definition: A system is responsive to requests
• Example: a user may always access her account for balance queries or withdrawals
• Denial of Service (DoS) attacks attempt to compromise availability
  – by busying a system with useless work
  – or cutting off network access
Supporting mechanisms
• Leslie Lamport’s Gold Standard defines mechanisms provided by a system to enforce its requirements
  – Authentication
  – Authorization
  – Audit
• The gold standard is both requirement and design
  – The sorts of policies that are authorized determines the authorization mechanism
  – The sorts of users a system has determines how they should be authenticated
Authentication
• What is the subject of security policies?
  – Need to define a notion of identity and a way to connect an action with an identity
    • a.k.a. a principal
• How can the system tell a user is who he says he is?
  – What (only) he knows (e.g., password)
  – What he is (e.g., biometric)
  – What he has (e.g., smartphone)
• Authentication mechanisms that employ more than one of these factors are called multi-factor authentication
  – E.g., a bank may employ passwords and texting a special code to a user’s smart phone
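The “what he has” factor is often an app-based time-based one-time password rather than a texted code. A minimal TOTP sketch in the style of RFC 6238 (the key and 30-second step are the RFC’s test parameters, not anything from the slides):

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    # Counter is the number of elapsed time steps, big-endian 64-bit
    counter = struct.pack(">Q", int(at // step))
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at a digest-derived offset
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"   # RFC 6238 test secret
now = time.time()
# Phone and server compute the same code within the same 30-second window
assert totp(secret, now) == totp(secret, now)
```

Because the code depends on a shared secret *and* the current time, a stolen password alone no longer suffices, which is the point of the second factor.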
Authorization
• Defines when a principal may perform an action
• Example: Bob is authorized to access his own account, but not Alice’s account
• There are a wide variety of policies that define what actions might be authorized
  – E.g., access control policies, which could be originator-based, role-based, user-based, etc.
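The Bob/Alice example can be sketched as a user-based access-control check; the policy table and action names are illustrative.

```python
# Policy maps (principal, resource) pairs to the set of permitted actions.
# Entries are illustrative, matching the banking example.
POLICY = {
    ("bob", "bob-account"): {"read", "withdraw"},
    ("alice", "alice-account"): {"read", "withdraw"},
    ("manager", "bob-account"): {"set-interest-rate"},
}

def authorized(principal: str, resource: str, action: str) -> bool:
    # Default-deny: anything not explicitly granted is refused
    return action in POLICY.get((principal, resource), set())

assert authorized("bob", "bob-account", "withdraw")
assert not authorized("bob", "alice-account", "withdraw")  # not Alice's account
assert not authorized("bob", "bob-account", "set-interest-rate")
```

A role-based variant would map principals to roles first and key the table on roles, trading per-user precision for easier administration.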
Audit
• Retain enough information to be able to determine the circumstances of a breach or misbehavior (or establish one did not occur)
• Such information, often stored in log files, must be protected from tampering, and from access that might violate other policies
• Example: Every account-related action is logged locally and mirrored at a separate site
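Tamper protection for logs is often built as a hash chain: each entry’s hash covers the previous entry’s hash, so altering any logged action breaks every later link. A minimal sketch (mirroring to a separate site, per the slide, is out of scope here):

```python
import hashlib

def append(log: list, action: str) -> None:
    # Each entry's hash covers both its action and the previous hash
    prev = log[-1]["hash"] if log else "0" * 64
    entry = {"action": action, "prev": prev}
    entry["hash"] = hashlib.sha256(
        (entry["action"] + entry["prev"]).encode()).hexdigest()
    log.append(entry)

def verify(log: list) -> bool:
    # Recompute the chain from the start; any edit breaks a link
    prev = "0" * 64
    for e in log:
        if e["prev"] != prev or e["hash"] != hashlib.sha256(
                (e["action"] + e["prev"]).encode()).hexdigest():
            return False
        prev = e["hash"]
    return True

log = []
append(log, "login alice")
append(log, "withdraw 100")
assert verify(log)
log[1]["action"] = "withdraw 1"   # attacker edits a record...
assert not verify(log)            # ...and verification catches it
```

An attacker who can rewrite the whole chain defeats this, which is why the final hash (or the whole log) is also mirrored somewhere the attacker cannot reach.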
Defining Security Requirements
• Many processes for deciding security requirements
• Example: General policy concerns
  – Due to regulations/standards (HIPAA, SOX, etc.)
  – Due to organizational values (e.g., valuing privacy)
• Example: Policy arising from threat modeling
  – Which attacks cause the greatest concern?
    • Who are the likely adversaries and what are their goals and methods?
  – Which attacks have already occurred?
    • Within the organization, or elsewhere on related systems?
Abuse Cases
• Abuse cases illustrate security requirements
  – Where use cases describe what a system should do, abuse cases describe what it should not do
• Example use case: The system allows bank managers to modify an account’s interest rate
• Example abuse case: A user is able to spoof being a manager and thereby change the interest rate on an account
Defining Abuse Cases
• Using attack patterns and likely scenarios, construct cases in which an adversary’s exercise of power could violate a security requirement
  – Based on the threat model
  – What might occur if a security measure was removed?
• Example: Co-located attacker steals the password file and learns all user passwords
  – Possible if the password file is not encrypted
• Example: Snooping attacker replays a captured message, effecting a bank withdrawal
  – Possible if messages have no nonce
Design Flaws
Design Defects = Flaws
• Recall that software defects consist of both flaws and bugs
  – Flaws are problems in the design
  – Bugs are problems in the implementation
• We avoid flaws during the design phase
• According to Gary McGraw, 50% of security problems are flaws
  – So this phase is very important
Design vs. Implementation?
• Many different levels of system design decisions
  – Highest level: main actors (processes), interactions, and programming language(s) to use
  – Next level: decomposition of an actor into modules/components, identifying the core functionalities and how they work together
  – Next level: how to implement data types and functions, e.g., purely functionally, or using parallelism, etc.
• The last two could be implementation or design, or both
  – The distinction is a bit fuzzy
Secure Software Design
• Design the software architecture according to good principles and rules
• Risk-based analysis of the software architecture’s design
Principles and Rules
• A principle is a high-level design goal with many possible manifestations
• A rule is a specific practice that is consonant with sound design principles
• The difference between these two can be fuzzy, just as design vs. implementation is fuzzy
  – For example, there is often a principle underlying specific practices
• Principles often overlap
• The software design phase tends to focus on principles for avoiding flaws
Categories of Principles
• Prevention
– Goal: Eliminate software defects entirely
– Example: The Heartbleed bug would have been prevented by using a type-safe language, like Java
• Mitigation
– Goal: Reduce the harm from exploitation of unknown defects
– Example: Run each browser tab in a separate process, so exploitation of one tab does not yield access to data in another
• Detection (and Recovery)
– Goal: Identify and understand an attack (and undo damage)
– Example: Monitoring (e.g., of expected invariants), snapshotting
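The detection idea of monitoring expected invariants can be sketched in a few lines. This is an illustrative example only; the account structure and invariants are hypothetical, not from the slides.

```python
# Minimal sketch of invariant monitoring (Detection category).
# An attack or bug often shows up first as a broken invariant.

def check_invariants(accounts, expected_total):
    """Return a list of violated invariants over an account table."""
    violations = []
    for name, balance in accounts.items():
        if balance < 0:  # expected invariant: balances never go negative
            violations.append(f"negative balance for {name}: {balance}")
    if sum(accounts.values()) != expected_total:  # money is conserved
        violations.append(f"total changed: {sum(accounts.values())}")
    return violations

accounts = {"alice": 100, "bob": 50}
assert check_invariants(accounts, 150) == []   # healthy state

accounts["bob"] -= 60                          # simulated exploit drains an account
violations = check_invariants(accounts, 150)   # both invariants now flagged
```

A real monitor would log or alert on `violations` and trigger recovery, e.g., restoring from a snapshot.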
The Principles
• Favor simplicity
• Use fail-safe defaults
• Do not expect expert users
• Trust with reluctance
– Employ a small trusted computing base
– Grant the least privilege possible
– Promote privacy
– Compartmentalize
• Defend in depth
• Use community resources – no security by obscurity
• Monitor and trace
Classic Advice
• The classic reference on principles of secure design is The Protection of Information in Computer Systems, by Saltzer and Schroeder (1975)
• Principles:
– Economy of mechanism
– Fail-safe defaults
– Complete mediation
– Open design
– Psychological acceptability
– Separation of privilege
– Least privilege
– Least common mechanism
– (Work factor)
– (Compromise recording)
http://web.mit.edu/Saltzer/www/publications/protection/Basic.html
Comparing to our list
• Several principles reorganized/renamed
– Separation of privilege has elements of our compartmentalization and defend in depth
– Open design is like use community resources, but did not anticipate open-source code
• Monitoring is added
– Their focus was on prevention of attack, rather than recovery
• "Principle" of complete mediation dropped
– Complete mediation is not a design principle, but rather an implementation requirement
Design Category: Favor Simplicity
Favor Simplicity
• Keep it so simple it is obviously correct
– Applies to the external interface, the internal design, and the implementation
• Classically referred to as economy of mechanism
• Category: Prevention

"We've seen security bugs in almost everything: operating systems, applications programs, network hardware and software, and security products themselves. This is a direct result of the complexity of these systems. The more complex a system is–the more options it has, the more functionality it has, the more interfaces it has, the more interactions it has–the harder it is to analyze [its security]." —Bruce Schneier
FS: Use fail-safe defaults
• Some configuration or usage choices affect a system's security
– The length of cryptographic keys
– The choice of a password
– Which inputs are deemed valid
• The default choice should be a secure one
– Default key length is secure (e.g., 2048-bit RSA keys)
– No default password: cannot run the system without picking one
– Whitelist valid objects, rather than blacklist invalid ones
• E.g., don't render images from unknown sources

"… whitelisting on servers and single function servers or appliances has proven to cause near zero business or IT administration disruption"
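The whitelist-over-blacklist rule can be sketched as follows. The allowed set and function name are illustrative, assuming a hypothetical server that decides which image content types to render:

```python
# Sketch of "whitelist valid objects, rather than blacklist invalid ones".
ALLOWED_IMAGE_TYPES = {"png", "jpeg", "gif"}   # explicit whitelist

def is_renderable(content_type: str) -> bool:
    # Fail-safe default: anything not explicitly allowed is rejected,
    # including new or unknown types a blacklist would have missed.
    return content_type.lower() in ALLOWED_IMAGE_TYPES

assert is_renderable("PNG")
assert not is_renderable("svg")         # scriptable format: not whitelisted
assert not is_renderable("x-new-type")  # unknown types rejected by default
```

The key property is that forgetting to update the list fails closed (a valid type is rejected) rather than open (a dangerous type is accepted).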
FS: Do not expect expert users

[Only] Computer scientists and drug dealers have users —R. David Lankes

• Software designers should consider how the mindset and abilities of (the least sophisticated of) a system's users will affect security
• Favor simple user interfaces
– Natural or obvious choice is the secure choice
– Or avoid choices at all, if possible, when it comes to security
• Don't have users make frequent security decisions
– Want to avoid user fatigue
• Help users explore ramifications of choices
– E.g., allow an admin to explore a user's view of the access control policy they have set
Passwords
• Goal: easy to remember but hard to guess
– Turns out to be wrong in many cases! Hard to guess = hard to remember!
• Compounding problem: repeated password use
• Password cracking tools train on released data to quickly guess common passwords
– John the Ripper, http://www.openwall.com/john/
– Project Rainbow, http://project-rainbowcrack.com/
– many more …
• Top 10 worst passwords of 2013: 123456, password, 12345678, qwerty, abc123, 123456789, 111111, 1234567, iloveyou, adobe123 [from SplashData]
Password Manager
• A password manager (PM) stores a database of passwords, indexed by site
– Encrypted by a single master password chosen (and remembered) by the user, used as a key
• PM generates complicated per-site passwords
– Hard to guess, hard to remember, but the latter doesn't matter!
• Benefits:
– Only a single password for the user to remember
– User's password at any given site is hard to guess
– Compromise of the password at one site does not permit immediate compromise at other sites
• But: must still protect and remember a strong master password
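Per-site password generation can be sketched with the standard library. This is a simplified illustration: a real manager would also encrypt the vault under a key derived from the master password (e.g., via a key-derivation function such as PBKDF2), which is omitted here.

```python
# Sketch of how a password manager might generate hard-to-guess
# per-site passwords. The vault here is a plain dict for illustration;
# a real vault would be encrypted under the master password.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_site_password(length: int = 20) -> str:
    # secrets uses a cryptographically secure RNG (unlike the random module)
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

vault = {}  # site -> password
vault["example.com"] = generate_site_password()

assert len(vault["example.com"]) == 20
# Distinct sites get independent, unrelated passwords
assert generate_site_password() != generate_site_password()
```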
Password Strength Meter
• Gives the user feedback on the strength of the password
• Intended to measure guessability
• Research shows that these can work, but the design must be stringent (e.g., forcing unusual characters)
– Ur et al., "How does your password measure up? The effect of strength meters on password creation", Proc. USENIX Security Symposium, 2012
Better together
• Password manager
– One security decision, not many
• Password meter
– Users can explore the ramifications of various choices by visualizing the quality of each candidate password and the reasoning behind its score
– Do not permit poor choices (or reduce the chances of them) by enforcing a minimum score
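A meter that enforces a minimum score can be sketched as below. The scoring rules and threshold are illustrative only, far simpler than the stringent designs studied by Ur et al.

```python
# Toy strength score with an enforced minimum (hypothetical policy).
import string

def strength_score(pw: str) -> int:
    score = 0
    if len(pw) >= 12:
        score += 2
    elif len(pw) >= 8:
        score += 1
    # One point per character class used
    for charset in (string.ascii_lowercase, string.ascii_uppercase,
                    string.digits, string.punctuation):
        if any(c in charset for c in pw):
            score += 1
    return score  # 0..6

MINIMUM_SCORE = 5  # reject poor choices outright, not just warn

def acceptable(pw: str) -> bool:
    return strength_score(pw) >= MINIMUM_SCORE

assert not acceptable("123456")              # classic worst password
assert acceptable("c0rrect-H0rse-battery")   # long, mixed character classes
```

Real meters estimate guessability against cracking models rather than counting character classes, but the enforcement pattern (score, threshold, hard reject) is the same.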
Phishing
• User is tricked into thinking that a site or e-mail is legitimate, rather than a scam
• And is then tricked into installing malware or performing other harmful actions

Phishing
• Failure: site or e-mail not (really) authenticated
– Internet e-mail and web protocols were not originally designed for remote authentication
• Solution is hard to deploy
– Use hard-to-fake notions of identity, like public key cryptography
– But which system? How to upgrade gradually?
Design Category: Trust with Reluctance
Trust with Reluctance (TwR)
• Whole-system security depends on the secure operation of its parts
– These parts are trusted
• So: improve security by reducing the need to trust
– By using a better design
– By using a better implementation process
– By not making unnecessary assumptions
• If you use third-party code, how do you know what it does?
• If you are not a crypto expert, why do you think you can design/implement your own crypto algorithm?
• Categories: Prevention and mitigation
TwR: Small TCB
• The trusted computing base (TCB) comprises the system components that must work correctly to ensure security
• Keep the TCB small (and simple) to reduce overall susceptibility to compromise
• Category: Prevention
• Example: operating system kernels
– Kernels enforce security policies, but are often millions of lines of code
– A compromise in a device driver compromises security overall
– Better: minimize the size of the kernel to reduce the trusted components
• Device drivers are moved outside the kernel in micro-kernel designs
Failure: Large TCB
• Security software is part of the TCB
• Additional security layers often create vulnerabilities…
• [Table: October 2010 vulnerability watchlist. Entries (titles redacted) include local and remote privilege escalations, denials of service, buffer overflows, remote code executions, use-after-free memory corruptions, security bypasses, and a cross-site request forgery; most had no fix available at the time. 6 of the vulnerabilities are in security software.]
• But as security software grows in size and complexity, it becomes vulnerable itself, and can be bypassed
http://www.darpa.mil/WorkArea/DownloadAsset.aspx?id=2147484449
TwR: Least Privilege
• Don't give a part of the system more privileges than it needs to do its job ("need to know")
• Category: Mitigation
• Example: attenuate delegations
– Mail program delegates to an editor for authoring mails
• vi, emacs
– But many editors permit escaping to a command shell to run arbitrary programs: too much privilege!
– Better design: use a restricted editor (pico)
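Attenuating a delegation can be sketched in code: instead of handing the delegate a full editor (which may allow shell escapes), the mailer exposes only the one capability the delegate needs. All names here are illustrative, not part of any real mail program.

```python
# Sketch of an attenuated delegation: the "editor" delegate receives
# only a text-transforming capability -- no filesystem, no shell.

def restricted_edit(draft: str, editor) -> str:
    """Least privilege: the delegate may transform the draft text
    and nothing else; anything but replacement text is rejected."""
    result = editor(draft)
    if not isinstance(result, str):
        raise TypeError("editor may only return replacement text")
    return result

# The delegate is just a function over text; it has no way to
# escape to a shell the way a full vi/emacs subprocess could.
draft = "Hello Bob,"
signed = restricted_edit(draft, lambda t: t + "\n-- Alice")
assert signed.endswith("-- Alice")
```

The same idea underlies using pico over vi: shrink the delegated interface until a compromised or misbehaving delegate cannot do more than its job requires.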
Lesson: Trust is Transitive
• If you trust something, you trust what it trusts
– This trust can be misplaced
• Previous e-mail client example:
– Mailer delegates to an arbitrary editor
– The editor permits running arbitrary code
– Hence the mailer permits running arbitrary code
Rule: Input validation
• Input validation is a kind of least privilege
– You are trusting a subsystem only under certain circumstances
– Validate that those circumstances hold
• Several examples so far:
– Trust a given function if the range of its parameters is limited (e.g., within the length of a buffer)
– Trust a client form field if it contains no
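The bounds-checking example can be sketched as follows; the buffer size and function names are illustrative:

```python
# Sketch of input validation as least privilege: only trust the
# write when the circumstances (index in range) actually hold.

BUF_SIZE = 16
buffer = bytearray(BUF_SIZE)

def write_at(index: int, value: int) -> None:
    # Validate up front; reject anything outside the trusted range.
    if not (0 <= index < BUF_SIZE):
        raise ValueError(f"index {index} outside buffer of size {BUF_SIZE}")
    buffer[index] = value

write_at(3, 0x41)
assert buffer[3] == 0x41

try:
    write_at(100, 0x41)   # would be an out-of-bounds write; rejected instead
    rejected = False
except ValueError:
    rejected = True
assert rejected
```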