
CS306: Introduction to IT Security Fall 2020
Lecture 2: Symmetric-key Encryption
Instructor: Nikos Triandopoulos
September 8, 2020

2
2.0 Announcements

CS306: Staff
u Instructor
u Nikos Triandopoulos, ntriando@stevens.edu
u course organization / management, lectures, assignments, grades, …
u all mistakes will also be mine :)
u office hours: Thursdays 1 – 2pm (Zoom ID 91463728672) or by appointment
u Teaching assistants
u Dean Rodman (drodman@stevens.edu), Devharsh Trivedi (dtrived5@stevens.edu), Joseph Iervasi (jiervasi@stevens.edu), Mohammad Khan (mkhan13@stevens.edu), Joshua Mimer (jmimer@stevens.edu), Uday Samavenkata (usamaven@stevens.edu)
u assistance w/ labs, assignments, help sessions, grading, demos, …
3
NEW

CS306: TA hours
u Standard schedule, starting from tomorrow (same Zoom ID across all days)

Day        Time           Zoom ID      Staff
Monday     13:00 – 14:00  91463728672  Dean
Tuesday    13:00 – 14:00  91463728672  Joshua
Wednesday  13:00 – 14:00  91463728672  Joseph
Thursday   13:00 – 14:00  91463728672  Nikos
Friday     13:00 – 14:00  91463728672  Uday

u Additional TA hours to be added for homework assignments or before exams
4

CS306: Lectures & labs
CS306 is offered in 2 required sessions, each offered in multiple sections
u lectures (enrollment as of last week)
u CS306-A   Tue 2:00pm – 4:30pm   Online   67 / 69
u CS306-B   Tue 6:30pm – 9:00pm   Online   63 / 69
u labs
u CS306-Lx  Thursdays

x  time           enrollment (last week)
A  8 – 8:50       1
B  9:30 – 10:20   18
C  11:00 – 11:50  29
D  12:30 – 13:20  29
E  2:00 – 2:50    29
F  3:30 – 4:20    24
5

CS306: Lectures & labs (continued)
CS306 is offered in 2 required sessions, each offered in multiple sections
u lectures (enrollment as of this week)
u CS306-A   Tue 2:00pm – 4:30pm   Online   67 / 69
u CS306-B   Tue 6:30pm – 9:00pm   Online   62 / 69
u labs
u CS306-Lx  Thursdays

x  time           enrollment (this week)
A  8 – 8:50       1
B  9:30 – 10:20   17
C  11:00 – 11:50  29
D  12:30 – 13:20  28
E  2:00 – 2:50    28
F  3:30 – 4:20    26
6

CS306: Lectures & labs (continued)
u Lecture/lab sections will cover the same materials
u Changes in lecture or lab sections
u allowed (if need be) but generally discouraged (for planning purposes)
u In any case, if a section change is necessary
u students must let the TAs or instructor know well in advance
7

Our on-going semester-long project…
u Lectures take place in 2.5h slots
u CS306-A Tue 2:00pm – 4:30pm Online 67 / 69
u CS306-B Tue 6:30pm – 9:00pm Online 62 / 69
u Highly problematic & undesirable for both students & instructor
u unfortunately unavoidable due to existing scheduling restrictions
u Tentative countermeasures
u two ~10-min breaks
u spending the last 30 min on demos, special topics of interest, or offline materials
Please provide suggestions on what can make the class experience better despite 2.5h lectures
8

CS306: Lab sections schedule
u labs
u CS306-Lx Thursdays
Zoom IDs are lab-specific!

x  time           Zoom ID      TAs
B  9:30 – 10:20   91573945614  Dean, Joseph, Joshua, Uday
C  11:00 – 11:50  93061161569  Dean, Devharsh, Joseph, Joshua
D  12:30 – 13:20  94976630644  Dean/Devharsh, Joshua, Mohammad, Uday
E  14:00 – 14:50  92834271191  Devharsh, Joseph, Mohammad, Uday
F  15:30 – 16:20  94520991826  Dean, Joseph, Mohammad, Uday
9

CS306: Other announcements
u Canvas course materials are now updated
u Lab sessions start this week
u TA hours & office hours start tomorrow
10

CS306: Tentative Syllabus
Week  Date    Topics                           Reading                Assignment
1     Sep 1   Introduction                     Lecture 1
2     Sep 8   Symmetric-key crypto I
3     Sep 15  Symmetric-key crypto II
4     Sep 22  Public-key crypto I
5     Sep 29  Public-key crypto II
6     Oct 6   Access control & authentication
–     Oct 13  No class (Monday schedule)
7     Oct 20  Midterm                          All materials covered
11

CS306: Tentative Syllabus
(continued)
Week  Date               Topics                   Reading                  Assignment
8     Oct 27             Software & Web security
9     Nov 3              Network security
10    Nov 10             Database security
11    Nov 17             Cloud security
12    Nov 24             Privacy
13    Dec 1              Economics
14    Dec 8              Legal & ethical issues
15    Dec 10 (or later)  Final (closed “books”)   All materials covered*
12
* w/ focus on what is covered after the midterm

CS306: Course outcomes
u Terms
u describe common security terms and concepts
u Cryptography
u state the basics/fundamentals of secret-key and public-key cryptography concepts
u Attack & Defense
u acquire a basic understanding of attack techniques and defense mechanisms
u Impact
u acquire an understanding of the broader impact of security and its integral connection to other fields in computer science (such as software engineering, databases, operating systems) as well as other disciplines including STEM, economics, and law
u Ethics
u acquire an understanding of ethical issues in cyber-security
13

Questions?
u Please ask questions during class!
14

Last week
u Course logistics
u topic of study, enrollment eligibility, sessions
u staff, learning materials, course organization
u expectations, grading, policies, announcements
u syllabus overview, course objectives/outcomes
u Introduction to the field of IT security
u in-class discussion with a real-world example
15

Today
u Introduction to the field of IT security
u Basic concepts and terms
u Symmetric encryption
16

17
2.1 Basic security concepts & terms

What is IT security?
IT security is the prevention of, or protection against
u access to information by unauthorized recipients
u intentional but unauthorized destruction or alteration of that information
Definition from: Dictionary of Computing, Fourth Ed. (Oxford: Oxford University Press 1996).
IT security (informal definition)
u the protection of information systems from
u theft or damage to the hardware, the software, and the information on them, as well as from disruption or misdirection of the services they provide
u any possible threat
18

The ‘IT-security’ game: What’s at stake?
u Computer systems comprise assets that have (some) value
u e.g., laptops store vast personal or important information (files, photos, email, …)
u value is personal, time-dependent and often imprecise (e.g., monetary vs. emotional)
u Valuable assets deserve security protection
u to preserve their value, expressed as a security property
u e.g., personal photos should always be accessible by their owner
u or to prevent (undesired) harm, examined as a concrete attack
u e.g., permanent destruction of irreplaceable photos
19

The ‘IT-security’ game: Who are the players?
u Defenders (property-based view)
u system owners (e.g., users, administrators, etc.)
u seek to enforce one or more security properties or defeat certain attacks
u Attackers (attack-based view)
u external entities (e.g., hackers, other users, etc.)
u seek to launch attacks that break a security property or expose the system to certain threats
20

Security properties
u General statements about the value of a computer system
u Examples
u The C-I-A triad
u confidentiality, integrity, availability
u (Some) other properties
u authentication / authenticity
u non-repudiation / accountability / auditability
u anonymity
21

The C-I-A triad
u Captures the three fundamental properties that make any system valuable
Computer security seeks to prevent unauthorized viewing (confidentiality) or modification (integrity) of data while preserving access (availability)
22

Confidentiality
u An asset is viewed only by authorized parties
u e.g., conforming to originally-prescribed “read” rules via access control
u some other tools
u encryption, obfuscation, sanitization, …
[Figure: access-control policy – Subject (who) + Object (what) + Mode of access (how) = Yes/No]
23

Integrity
u An asset is modified only by authorized parties
u beyond conforming to originally-prescribed “write” access-control rules
u precise, accurate, unmodified, modified in acceptable ways by authorized people or processes, consistent, meaningful and usable
u authorized actions, separation & protection of resources, error detection & correction
u some tools
u hashing, MACs
24
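As a minimal sketch of the MAC idea mentioned above (not a construction specified in these slides), the snippet below uses Python's standard hmac and hashlib modules to show how a shared-key MAC detects modification; the key and messages are hypothetical.

```python
# Minimal sketch: detecting unauthorized modification with an HMAC over a shared secret key.
import hmac
import hashlib

key = b"shared-secret-key"            # hypothetical shared key
message = b"transfer $100 to Bob"

# Sender computes a tag over the message.
tag = hmac.new(key, message, hashlib.sha256).digest()

# Receiver recomputes the tag over what was received and compares in constant time.
received_message = b"transfer $900 to Bob"   # tampered in transit
expected = hmac.new(key, received_message, hashlib.sha256).digest()
if hmac.compare_digest(tag, expected):
    print("integrity check passed")
else:
    print("integrity check FAILED: message was modified")
```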

Availability
u An asset can be used by any authorized party
u usable, meets service’s needs, bounded waiting/completion time, acceptable outcome
u timely response, fairness, concurrency, fault tolerance, graceful cessation (if needed)
u some tools
u redundancy, fault tolerance, distributed architectures
25

Authenticity
u The ability to determine that statements, policies,
and permissions issued by persons or systems are genuine
u some tools
u digital signatures (cryptographic computations that allow entities to commit
to the authenticity of their documents in a unique way)
u achieve non-repudiation (authentic statements issued by some person or system cannot be denied)
26

Anonymity
u The property that certain records/transactions cannot be attributed to any individual
u some tools
u aggregation
u disclosure of statistics on combined data from many individuals that cannot be tied to any individual
u proxies
u trusted agents interacting on behalf of an individual in an untraceable way
u pseudonyms
u fictional identities, known only to a trusted party, that fill in for real identities
27

Discussion
1. Cloud-based storage
2. e-banking
u What is a valued asset?
u What does it mean to preserve this value?
u What is a corresponding desired security property?
u What is a harm that must be prevented?
28

The “Vulnerability – Threat – Control” paradigm
u A vulnerability is a weakness that could be exploited to cause harm
u A threat is a set of circumstances that could cause harm
u A security control is a mechanism that protects against harm
u i.e., countermeasures designed to prevent threats from exercising vulnerabilities
Thus
u Attackers seek to exploit vulnerabilities in order to impose threats
u Defenders seek to block these threats by controlling the vulnerabilities
29

A “Vulnerability – Threat – Control” example
30

Example of threat
u Eavesdropping: the interception of information intended for someone else during its transmission over a communication channel
31

Example of threat
u Alteration: unauthorized modification of information
u Example: the man-in-the-middle attack, where a network stream
is intercepted, modified, and retransmitted
32

Example of threat
u Denial-of-service: the interruption or degradation of a data service or information access
u Example: email spam,
to the degree that it is meant to simply fill up a mail queue and slow down an email server
33

Examples of threats
u Masquerading: the fabrication of information that is purported to be from someone
who is not actually the author
u e.g., IP spoofing attack: maliciously altering the source IP address of a message
u Repudiation: the denial of a commitment or data receipt
u this involves an attempt to back out of a contract/protocol that, e.g., requires the different parties to provide receipts acknowledging that data has been received
34

Example of vulnerability
u Software bugs: Code is not doing what it is supposed to be doing
u Example: Some application code is mistakenly using an algorithm for encryption that has been broken
u Example: There is no checking of array bounds
35

A hard-to-win game: Varied threats
Threats
u from natural to human
u from benign to malicious
u from random to targeted (APTs)
[Figure: threat taxonomy – natural causes (e.g., fire, power failure) vs. human causes; human causes split into benign intent (e.g., human error) and malicious intent; malicious threats are random (e.g., malicious code on a general web site) or directed (e.g., impersonation)]
36

A hard-to-win game: Unknown enemy
Attackers
u beyond isolated “crazy” hackers
u organized groups/crime
u may use computer crime (e.g., stealing CC#s) in order to finance other crimes
u terrorists
u computers/assets as target,
method, enabler, or enhancer
[Figure: spectrum of attackers – individual hacker, loosely connected group, organized crime member, criminal-for-hire, terrorist]
37

A hard-to-win game: Choose your battle
Risk management
u choose priorities
u which threats to control
u estimate possible harm & impact
u what / how many resources to devote
u estimate solution cost & protection level
u consider trade-offs balancing cost vs. benefit
u compute the residual risk
u decide on transferring risk or doing nothing
Never a “one-shot” game
[Figure: space of security controls along three dimensions – kind of threat (human or not, malicious or not, directed or not), property protected (confidentiality, integrity, availability), and control type (physical, procedural, technical)]
38

A hard-to-win game: Best-effort approach
Deciding on controls relies on incomplete information
u likelihood of attack and impact of possible harm are impossible to measure perfectly
u full set of vulnerabilities is often unknown
u weak authentication, lack of access control, errors in programs, etc.
u system’s attack surface is often too wide
u physical hazards, malicious attacks, stealthy theft by insiders, benign mistakes, impersonations, etc.
A useful strategy: The “method – opportunity – motive” view of an attack
u deny any of them and the attack will (likely) fail
39

A hard-to-win game: Best-effort approach (continued)
Controls offer a wide range of protection level / efficacy
u they counter or neutralize threats or remove vulnerabilities in different ways
Types of controls
u prevent (attack is blocked)
u deter (attack becomes harder)
u deflect (change target of attack)
u mitigate (make impact less severe)
u contain (stop propagation of harm)
u detect (real time / after the fact)
u recover (from its effects)
[Figure: layered controls around a system resource – external/internal deterrence and prevention at and inside the system perimeter, preemption, deflection of intrusion attempts toward a faux environment, detection, and response]
Hard to balance cost/effectiveness of controls with likelihood/severity of threats
40

Example of control: HTTPS protocol
Hypertext Transfer Protocol Secure (HTTPS)
u Confidentiality
u Integrity
u Availability
u Authenticity
u Anonymity
41

Example of control: RAID technology
Redundant Array of Independent Disks (RAID)
u Confidentiality
u Integrity
u Availability
u Authenticity
u Anonymity
42

Example of controls: TOR protocol
u Confidentiality
u Integrity
u Availability
u Authenticity
u Anonymity
43

As we will see: Exciting times to study (or work in) IT Security!
Relevance to practice & real-world importance
u plethora of real-world problems & real needs for security solutions
u combination of different research areas within CS and across other fields
u multi-dimensional topic of study
u protocol design, system building, user experience, social/economic aspects
u wide range of perspectives
u practical/systems vs. foundations/theory, attacker’s vs. defender’s view
44

45
2.2 Symmetric-key encryption

Recall: Confidentiality
Fundamental security property
u an asset is viewed only by authorized parties
u “C” in the CIA triad
“computer security seeks to prevent unauthorized viewing (confidentiality) or modification (integrity) of data while preserving access (availability)”
Eavesdropping
u main threat against confidentiality of in-transit data
[Figure: two defenders (sender and recipient) communicating over a channel observed by an attacker]
46

Problem setting: Secret communication
Two parties wish to communicate over a channel
u Alice (sender/source) wants to send a message m to Bob (recipient/destination)
Underlying channel is unprotected
u Eve (attacker/adversary) can eavesdrop on any sent messages
u e.g., packet sniffing over networked or wireless communications
[Figure: Alice sends m to Bob over the channel; Eve observes m in transit]
47

Solution concept: Symmetric-key encryption
Main idea
u secretly transform message so that it is unintelligible while in transit
u Alice encrypts her message m to ciphertext c, which is sent instead of plaintext m
u Bob decrypts received message c to original message m
u Eve can intercept c but “cannot learn” m from c
u Alice and Bob share a secret key k that is used for both message transformations
[Figure: Alice computes c = encrypt(m, k) and sends c; Bob computes m = decrypt(c, k); Eve sees only c; k is shared by Alice and Bob]
48

Security tool: Symmetric-key encryption scheme
Abstract cryptographic primitive, a.k.a. cipher, defined by
u a message space M (the set of possible messages); and
u a triplet of algorithms (Gen, Enc, Dec)
u Gen, Enc are probabilistic algorithms, whereas Dec is deterministic
u Gen outputs a uniformly random key k (from some key space K)
[Figure: k ← Gen; Alice sends c ← Enc(m, k); Bob computes m' ← Dec(c', k); Eve observes the channel]
49

Desired properties for symmetric-key encryption scheme
By design, any symmetric-key encryption scheme should satisfy the following
u efficiency: key generation & message transformations “are fast”
u correctness: for all m and k, it holds that Dec(Enc(m, k), k) = m
u security: one “cannot learn” plaintext m from ciphertext c
(M: set of possible messages)
[Figure: k ← Gen; Alice sends c = Enc(m, k); Bob recovers m = Dec(c, k); Eve observes c]
50
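To make the correctness condition Dec(Enc(m, k), k) = m concrete, here is a toy, XOR-based instantiation of (Gen, Enc, Dec) over 16-byte messages. This is a one-time-pad-style sketch for illustration only (the fixed message length is an assumption of the sketch, and it is insecure if a key is reused), not a scheme from the lecture.

```python
# Toy (Gen, Enc, Dec) over 16-byte messages, XOR-based (one-time-pad style).
# Illustrates correctness Dec(Enc(m, k), k) == m; NOT a production cipher.
import os

MSG_LEN = 16  # message space M: byte strings of length 16 (assumption of this sketch)

def Gen() -> bytes:
    """Sample a uniformly random key from the key space K = {0,1}^128."""
    return os.urandom(MSG_LEN)

def Enc(m: bytes, k: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(m, k))

def Dec(c: bytes, k: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(c, k))   # XOR is its own inverse

k = Gen()
m = b"attack at dawn!!"              # exactly 16 bytes
assert Dec(Enc(m, k), k) == m         # correctness holds for every m and k
```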

Kerckhoffs’s principle
“The cipher method must not be required to be secret, and it must be able to fall into the hands of the enemy without inconvenience.”
Reasoning
u due to security & correctness, Alice & Bob must share some secret info
u if no shared key captures this secret info, it must be captured by Enc, Dec
u but keeping Enc, Dec secret is problematic
u harder to keep secret an algorithm than a short key (e.g., after user revocation)
u harder to change an algorithm than a short key (e.g., after secret info is exposed)
u riskier to rely on custom/ad-hoc schemes than publicly scrutinized/standardized ones
51

Symmetric-key encryption
u Also referred to as simply “symmetric encryption”
[Figure: Plaintext → Encryption (key, optional) → Ciphertext → Decryption (key, optional) → Original Plaintext]
52

Symmetric Vs. Asymmetric encryption
[Figure (a) Symmetric Cryptosystem: Plaintext → Encryption under key K → Ciphertext → Decryption under the same key K → Original Plaintext]
[Figure (b) Asymmetric Cryptosystem: Plaintext → Encryption under an encryption key → Ciphertext → Decryption under a different decryption key → Original Plaintext]
53

Main application areas
Secure communication
u encrypt messages sent among parties
u assumption
u Alice and Bob securely generate, distribute & store shared key k
u attacker does not learn key k
Secure storage
u encrypt files outsourced to the cloud
u assumption
u Alice securely generates & stores key k
u attacker does not learn key k
[Figure: Alice and Bob exchange encrypted messages using shared key k; Alice stores encrypted files in the cloud; Eve observes but does not know k]
54

Brute-force attack
Generic attack
u given a captured ciphertext c, a known key space K and the decryption algorithm Dec
u strategy is an exhaustive search
u for all possible keys k in K
u determine if Dec(c, k) is a likely plaintext m
u requires some knowledge of the message space M
u i.e., structure of the plaintext (e.g., PDF file or email message)
Countermeasure
u key should be a random value from a sufficiently large key space K to make exhaustive search attacks infeasible
55
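A sketch of the exhaustive-search strategy just described. Here Dec and looks_like_plaintext are hypothetical placeholders standing in for the scheme's decryption algorithm and for the attacker's knowledge of the plaintext structure; they are not defined in these slides.

```python
# Sketch of a brute-force (exhaustive key search) attack.
# Dec and looks_like_plaintext are hypothetical placeholders:
#   Dec(c, k)               -> candidate plaintext of c under key k
#   looks_like_plaintext(m) -> True if m matches the expected structure
#                              (e.g., readable English, a valid PDF header)

def brute_force(c, key_space, Dec, looks_like_plaintext):
    candidates = []
    for k in key_space:                 # infeasible when the key space K is large enough
        m = Dec(c, k)
        if looks_like_plaintext(m):
            candidates.append((k, m))
    return candidates
```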

56
2.3 Classical ciphers

Substitution ciphers
Large class of ciphers
u each letter is uniquely replaced by another
u there are 26! possible substitution ciphers
u e.g., one popular substitution “cipher” for some Internet posts is ROT13
u historically
u all classical ciphers are of this type
57
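A minimal sketch of a substitution cipher: a random permutation of the alphabet, i.e., one of the 26! possibilities, is used as the secret substitution table. The plaintext shown is just an example; ROT13 is the special case that shifts every letter by 13 positions.

```python
# Sketch: a general substitution cipher over the lowercase English alphabet.
# One random permutation = one of the 26! possible substitution keys.
import random
import string

alphabet = string.ascii_lowercase
key = list(alphabet)
random.shuffle(key)                        # the secret substitution table
enc_table = str.maketrans(alphabet, "".join(key))
dec_table = str.maketrans("".join(key), alphabet)

plaintext = "meetmeatnoon"
ciphertext = plaintext.translate(enc_table)
assert ciphertext.translate(dec_table) == plaintext
print(ciphertext)
```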

General structure of classical ciphers
Based on letter substitution
u message space M is “valid words” from a given alphabet
u e.g., English text without spaces, punctuation or numerals
u characters can be represented as numbers in [0:25]
u encryption
u mapping each plaintext character into another character
u character mapping is typically defined as a “shift” of a plaintext character by a number of positions in a canonical ordering of the characters in the alphabet
u character shifting occurs with “wrap-around” (using mod 26 addition)
u decryption
u undo character shifting with “wrap-around” (using mod 26 subtraction)
58
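A sketch of the shift-with-wrap-around transformation described above, assuming messages contain only lowercase letters a-z (mapped to 0..25); the sample plaintext is illustrative.

```python
# Sketch: shift-style encryption/decryption with wrap-around (mod 26).
# Assumes the message contains only lowercase letters a-z, mapped to 0..25.

def shift_encrypt(m: str, k: int) -> str:
    return "".join(chr((ord(ch) - ord('a') + k) % 26 + ord('a')) for ch in m)

def shift_decrypt(c: str, k: int) -> str:
    return "".join(chr((ord(ch) - ord('a') - k) % 26 + ord('a')) for ch in c)

assert shift_decrypt(shift_encrypt("attackatdawn", 3), 3) == "attackatdawn"
```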

Limitations of substitution ciphers
Generally, susceptible to frequency (and other statistical) analysis
u letters in a natural language, like English, are not uniformly distributed
u cryptographic attacks against substitution ciphers are possible
u e.g., by exploiting knowledge of letter frequencies, including pairs and triples
59
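A minimal sketch of the frequency-analysis idea: tally single-letter frequencies in a ciphertext and compare the ranking against known English letter frequencies (a real attack would also use pair and triple statistics). The sample ciphertext is a Caesar-shifted English sentence included only for illustration.

```python
# Sketch: single-letter frequency analysis of a ciphertext.
# The most frequent ciphertext letters likely correspond to frequent English
# letters such as 'e' and 't' when a substitution cipher was used.
from collections import Counter

def letter_frequencies(ciphertext: str):
    letters = [ch for ch in ciphertext.lower() if ch.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    # Return relative frequencies, most frequent letter first.
    return {ch: counts[ch] / total for ch in sorted(counts, key=counts.get, reverse=True)}

print(letter_frequencies("wkh txlfn eurzq ira mxpsv ryhu wkh odcb grj"))
```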

Letter frequency in (sufficiently large) English text
60

Classical ciphers – examples
Caesar’s cipher
u shift each character in the message by 3 positions
u or by 13 positions in ROT-13
u cryptanalysis
u no secret key is used – based on “security by obscurity”
u thus the code is trivially insecure once one knows Enc (or Dec)
61

Classical ciphers – examples (II)
Shift cipher
u keyed extension of Caesar’s cipher
u randomly set key k in [0:25]
u shift each character in the message by k positions
u cryptanalysis
u brute-force attacks are effective given that
u key space is small (26 possibilities or, actually, 25 as 0 should be avoided)
u message space M is restricted to “valid words”
u e.g., corresponding to valid English text
62
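Making the brute-force observation concrete for the shift cipher: with only 26 possible keys, an attacker who can recognize valid English simply tries them all. The decryption routine matches the earlier sketch; the intercepted ciphertext and the crude "contains the word the" test are illustrative stand-ins, not part of the lecture.

```python
# Sketch: brute-forcing the shift cipher - only 26 keys to try.
# The plaintext test is a crude stand-in for real language scoring.

def shift_decrypt(c: str, k: int) -> str:
    return "".join(chr((ord(ch) - ord('a') - k) % 26 + ord('a')) for ch in c)

ciphertext = "wkhsdvvzruglv"            # hypothetical intercepted ciphertext
for k in range(26):
    guess = shift_decrypt(ciphertext, k)
    if "the" in guess:                   # crude "valid English" heuristic
        print(f"key = {k}, plaintext guess = {guess}")
```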