
CS306: Introduction to IT Security, Fall 2020
Lecture 4: Ciphers in Practice (I)
Instructor: Nikos Triandopoulos, September 22, 2020

2
4.0 Announcements

CS306: Other announcements
u Homework assignment HW1 is out
  u due in almost two weeks: Friday, October 2
  u covering perfect secrecy, classical ciphers, and the OTP
  u relevant materials
    u Lectures 2.2, 2.3, 3.1, 3.2, and the Lecture 4 demo
    u Lab 2
u please
  u start early
  u ask for help if needed
  u respect the non-collaboration policy
3

CS306: Tentative Syllabus

Week | Date   | Topics                          | Reading               | Assignment
-----|--------|---------------------------------|-----------------------|------------
1    | Sep 1  | Introduction                    | Lecture 1             |
2    | Sep 8  | Symmetric-key encryption        | Lecture 2             | Lab 1
3    | Sep 15 | Perfect secrecy                 | Lecture 3             | Lab 2, HW 1
4    | Sep 22 | Public-key crypto I             |                       |
5    | Sep 29 | Public-key crypto II            |                       |
6    | Oct 6  | Access control & authentication |                       |
     | Oct 13 | No class (Monday schedule)      |                       |
7    | Oct 20 | Midterm                         | All materials covered |
4

CS306: Tentative Syllabus (continued)

Week | Date              | Topics                  | Reading                | Assignment
-----|-------------------|-------------------------|------------------------|------------
8    | Oct 27            | Software & Web security |                        |
9    | Nov 3             | Network security        |                        |
10   | Nov 10            | Database security       |                        |
11   | Nov 17            | Cloud security          |                        |
12   | Nov 24            | Privacy                 |                        |
13   | Dec 1             | Economics               |                        |
14   | Dec 8             | Legal & ethical issues  |                        |
15   | Dec 10 (or later) | Final (closed "books")  | All materials covered* |

* with a focus on material covered after the midterm

Last week
u Symmetric-key cryptography
  u Perfect secrecy
  u The One-Time Pad cipher
u Demo
  u Why encryption matters
  u Using the Wireshark packet analyser
6

Today
u Ciphers in practice
  u The big picture
  u Computational security
  u Pseudo-randomness
    u stream ciphers, pseudorandom generators
u Demo
  u The Caesar and Vigenère ciphers and their cryptanalysis
  u Pseudo-randomness in practice
7

8
4.1 Introduction to modern cryptography

Cryptography / cryptology
u Etymology
  u two parts: "crypto" + "graphy" / "logy"
  u original (Greek) form: κρυπτός + γράφω / λόγος
  u English translation: secret + write / speech, logic
  u meaning: secret writing / the study of secrets
u Historically developed/studied for secrecy in communications
  u message encryption in the symmetric-key setting
  u main application area: use by military and governments
9

Classical vs. modern cryptography

Classical (antiquity – ~70s): "the art of writing and solving codes"
u approach
  u ad-hoc design
  u trial & error methods
  u empirically evaluated

Modern (~80s – today): "the study of mathematical techniques for securing information, systems, and distributed computations against adversarial attacks"
u approach
  u systematic design & analysis
  u formal notions of security (and of the adversary)
  u rigorous proofs of security (or insecurity)
10

Example: Classical vs. modern cryptography for encryption

Classical (antiquity – ~70s): "the art of writing and solving codes"
u ad-hoc study
  u vulnerabilities/insecurity of
    u Caesar's cipher
    u the shift cipher
    u the mono-alphabetic substitution cipher
    u the Vigenère cipher

Modern (~80s – today): "the study of mathematical techniques for securing information, systems, and distributed computations against adversarial attacks"
u rigorous study
  u problem statement: secret communication over an insecure channel
  u abstract solution concept: symmetric encryption, Kerckhoffs' principle, perfect secrecy
  u concrete solution & analysis: the OTP cipher, proof of security
11

Example: Differences of specific ciphers

Caesar's / shift / mono-alphabetic ciphers
u substitution ciphers
  u Caesar's cipher
    u the shift is always 3
  u shift cipher
    u the shift is unknown and the same for all characters
  u mono-alphabetic substitution / Vigenère cipher
    u the shift is unknown and the same for all/many character occurrences

The one-time pad
u also a substitution cipher
  u the shift is unknown and independent for each character occurrence
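As a concrete illustration of how weak these classical ciphers are, here is a minimal Python sketch (illustrative only, not the lecture demo; all names and the sample plaintext are made up) of the shift cipher and its trivial ciphertext-only cryptanalysis: with only 26 possible keys, an eavesdropper can simply try them all.

```python
# Minimal shift cipher (Caesar's cipher is the special case shift = 3) and its
# trivial ciphertext-only cryptanalysis: brute force over all 26 possible shifts.
import string

ALPHABET = string.ascii_uppercase  # 'A'..'Z'

def shift_encrypt(plaintext: str, shift: int) -> str:
    """Shift every letter by the same (secret) amount; other symbols pass through."""
    out = []
    for ch in plaintext.upper():
        out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26] if ch in ALPHABET else ch)
    return "".join(out)

def shift_decrypt(ciphertext: str, shift: int) -> str:
    return shift_encrypt(ciphertext, -shift)

def brute_force(ciphertext: str):
    """The key space has only 26 keys, so an eavesdropper just tries them all."""
    return [(s, shift_decrypt(ciphertext, s)) for s in range(26)]

if __name__ == "__main__":
    c = shift_encrypt("ATTACK AT DAWN", 3)   # Caesar: shift = 3
    print(c)                                 # DWWDFN DW GDZQ
    for s, guess in brute_force(c):
        print(s, guess)                      # the readable candidate reveals the key
```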

Formal treatment in modern cryptography

Problems are formulated as abstract crypto primitives
u captures the essence of the problem at hand, provides clarity and focus

Design & evaluation of crypto primitives follows a systematic process
u (A) formal definitions (what does it mean for a crypto primitive to be secure?)
u (B) precise assumptions (which forms of attack are allowed, and which are not?)
u (C) provable security (why is a candidate solution secure, or not?)
13

(A) Formal definitions

An abstract but rigorous description of the security problem
u computing setting (to be considered)
  u involved parties, communication model, core functionality
u underlying cryptographic scheme (to be designed)
  u e.g., a symmetric-key encryption scheme
u desired properties (to be achieved)
  u security-related
  u non-security-related
    u e.g., correctness, efficiency, etc.
14

(A) Why are formal definitions important?
u successful project management
  u good design requires clear/specific security goals
  u helps to avoid critical omissions or over-engineering
u provable security
  u rigorous evaluation requires a security definition
  u helps to separate secure from insecure solutions
u qualitative analysis / modular design
  u thorough comparison requires an exact reference
  u helps to secure complex computing systems

Example: Problem at hand

An abstract but rigorous description of the security problem (to be solved): here, secret communication over an insecure channel

[Diagram: Alice and Bob want to communicate secretly over an insecure channel]
16

Example: Formal definitions (1)
u computing setting (to be considered)
  u e.g., involved parties, communication model, core functionality
u here: Alice, Bob, Eve
  u Alice wants to send a message m to Bob; Eve can eavesdrop on sent messages
  u Alice/Bob may transform the transmitted/received message and may share information

[Diagram: Alice sends m to Bob while Eve observes the channel]
17

Example: Formal definitions (2)
u underlying cryptographic scheme (to be designed): a symmetric-key encryption scheme
  u Alice and Bob share and use a key k
  u Alice encrypts plaintext m to ciphertext c and sends c instead of m
  u Bob decrypts the received c to get a message m'

[Diagram: Alice encrypts m with key k to obtain c; c crosses the channel observed by Eve; Bob decrypts c with k]
18

Example: Formal definitions (3)
u desired properties (to be achieved)
  u security (informal): Eve "cannot learn" m (from c)
  u correctness (informal): if Alice encrypts m to c, then Bob decrypts c to (the original message) m

[Diagram: Alice encrypts m with key k to obtain c; Eve observes c; Bob decrypts c with k]
19

Example: Formal definitions (4)

Perfect correctness
u for any k ∈ K, m ∈ M, and any ciphertext c output by Enc_k(m), it holds that Pr[Dec_k(c) = m] = 1

Perfect security (or information-theoretic security)
u the adversary should be able to learn no additional information about m

[Diagram: random experiment: D_M → m, D_K → k, Enc_k(m) → c; Eve's a-priori view (e.g., m = "attack" w/ prob. 0.8, m = "no attack" w/ prob. 0.2) remains the same after seeing c]
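To make the syntax concrete, here is a minimal Python sketch of the (Gen, Enc, Dec) interface, instantiated with the one-time pad over byte strings (our own illustration; function names are not the course's notation). The final assertion illustrates the perfect-correctness condition Pr[Dec_k(Enc_k(m)) = m] = 1.

```python
# Sketch of the (Gen, Enc, Dec) syntax of a symmetric-key encryption scheme,
# instantiated with the one-time pad over n-byte messages (illustrative only).
import os

def gen(n: int) -> bytes:
    """Gen: sample a key uniformly at random from {0,1}^(8n)."""
    return os.urandom(n)

def enc(k: bytes, m: bytes) -> bytes:
    """Enc_k(m) = k XOR m (requires |k| = |m|, and k must never be reused)."""
    assert len(k) == len(m)
    return bytes(ki ^ mi for ki, mi in zip(k, m))

def dec(k: bytes, c: bytes) -> bytes:
    """Dec_k(c) = k XOR c; XOR-ing twice with the same pad recovers m."""
    return enc(k, c)

if __name__ == "__main__":
    m = b"attack at dawn"
    k = gen(len(m))
    c = enc(k, m)
    assert dec(k, c) == m   # perfect correctness: decryption always recovers m
```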

(B) Precise assumptions

A precise description of all relevant problem components
u adversary / attacker
  u type of attacks, a.k.a. threat model
  u capabilities (e.g., a priori knowledge, access to information, party corruptions)
  u limitations (e.g., bounded memory, passive vs. active)
u computational assumptions (about the hardness of certain tasks)
  u e.g., factoring of large composite numbers is hard
u computing setting
  u system set-up, initial state, key distribution, randomness, ...
  u means of communication (e.g., channels, rounds, ...)
  u timing assumptions (e.g., synchronicity, epochs, ...)
21

(B) Why are precise assumptions important?
u basis for proofs of security
  u security holds under specific assumptions
  u comparison among possible solutions
u relations among different assumptions
  u stronger/weaker (i.e., less/more plausible to hold), "A implies B", or "A and B are equivalent"
  u refutable vs. non-refutable
u flexibility (in design & analysis)
  u validation – to gain confidence or to refute
  u modularity – to choose among concrete schemes that satisfy the same assumptions
  u characterization – to identify the simplest/minimal/necessary assumptions

Example: Precise assumptions (1)
u adversary
  u type of attacks, a.k.a. threat model: eavesdropping
  u capabilities: Eve may know the a priori distribution of messages sent by Alice
  u limitations: Eve does not know/learn the secret key k (shared by Alice and Bob)

[Diagram: Eve eavesdrops on c as it travels from Alice (who encrypts m with k) to Bob (who decrypts with k)]
23

Example: Precise assumptions (2)
u computational assumptions (about the hardness of certain tasks)
  u e.g., factoring of large composite numbers is hard
u here: no computational assumptions, a.k.a. perfect secrecy (or information-theoretic security)

[Diagram: Alice encrypts m with key k to obtain c; Eve observes c; Bob decrypts c with k]
24

Example: Precise assumptions (3)
u computing setting
  u system set-up, initial state, key distribution, randomness, ...
  u means of communication (e.g., channels, rounds, messages, ...)
  u timing assumptions (e.g., synchronicity, epochs, ...)
u here:
  u the key k is generated randomly using the uniform distribution
  u k is securely distributed to, and securely stored at, Alice and Bob
  u only one message m is communicated
  u k and m are chosen independently (for simplicity, in our initial security definition)

[Diagram: Alice encrypts m with key k to obtain c; Eve observes c; Bob decrypts c with k]
25

Possible eavesdropping attacks (I)

An attacker may possess a
u collection of ciphertexts
  u ciphertext-only attack (or simply EAV)

[Diagram: Eve sees only ciphertexts produced by the encryption algorithm under the secret key]

Plain threat model
26

Possible eavesdropping attacks (II)

An attacker may possess a
u collection of plaintext/ciphertext pairs
  u known-plaintext attack

[Diagram: Eve sees plaintext/ciphertext pairs produced under the secret key]

Possible eavesdropping attacks (III)

An attacker may possess a
u collection of plaintext/ciphertext pairs, for plaintexts selected by the attacker
  u chosen-plaintext attack (CPA)

[Diagram: Eve submits plaintexts of her choice, e.g., "ABCDEFG HIJKLMNO PQRSTUV WXYZ.", to the encryption algorithm and sees the resulting ciphertexts]

Advanced threat model

Possible eavesdropping attacks (IV)

An attacker may possess a
u collection of plaintext/ciphertext pairs, for ciphertexts selected by the attacker
  u chosen-ciphertext attack (CCA)

[Diagram: Eve submits ciphertexts of her choice, e.g., "001101 110111", to the decryption algorithm and sees the resulting plaintexts, e.g., "IJCGA, CAN DO HIFFA GOT TIME."]

Main security properties against eavesdropping

"Plain" security
u protects against ciphertext-only attacks

"Advanced" security
u protects against chosen-plaintext attacks

[Diagrams: in the plain threat model Eve only observes ciphertexts; in the advanced model Eve also obtains encryptions of plaintexts of her choice]

(C) Provable security

Security
u subject to certain assumptions, a scheme is proved to be secure according to a specific definition, against a specific adversary
u in practice, the scheme may break if some assumptions do not hold or the attacker is more powerful than modeled

Insecurity
u a scheme is proved to be insecure with respect to a specific definition
u it suffices to find a counterexample attack
31

(C) Why is provable security important?

Typical-case performance
u in some areas of computer science, formal proofs may not be essential
  u the behavior of hard-to-analyze algorithms is simulated to experimentally study their performance on "typical" inputs
  u in practice, the typical/average case occurs

Worst-case performance
u in cryptography and secure protocol design, formal proofs are essential
  u "experimental" security analysis is not possible
  u the notion of a "typical" adversary makes little sense and is unrealistic
  u in practice, worst-case attacks will occur
    u an adversary will use any means in its power to break a scheme
32

33
4.2 Computational security

The big picture: OTP is perfect but impractical!

We formally defined and constructed the perfectly secret OTP cipher
u this scheme has some major drawbacks
  u it employs a very large key, which can be used only once!
u such limitations are unavoidable and make the OTP impractical
  u why?

Now, what?
34

Our approach: Relax "perfectness"

Initial model
u perfect secrecy (or security) requires that
  u the ciphertext leaks absolutely no extra information about the plaintext
  u to adversaries of unlimited computational power

Refined model
u a relaxed notion of security, called computational security, requires that
  u the ciphertext leaks at most a tiny amount of extra information about the plaintext
  u to adversaries with bounded computational power
35

Computational security
u to be contrasted with information-theoretic security
u the de facto way to model security in most settings
u an integral part of modern cryptography, with rigorous mathematical proofs
u entails two relaxations
  u security is guaranteed only against efficient adversaries
    u if an attacker invests sufficiently large resources, it may break security
    u goal: make the required resources larger than those available to any realistic attacker!
  u security is guaranteed only in a probabilistic manner
    u with some small probability, an attacker may break security
    u goal: make the attack probability sufficiently small that it can be practically ignored!

Towards a rigorous definition of computational security

Concrete approach
u "A scheme is (t, ε)-secure if any attacker A, running for time at most t, succeeds in breaking the scheme with probability at most ε"

Asymptotic approach
u "A scheme is secure if any efficient attacker A succeeds in breaking the scheme with at most negligible probability"
37

Examples
u almost optimal security guarantees
  u if the key length is n, the number of possible keys is 2^n
  u an attacker running for time t succeeds with probability at most ~ t/2^n (brute-force attack)
u if n = 60, security is enough against attackers running a desktop computer
  u at 4 GHz (4×10^9 cycles/sec), checking all 2^60 keys requires about 9 years
u if n = 80, a supercomputer would still need ~2 years
u today's recommended security parameter is at least n = 128
  u there is a large difference between 2^80 and 2^128; e.g., the number of seconds since the Big Bang is ~2^58
  u a once-in-100-years event corresponds to probability 2^-30 of happening in any particular second
  u if an attack succeeds with probability 1/2^60 within one year of computation, then it is more likely that Alice and Bob are hit by lightning
38
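These numbers are easy to reproduce. The short Python sketch below (our own back-of-the-envelope calculation, assuming the optimistic rate of one key tested per clock cycle on a 4 GHz desktop) estimates exhaustive-search times for several key lengths and the success probability of a one-year brute-force attack.

```python
# Back-of-the-envelope brute-force estimates, under the (optimistic) assumption
# that one key can be tested per clock cycle on a 4 GHz machine.
RATE = 4e9                     # keys tested per second
SECONDS_PER_YEAR = 3.154e7

for n in (60, 80, 128):
    keys = 2 ** n
    years = keys / RATE / SECONDS_PER_YEAR
    print(f"n = {n:3d}: ~{years:.2e} years to try all 2^{n} keys")
    # n = 60 gives roughly 9 years, matching the figure on the slide

# Success probability of a t-step attacker is at most about t / 2^n:
t = RATE * SECONDS_PER_YEAR    # one year of computation
print(f"one year of work against n = 128: success prob <= ~{t / 2**128:.1e}")
```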

39
4.3 Symmetric encryption, revisited: Security

Three equivalent "looks" of perfect secrecy

1) a posteriori = a priori
For every D_M, m ∈ M, and c ∈ C for which Pr[C = c] > 0, it holds that
Pr[M = m | C = c] = Pr[M = m]

2) C is independent of M
For every m, m' ∈ M and c ∈ C, it holds that
Pr[Enc_K(m) = c] = Pr[Enc_K(m') = c]

3) indistinguishability
For every A, it holds that
Pr[b' = b] = 1/2

[Diagram: challenger T samples D_K → k and a uniform bit b; A sends m_0, m_1; T returns c_b = Enc_k(m_b); A outputs a guess b'; the ciphertext looks completely random]
40
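Formulation (2) can be checked exhaustively for the one-time pad on a toy message space. The Python sketch below (our own illustration; the 3-bit message space is an arbitrary choice) verifies that for every message m and ciphertext c, Pr_K[Enc_K(m) = c] = 1/8, i.e., the ciphertext distribution is the same regardless of the message.

```python
# Exhaustive check of formulation (2) for the one-time pad over 3-bit strings:
# for every m and c, Pr_K[Enc_K(m) = c] = 1/8, so the ciphertext distribution
# does not depend on the message at all.
from itertools import product

BITS = 3
space = list(product([0, 1], repeat=BITS))           # all 3-bit strings

def enc(k, m):
    return tuple(ki ^ mi for ki, mi in zip(k, m))    # bitwise XOR

for m in space:
    for c in space:
        hits = sum(1 for k in space if enc(k, m) == c)
        assert hits / len(space) == 1 / len(space)   # = 2^-3 for every (m, c) pair

print("Pr[Enc_K(m) = c] = 1/8 for every m, c: OTP ciphertexts look uniform")
```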

Security relaxation

Perfect security: M and Enc_K(M) are independent, unconditionally
u no extra information is leaked to any attacker

Computational security: M and Enc_K(M) are independent for all practical purposes
u no extra information is leaked except a tiny amount
  u e.g., with probability 2^-128 (much less than the likelihood of being hit by lightning)
u to computationally bounded attackers
  u e.g., who cannot count to 2^128 (or invest more than a century of work)
u the attacker's best strategy remains ineffective
  u randomly guess a secret key, or exhaustively search the key space (brute-force attack)
41

Recall: Main security properties against eavesdropping

"Plain" security
u protects against ciphertext-only attacks

"Advanced" security
u protects against chosen-plaintext attacks

[Diagrams: in the plain threat model Eve only observes ciphertexts; in the advanced model Eve also obtains encryptions of plaintexts of her choice]

Computational EAV-security, or indistinguishability

Relax the definition of perfect secrecy that is based on indistinguishability
u require that the target messages m_0, m_1 are chosen by a PPT attacker
u require that no such attacker can distinguish Enc_k(m_0) from Enc_k(m_1)

3) indistinguishability (computational version)
For every PPT A, it holds that
Pr[b' = b] = 1/2 + negligible
u "negligible": something that can be safely ignored
u i.e., no PPT attacker can guess b non-negligibly better than random guessing

[Diagram: as before, but both A and the challenger T are PPT: T samples D_K → k and a uniform bit b; A sends m_0, m_1, receives c_b = Enc_k(m_b), and outputs b']
43
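The indistinguishability experiment can be phrased as a small test harness. The sketch below is our own illustration (the scheme, the adversary class, and all names are placeholders, not course code): against the one-time pad, a guessing adversary wins with probability about 1/2, exactly the baseline the definition allows.

```python
# Sketch of the eavesdropping indistinguishability experiment: the challenger
# picks a key and a bit b, encrypts m_b, and the adversary must guess b.
# Scheme and adversary below are illustrative placeholders.
import os, secrets

def gen(n): return os.urandom(n)
def enc(k, m): return bytes(a ^ b for a, b in zip(k, m))      # one-time pad

def indist_experiment(adversary, n=16, trials=10_000):
    wins = 0
    for _ in range(trials):
        m0, m1 = adversary.choose_messages(n)                 # equal-length messages
        k, b = gen(n), secrets.randbelow(2)
        c = enc(k, (m0, m1)[b])
        wins += int(adversary.guess(c) == b)
    return wins / trials                                      # should be ~1/2 + negligible

class GuessingAdversary:
    def choose_messages(self, n): return b"\x00" * n, b"\xff" * n
    def guess(self, c): return secrets.randbelow(2)           # no better strategy vs. OTP

print(indist_experiment(GuessingAdversary()))                 # ~0.5
```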

Computational CPA-security

Advanced security implies probabilistic encryption – why? (see the sketch below)

Strengthen the definition of computational plain-security
u allow the attacker to have access to an encryption "box" Enc(k, ·)
u allow the attacker to select m_0, m_1 after using this "box" (as many times as desired)

3) indistinguishability (CPA version)
For every PPT A, it holds that
Pr[b' = b] = 1/2 + negligible (i.e., plus something that can be safely ignored)

[Diagram: the PPT attacker A may repeatedly query the encryption box with messages m_i and receive c_i = Enc_k(m_i), then submit m_0, m_1, receive the challenge c_b = Enc_k(m_b), and output b']
44
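To see why CPA-security forces probabilistic encryption, consider this sketch (our own illustration; det_enc is a hypothetical stand-in for any deterministic scheme): a CPA adversary queries the encryption box on m_0 and compares the answer with the challenge ciphertext, winning with probability 1 instead of 1/2 + negligible.

```python
# Why CPA-security forces probabilistic encryption: against ANY deterministic
# scheme, a CPA adversary can query the encryption oracle on m0 and simply
# compare the result with the challenge ciphertext. Illustrative sketch.
import os, secrets

KEY = os.urandom(16)

def det_enc(m: bytes) -> bytes:
    """Stand-in for a deterministic cipher: the same (k, m) always gives the same c."""
    return bytes(a ^ b for a, b in zip(KEY, m))               # deterministic "pad" reuse

def cpa_experiment(trials=1_000):
    wins = 0
    for _ in range(trials):
        m0, m1 = b"\x00" * 16, b"\xff" * 16
        b = secrets.randbelow(2)
        challenge = det_enc((m0, m1)[b])
        # CPA power: the adversary may use the encryption box on chosen plaintexts
        guess = 0 if det_enc(m0) == challenge else 1
        wins += int(guess == b)
    return wins / trials

print(cpa_experiment())   # 1.0 -- the adversary always wins, far from 1/2 + negligible
```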

45
4.4 Symmetric encryption, revisited: OTP with pseudorandomness

Perfect secrecy & randomness

The role of randomness in encryption is integral
u in a perfectly secret cipher, the ciphertext doesn't depend on the message
  u the ciphertext appears to be truly random
  u the uniform key-selection distribution is imposed also onto the produced ciphertexts
  u e.g., c = k XOR m (for uniform k and any distribution over m)

When security is computational, randomness is relaxed to "pseudorandomness"
u the ciphertext appears to be "pseudorandom"
  u it cannot be efficiently distinguished from truly random
46

Symmetric encryption as "OTP with pseudorandomness"

Stream cipher
u uses a short key to encrypt long symbol streams into a pseudorandom ciphertext
u based on the abstract crypto primitive of a pseudorandom generator (PRG)

Block cipher
u uses a short key to encrypt blocks of symbols into pseudorandom ciphertext blocks
u based on the abstract crypto primitive of a pseudorandom function (PRF)

[Diagrams: a stateful, keyed stream cipher encrypts the plaintext stream "… RESTUOKD" into the ciphertext stream "… rrywytovty"; a keyed block cipher encrypts one plaintext block at a time, e.g., block "OKD" into "tty", then moves to the next block "STU"]
47

48
4.4.1 Pseudorandom generators

Stream ciphers

[Diagram: a keyed, stateful stream cipher encrypts the plaintext stream "… RESTUOKD" into the ciphertext stream "… rrywytovty"]
49

Pseudorandom generators (PRGs)

A deterministic algorithm G that, on input a seed s ∈ {0,1}^n, outputs G(s) ∈ {0,1}^l(n)

G is a PRG if:
u expansion
  u for the polynomial l, it holds that l(n) > n for every n
  u models the process of extracting (more) randomness from a short random string
u pseudorandomness
  u no efficient statistical test can tell apart G(s), for a uniform seed s, from a truly random string of length l(n)

[Diagram: G stretches an n-bit seed s into an l(n)-bit output G(s); a PRG used this way is a.k.a. a stream cipher]

Generic PRG-based symmetric encryption
u fixed-length message encryption: Enc_k(m) = G(k) XOR m and Dec_k(c) = G(k) XOR c
u the encryption scheme is plain-secure as long as the underlying PRG is secure
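A sketch of the generic construction follows, under the assumption that messages are byte strings: the "PRG" below is a toy stand-in built from SHA-256 in counter mode, used only to illustrate Enc_k(m) = G(k) XOR m; it is not a proven pseudorandom generator and not the construction analyzed in class.

```python
# Sketch of the generic PRG-based scheme: Enc_k(m) = G(k) XOR m, Dec_k(c) = G(k) XOR c.
# The "PRG" below is a toy stand-in (SHA-256 in counter mode), illustrative only.
import hashlib, os

def prg(seed: bytes, out_len: int) -> bytes:
    """Expand a short seed into out_len pseudorandom-looking bytes."""
    out, counter = b"", 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:out_len]

def enc(k: bytes, m: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(prg(k, len(m)), m))

def dec(k: bytes, c: bytes) -> bytes:
    return enc(k, c)                     # XOR with the same keystream recovers m

k = os.urandom(16)                       # short key, unlike OTP's message-length key
m = b"a much longer message than the 16-byte key"
assert dec(k, enc(k, m)) == m
```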

Generic PRG-based symmetric encryption (cont.)
u bounded- or arbitrary-length message encryption
  u specified by a mode of operation for using an underlying stateful stream cipher, repeatedly, to encrypt/decrypt a stream of symbols
52

Stream ciphers: Modes of operation
u bounded- or arbitrary-length message encryption
  u synchronized mode: on-the-fly computation of new pseudorandom bits, no IV needed; plain-secure
  u IV-based mode: a fresh random IV is used for every new message and is sent along with the ciphertext; advanced-secure (see the sketch below)
53
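A sketch of the IV-based mode, reusing the toy keystream idea from the previous sketch (again illustrative only, not a vetted cipher): a fresh random IV is drawn per message, mixed into the keystream derivation, and transmitted with the ciphertext, so one short key can safely encrypt many messages.

```python
# Sketch of the IV-based stream-cipher mode: a fresh random IV is drawn for every
# message, mixed into the keystream derivation, and sent along with the ciphertext.
# The keystream derivation is a toy stand-in (SHA-256 counter mode), illustrative only.
import hashlib, os

def keystream(key: bytes, iv: bytes, out_len: int) -> bytes:
    out, counter = b"", 0
    while len(out) < out_len:
        out += hashlib.sha256(key + iv + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:out_len]

def enc(key: bytes, m: bytes):
    iv = os.urandom(16)                                   # fresh randomness per message
    c = bytes(a ^ b for a, b in zip(keystream(key, iv, len(m)), m))
    return iv, c                                          # IV travels with the ciphertext

def dec(key: bytes, iv: bytes, c: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(keystream(key, iv, len(c)), c))

key = os.urandom(16)
iv, c = enc(key, b"same key, many messages")
assert dec(key, iv, c) == b"same key, many messages"
```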