Stack of Stack Frames
Review
Thinking about Algorithms Abstractly
Jeff Edmonds
York University
COSC 3101
1
Some Math
Recurrence Relations
T(n) = a T(n/b) + f(n)
Input Size
Time
Classifying Functions
f(i) = n^Θ(1)
Adding Made Easy
∑_{i=1}^{n} f(i).
Time Complexity
t(n) = Θ(n^2)
Logic Quantifiers
∃g, ∀b, Loves(b,g)
∀b, ∃g, Loves(b,g)
Logs and Exps
2^a × 2^b = 2^(a+b)
2^(log n) = n
2
Understand Quantifiers!!!
Ann
Fred
Marilyn Monroe
John
Beth
Bob
Mary
Sam
We read this left to right.
Proof:
∃g, ∀b, Loves(b, g)
I produce the object when it is a ∃.
My adversary produces the object when it is a ∀.
I can always win if and only if the statement is true.
3
Understand Quantifiers!!!
Proof: ∀x, ∃y, x+y = 5
Let x be an arbitrary real number.
Let y = 5−x.
x+y = x + (5−x) = 5. The relation is true.
I can always win. Hence, the statement is true.
4
Problem P is computable in polynomial time.
∃A, ∃c, ∃n0, ∀I, A(I) = P(I) and (|I| < n0 or Time(A,I) ≤ |I|^c)
The Time Complexity
5
Two Example Algorithms
Sum N array entries: A(1) + A(2) + A(3) + …
Factor value N: Input N = 5917 & Output N = 97×61.
Algorithm: try N/2, N/3, N/4, … ?
Time as a function of input size: time for you to solve the problem instance as a function of the time for me to give you the problem instance.
Summing: input size n = N, Time = N = n.
Factoring: input size n = log N (the digits needed to write N down), Time = N = 2^n.
6
Adding Made Easy
7
Evaluating: T(n) = aT(n/b)+f(n)
8
Meta Algorithms
I try to provide a unified way to think
about each algorithm type.
Follow my fixed set of steps.
Even if you don’t get the details of the
algorithm correct, at least get the right structure.
Try to match the parts of the new problem
with the parts of the meta algorithm.
e.g. instances, solutions, bird question, …
9
Iterative Algorithms
& Loop Invariants
Jeff Edmonds
York University
COSC 3101
Lecture 1
Understanding Algorithms Abstractly
Home to School Problem
Steps in an Iterative Algorithm
Assertions
The Sum of Objects
Insertion Sort
Selection Sort
Designing an Algorithm
Finding Errors
Typical Loop Invariants
Binary Search Like Examples
The Partition Problem
Greatest Common Divisor
Bucket (Quick) Sort for Humans
Radix/Counting Sort
Lower Bound for Sort
10
Iterative Algorithms
Define Problem. Define Loop Invariants. Define Measure of Progress.
Define Step. Define Exit Condition. Maintain Loop Invariant.
Make Progress. Initial Conditions. Ending.
[Figure: the walk to school: each iteration moves closer to the Exit, 79 km, 75 km, …, 0 km to school.]
11
Partial Correctness
Proves that IF the program terminates then it works
⟨pre-cond⟩ & codeA ⇒ ⟨loop-invariant⟩   (Establishing the Loop Invariant)
⟨loop-invariant⟩ & codeB ⇒ ⟨loop-invariant⟩   (Maintaining the Loop Invariant)
⟨loop-invariant⟩ & ⟨exit-cond⟩ & codeC ⇒ ⟨post-cond⟩   (Clean up loose ends)
12
Precondition: The input is an integer I.
Postcondition: The output is S = 1^2 + 2^2 + 3^2 + … + I^2
Loop Invariant: s = 1^2 + 2^2 + 3^2 + … + i^2 (1 ≤ i ≤ I)
Picture, not an action
13
Let it and st be the values of i and s at the beginning of the iteration, and it+1 and st+1 their values after going around again: st+1 = st + (it+1)^2
The Sum Problem
Maintain the loop invariant:
Postcondition: The output is S = 1^2 + 2^2 + 3^2 + … + I^2
Loop Invariant: s = 1^2 + 2^2 + 3^2 + … + i^2 (1 ≤ i ≤ I)
14
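To make the pieces concrete, here is a minimal Python sketch of this loop (the names i, s and I follow the slides; the assert states the loop invariant):

def sum_of_squares(I):
    # Precondition: the input I is a non-negative integer.
    i, s = 0, 0
    while i < I:                                          # exit condition: i = I
        assert s == sum(k * k for k in range(1, i + 1))   # loop invariant: s = 1^2 + ... + i^2
        i += 1                                            # make progress: it+1 = it + 1
        s += i * i                                        # maintain the LI: st+1 = st + (it+1)^2
    # Postcondition: s = 1^2 + 2^2 + ... + I^2
    return s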
Suppose a Martian jumped
into the top of the loop.
All you would know is that the loop invariant is true.
It alone must provide enough information
so that, after going around the loop, you can establish that the loop invariant is again true.
15
Fairy God Mother
I have arrived from Mars; some amount of progress has been made already, and my Fairy God Mother has set the state of the computation to be just what I want, so that I feel as comfortable as I could be given the amount of progress that has been made.
Typical Types of Loop Invariants
More of the input
More of the output
Narrowed the search space
GCD Like
16
Loop Invariants
It is my job to establish it, maintain it, and get the post condition from this loop invariant.
17
More of the Input
Loop Invariant
The input consists of an array of objects
We will pretend that this prefix
is the entire input.
We have a solution for this prefix
plus some additional information
18
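Insertion sort (from the topic list above) is a standard example of this invariant; a hypothetical Python sketch, not the slides' code:

def insertion_sort(A):
    for i in range(1, len(A)):
        # Loop invariant ("more of the input"): the prefix A[0..i-1] is a sorted
        # solution for the first i input objects.
        x = A[i]                           # read in the next input object
        j = i
        while j > 0 and A[j - 1] > x:      # shift larger entries one slot right
            A[j] = A[j - 1]
            j -= 1
        A[j] = x                           # now A[0..i] is sorted
    return A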
More of the Output
Loop Invariant: I have produced the first i objects.
The output consists of an array of objects.
Produce the (i+1)st output object.
Done when
output n objects.
19
Loop Invariant
Maintain a sublist.
If the key is contained in the original list, then the key is contained in the sublist.
20
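A sketch of binary search maintaining this invariant (assuming the list A is sorted, as in the slides' example):

def binary_search(A, key):
    lo, hi = 0, len(A)                     # the sublist is A[lo..hi-1]
    while hi - lo > 1:                     # measure of progress: the sublist shrinks
        # Loop invariant: if key is in the original list, then key is in A[lo..hi-1].
        mid = (lo + hi) // 2
        if key < A[mid]:
            hi = mid
        else:
            lo = mid
    return lo if lo < len(A) and A[lo] == key else None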
For all major algorithms covered: learn the pre and post conditions, learn the loop invariant,
and learn how to make progress while maintaining the LI.
Iterative Algorithms
21
Recursion
Jeff Edmonds
York University
Lecture 3
Thinking about Algorithms Abstractly
Multiplying
Recurrence Relations
Code
Stack of Stack Frames
Tree of Stack Frames
Friends and Strong Induction
Towers of Hanoi
Check List
Merge & Quick Sort
Simple Recursion on Trees
Generalizing the Problem
Things not to do
Heap Sort & Priority Queues
Trees Representing Equations
Pretty Print
Parsing
Recursive Images
Ackermann's Function
22
Multiplication of two n-bit numbers:
X = a·2^(n/2) + b
Y = c·2^(n/2) + d
XY = ac·2^n + (ad+bc)·2^(n/2) + bd
23
X = 2133, Y = 2312:  ac = 483,  bd = 396,  (a+b)(c+d) = 1890,  XY = 4931496
  X = 21, Y = 23:  ac = 4,  bd = 3,  (a+b)(c+d) = 15,  XY = 483
    X = 2, Y = 2: XY = 4    X = 1, Y = 3: XY = 3    X = 3, Y = 5: XY = 15
  X = 33, Y = 12:  ac = 3,  bd = 6,  (a+b)(c+d) = 18,  XY = 396
    X = 3, Y = 1: XY = 3    X = 3, Y = 2: XY = 6    X = 6, Y = 3: XY = 18
  X = 54, Y = 35:  ac = 15,  bd = 20,  (a+b)(c+d) = 72,  XY = 1890
    X = 5, Y = 3: XY = 15    X = 4, Y = 5: XY = 20    X = 9, Y = 8: XY = 72
Worry about one step at a time.
Friends & Strong Induction
Imagine that you are one specific friend.
If |X| = |Y| = 1 then RETURN XY
Break X into a,b and Y into c,d
e = MULT(a,c) and f =MULT(b,d)
RETURN e·10^n + (MULT(a+b, c+d) − e − f)·10^(n/2) + f
X = 3, Y = 1: XY = 3
24
X = 33, Y = 12
ac = 3
bd = 6
(a+b)(c+d) = 18
XY = 396
Friends & Strong Induction
Consider your input instance
Allocate work
Construct one or more sub-instances
Assume by magic your friends give you the answer for these.
Use this help to solve your own instance.
Do not worry about anything else.
Who your boss is…
How your friends solve their instance…
Friends return: X = 3, Y = 1: XY = 3;  X = 3, Y = 2: XY = 6;  X = 6, Y = 3: XY = 18
If |X| = |Y| = 1 then RETURN XY
Break X into a,b and Y into c,d
e = MULT(a,c) and f =MULT(b,d)
RETURN e·10^n + (MULT(a+b, c+d) − e − f)·10^(n/2) + f
25
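A runnable Python version of this pseudocode; it is a sketch that splits decimal strings and pads to an even number of digits, not the slides' exact code:

def mult(X, Y):
    x, y = str(X), str(Y)
    n = max(len(x), len(y))
    if n == 1:
        return X * Y                           # if |X| = |Y| = 1 then RETURN XY
    n += n % 2                                 # pad to an even number of digits
    x, y = x.zfill(n), y.zfill(n)
    half = n // 2
    a, b = int(x[:half]), int(x[half:])        # break X into a,b
    c, d = int(y[:half]), int(y[half:])        # break Y into c,d
    e, f = mult(a, c), mult(b, d)              # two friends
    g = mult(a + b, c + d)                     # the third friend
    return e * 10**n + (g - e - f) * 10**half + f

For example, mult(2133, 2312) returns 4931496, matching the stack-of-frames trace above.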
Induction Hypothesis: ``For every i < n, the recursive algorithm works for every instance of size i.''
Hence ``the recursive algorithm works for every instance of size n,''
and so ``the algorithm works for every instance of any size.''
26
Give and solve the recurrence relation T(n) = a·T(n/b) + f(n) for the running time of this algorithm.
27
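For the MULT algorithm above, for example, each of a = 3 friends gets an instance of size n/2 and the splitting and adding take f(n) = Θ(n) time, so T(n) = 3·T(n/2) + Θ(n). Since log a / log b = log 3 / log 2 ≈ 1.58 > 1, the recursion is dominated by the base cases and T(n) = Θ(n^(log 3 / log 2)) ≈ Θ(n^1.58).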
Check List for Recursive Programs
This is the format of “all” recursive programs.
28
Recursion on Trees
Being lazy, I will only consider my root node and my communication with my friends.
I will never look at my children's subtrees, but will trust them to my friends.
29
I know only what you tell me via the pre-condition.
I tell you only what the post-condition instructs me to.
30
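A hypothetical Python sketch of this discipline for one small tree problem (summing the values in a binary tree); the Node class and the problem are illustrative, not taken from the slides:

class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def tree_sum(root):
    # Pre-condition: root is a Node or None.  Post-condition: the sum of all values.
    if root is None:
        return 0                              # base case: the empty tree
    left_sum = tree_sum(root.left)            # trust one friend with the left subtree
    right_sum = tree_sum(root.right)          # trust another friend with the right subtree
    return root.value + left_sum + right_sum  # I only look at my root node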
Searching in a Graph
Jeff Edmonds
York University
Lecture 5
Thinking about Algorithms Abstractly
Breadth First Search
Dijkstra's Shortest Paths Algorithm
Depth First Search
Linear Order
31
We know found nodes are reachable from s because we have traced out a path.
If a node has been handled,
then all of its neighbors
have been found.
Graph Search
32
Which foundNotHandled node do we handle?
Queue (handle nodes in the order they were found): Breadth-First Search.
Stack (handle the most recently found node): Depth-First Search.
Priority Queue (handle the node that seems to be closest to s): Shortest (Weighted) Paths.
33
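A sketch of the generic search in Python; swapping the container changes the algorithm (a deque used as a queue gives BFS, the same deque used as a stack gives DFS). The adjacency-dict representation is an assumption of the sketch:

from collections import deque

def generic_search(adj, s, use_queue=True):
    found = {s}                    # found nodes: a path from s has been traced to each
    to_handle = deque([s])         # found but not yet handled
    handled_order = []
    while to_handle:
        u = to_handle.popleft() if use_queue else to_handle.pop()   # queue = BFS, stack = DFS
        handled_order.append(u)    # handle u: all of its neighbours become found
        for v in adj[u]:
            if v not in found:
                found.add(v)
                to_handle.append(v)
    return handled_order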
Handled Nodes
Handled paths go through any number of handled nodes, followed by a last edge to an unhandled node.
For a handled node w, d(w) is the length of the shortest path to w.
For an unhandled node v, d(v) is the length of the shortest handled path to v.
Handle the node u with the smallest d(u).
34
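A minimal Python sketch of Dijkstra's algorithm with a heapq priority queue (edge weights are assumed non-negative; stale queue entries are simply skipped):

import heapq

def dijkstra(adj, s):
    # adj maps each node u to a list of (v, weight) pairs.
    d = {s: 0}                        # d(v): length of the shortest handled path to v
    handled = set()
    pq = [(0, s)]
    while pq:
        du, u = heapq.heappop(pq)     # handle the node with the smallest d(u)
        if u in handled:
            continue                  # a stale entry
        handled.add(u)                # for handled u, d(u) is the true shortest distance
        for v, w in adj[u]:
            if v not in handled and du + w < d.get(v, float('inf')):
                d[v] = du + w         # found a shorter handled path to v
                heapq.heappush(pq, (d[v], v))
    return d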
[Dijkstra example: handling node c updates the tentative distances d(v): 13, 16, 17, 19, 20, ∞.]
35
[DFS example: the stack holds the found-but-not-handled nodes, e.g. ⟨c,2⟩; the remaining nodes are not yet handled.]
36
Linear Order
Alg: DFS. When a node is taken off the stack, add it to the backwards order.
Backwards order so far: i, j, k, d, e, g, l, f
37
Network Flow & Linear Programming
Jeff Edmonds York University
Lecture 5
Thinking about Algorithms Abstractly
Optimization Problems
Network Flow Defn
Simple Example
Application: Matching
Flow Strategy
Hill Climbing
Augmenting Flow
Primal-Dual Hill Climbing
Min Cut
Max Flow = Min Cut
Running Time
Linear Programming
38
Ingredients:
Instances: The possible inputs to the problem.
Solutions for Instance: Each instance has an exponentially large set of solutions.
Cost of Solution: Each solution has an easy to compute cost or value.
Specification
Preconditions: The input is one instance.
Postconditions: A valid solution with optimal cost (minimum or maximum).
Optimization Problems
39
Instance: A Network is a directed graph G
Edges represent pipes that carry flow
Each edge has a maximum capacity c
A source node s out of which flow leaves
A sink node t into which flow arrives
Network Flow
40
Solution: The amount of flow F⟨u,v⟩ through each edge.
The flow can't exceed the capacity: F⟨u,v⟩ ≤ c⟨u,v⟩.
No leaks, no extra flow. For each node v: flow in = flow out,
Σ_u F⟨u,v⟩ = Σ_w F⟨v,w⟩
41
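Both conditions are easy to check. A small Python sketch, assuming capacities c and flow F are dicts keyed by (u, v) edges (an illustrative representation):

def is_valid_flow(c, F, s, t):
    # Capacity: 0 <= F<u,v> <= c<u,v> on every edge, and no flow on non-edges.
    if any(e not in c for e in F):
        return False
    if any(not (0 <= F.get(e, 0) <= cap) for e, cap in c.items()):
        return False
    # Conservation: for every node v other than s and t, flow in = flow out.
    nodes = {u for u, _ in c} | {v for _, v in c}
    for v in nodes - {s, t}:
        flow_in = sum(f for (a, b), f in F.items() if b == v)
        flow_out = sum(f for (a, b), f in F.items() if a == v)
        if flow_in != flow_out:
            return False
    return True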
Value of Solution: flow from s into the network minus flow from the network back into s,
rate(F) = Σ_u F⟨s,u⟩ − Σ_v F⟨v,s⟩
Goal: Max Flow
Network Flow
42
Value of Solution: a cut C = ⟨U, V⟩ with s ∈ U and t ∈ V.
cap(C) = how much can flow from U to V = Σ_{u∈U, v∈V} c⟨u,v⟩
Goal:
Min Cut
43
Flow Graph: each edge is labelled F/c (its current flow F out of capacity c).
Augmentation Graph: each edge appears forwards with its unused capacity c−F and backwards with its current flow F.
Walking an augmenting path from s to t shows how much extra flow can be pushed.
44
[A second example of building the Augmentation Graph from the Flow Graph and walking an augmenting path in it.]
45
Max Flow = Min Cut
Theorem: For all networks, Max_F rate(F) = Min_C cap(C).
Prove: ∀F, C, rate(F) ≤ cap(C).
Prove: ∀ flow F, the alg either finds a better flow F or finds a cut C such that rate(F) = cap(C).
The alg stops with an F and C for which rate(F) = cap(C).
F witnesses that the optimal flow can't be less; C witnesses that it can't be more.
46
Cost: 29, 8, 1, 2
Amount to add: x1, x2, x3, x4
pork, grain, water, sawdust
Constraints: moisture, protein, …
3x1 + 4x2 − 7x3 + 8x4 ≥ 12
2x1 − 8x2 + 4x3 − 3x4 ≥ 24
−8x1 + 2x2 − 3x3 − 9x4 ≥ 8
x1 + 2x2 + 9x3 − 3x4 ≥ 31
Cost of Hotdog (to be minimized): 29x1 + 8x2 + 1x3 + 2x4
47
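One way to solve this hot-dog LP, as a sketch with scipy's linprog (the ≥ rows are negated into ≤ form; the amounts x1..x4 default to being non-negative; the numbers are the illustrative ones from the slide):

from scipy.optimize import linprog

cost = [29, 8, 1, 2]                    # cost per unit of pork, grain, water, sawdust
A = [[ 3,  4, -7,  8],                  # left-hand sides of the >= constraints
     [ 2, -8,  4, -3],
     [-8,  2, -3, -9],
     [ 1,  2,  9, -3]]
b = [12, 24, 8, 31]                     # right-hand sides
# linprog minimizes cost @ x subject to A_ub @ x <= b_ub, so negate the >= rows.
res = linprog(c=cost, A_ub=[[-a for a in row] for row in A], b_ub=[-v for v in b])
if res.success:
    print("amounts:", res.x, "minimum cost:", res.fun)
else:
    print("no feasible hot dog:", res.message)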
Greedy Algorithms
Jeff Edmonds York University
COSC 3101
Lecture 6
Thinking about Algorithms Abstractly
Greedy Algorithm for Optimization Problems
Proving with Loop Invariants
Three Players making Change
Review and Don'ts
Event Scheduling
Minimum Spanning Tree
Adaptive Greedy
Interval Point Cover
Forced Horn Clauses
48
Optimization Problems with a Greedy Algorithm
Instances: A set of objects and a relationship between them.
Solutions for Instance: A subset of the objects, or some other choice about each object.
Cost of Solution: The number of objects in the solution, or the sum of the costs of the objects.
49
Fixed vs Adaptive Priority
Fixed Priority:
Sort the objects from best to worst
and loop through them.
Adaptive Priority: The greedy criteria depend on which objects have been committed to so far.
At each step, the next "best" object is chosen according to the current greedy criteria.
Searching or re-sorting takes too much time.
Use a priority queue.
50
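For a fixed-priority example from the topic list, event scheduling: sort the events by finishing time and commit to each one that does not conflict. A hypothetical Python sketch:

def schedule_events(events):
    # events: list of (start, finish) pairs.  Fixed priority: earliest finish first.
    chosen, last_finish = [], float('-inf')
    for start, finish in sorted(events, key=lambda e: e[1]):
        if start >= last_finish:          # commit to this event: it does not conflict
            chosen.append((start, finish))
            last_finish = finish
        # otherwise the decision is to reject it
    return chosen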
51
Designing the Loop Invariant
Let At denote the decisions made by the algorithm
during the first t time steps.
St:
A solution must make a decision for every object.
If At makes a decision about an object,
then St makes the same decision.
52
Let At denote the decisions made by the algorithm
during the first t time steps.
St:
A solution must make a decision for every object.
Even though the Algorithm has committed to At,
it may still produce St.
53
Let At denote the decisions made by the algorithm
during the first t time steps.
St:
A solution must make a decision for every object.
There is at least one optimal solution St that extends the choices At made so far.
Loop Invariant
54
In the t+1st time step, the algorithm decides
about the next “best” object.
St:
A solution must make a decision for every object.
55
In the t+1st time step, the algorithm decides
about the next “best” object.
St:
A solution must make a decision for every object.
Algorithm’s tth decision
Other decisions
may change
56
In the t+1st time step, the algorithm decides
about the next “best” object.
There is at least one optimal solution St+1 that extends the choices At+1 made so far.
Loop Invariant
57
58
In order to make progress, my little brother made a decision about Objt: we may head in one direction.
I must first tell the Fairy Godmother to make this same decision about Objt.
S’t = St-1 + At’s decision made about Objt.
Changing St-1 into St
59
In order to maintain the loop invariant, we must adjust to come back to the path.
"Dude! My solution St-1 had been a valid optimal solution, and you messed it up."
"Ok, I will try and fix it."
60
In order to maintain the loop invariant,
we must adjust to come back to the path.
I will find some object Obj' in St-1 (but not in At) that "conflicts" with this changed decision about Objt, and describe
a way of changing the decision made by the fairy godmother about Obj' so that the solution St is made to be valid again.
61
In order to maintain the loop invariant,
we must adjust to come back to the path.
St = St-1 + At's decision made about Objt + the prover's new decision about Obj'.
62
Proving St extends At: St-1 extended the previous choices At-1, and we made it extend the new choices At.
Proving St is valid: St-1 was valid, and we introduced no new conflicts.
Proving St is optimal: St-1 was optimal, and St is not worse.
63
Proof Greedy Algorithm Works
Looks hard, but they all have the same structure.
Take my proof structure.
Change what the instances, solutions, & costs are.
Change what the greedy criteria is.
Change the change step.
Change the proof that the changed solution is
valid
consistent
optimal
The rest is the same.
64
A Sequence of Decisions
The Little Bird & Friend
Optimal Substructure
Memoization
Set of Sub-Instances
Tracing Dyn. Prog. Alg
Reversing
Code
Speeding Up Running Time
Multiple Opt Solutions
Review
Question for Little Bird
Review & Don'ts
Best Path
Printing Neatly
Longest Common Subsequence
Knapsack Problem
The Event Scheduling Problem
Parsing
Satisfiability
Techniques
Problems
Recursive Back Tracking &
Dynamic Programming
65
Designing Recursive Back Tracking Algorithm (Review)
What are instances, solutions, and costs?
Given an instance I,
What question do you ask the little bird?
Given a bird answer k ∈ [K],
What instance subI do your give your friend?
Assume he gives you optSubSol for subI.
How do you produce an optSol for I from
the bird’s k and
the friend’s optSubSol?
How do you determine the cost of optSol from
the bird’s k and
the cost of the friend’s optSubSol?
Try all bird’s answers and take best of best.
66
Bird/Friend: Best of the Best
A sequence of questions to a little bird about a solution forms a tree of possible answers.
Consider our instance I.
Consider the set of solutions
67
But we only care about the first bird answer.
Consider our instance I. Consider the set of solutions.
The answers classify the possible solutions: those consistent with the kth bird's answer.
68
Consider our instance I, and the set of solutions consistent with the kth bird's answer.
Define optS^(k) to be the optimum solution for instance I consistent with the kth bird's answer.
Do this for each k.
69
Consider our instance I, and the set of solutions.
Define optS^(k) to be the optimum solution for instance I consistent with the kth bird's answer. Do this for each k.
Let kmax be the bird's answer k giving the best optS^(k).
optS[I] = optS^(kmax) = Best_k optS^(k)
70
Constructing optS^(k), the optimum solution for instance I consistent with the kth bird's answer:
Given my instance I, I ask my little bird for an answer k.
I ask my friend for his solution.
I combine them.
71
Given an instance I,
Imagine running the recursive alg on it.
Determine the complete set of subI
ever given to you, your friends, their friends, …
Build a table indexed by these subI.
Fill in the table in order so that nobody waits:
for each subI, store the cost of its optimal solution and the advice given by the bird.
Run the recursive alg with bird’s advice to find
the solution to your instance.
Review
72
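A sketch of these steps for one of the listed problems, Longest Common Subsequence: the sub-instances are the pairs of prefixes, the table is filled so that nobody waits, and the stored bird's advice is replayed to construct the solution:

def lcs(x, y):
    n, m = len(x), len(y)
    cost = [[0] * (m + 1) for _ in range(n + 1)]      # cost[i][j]: optimum for (x[:i], y[:j])
    advice = [[None] * (m + 1) for _ in range(n + 1)] # the bird's answer for each sub-instance
    for i in range(1, n + 1):                         # an order in which nobody waits
        for j in range(1, m + 1):
            if x[i - 1] == y[j - 1]:
                cost[i][j], advice[i][j] = cost[i - 1][j - 1] + 1, 'match'
            elif cost[i - 1][j] >= cost[i][j - 1]:
                cost[i][j], advice[i][j] = cost[i - 1][j], 'drop x'
            else:
                cost[i][j], advice[i][j] = cost[i][j - 1], 'drop y'
    out, i, j = [], n, m                              # re-run following the bird's advice
    while i > 0 and j > 0:
        if advice[i][j] == 'match':
            out.append(x[i - 1]); i -= 1; j -= 1
        elif advice[i][j] == 'drop x':
            i -= 1
        else:
            j -= 1
    return ''.join(reversed(out))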
Dynamic programs do not recurse making
the instance smaller
and smaller.
Instead, it up front
determines the set S
of all sub-instances
that ever need
to be solved.
And solves them
in an order so that
nobody waits.
73
NP-Completeness
Jeff Edmonds
York University
Non-Deterministic Polynomial Time
NP vs Other Classes
NP-Complete
Reductions
Intro & COSC 3101/4111/6111
Definitions Needed
For 3101
not SAT
GCD
SAT
Halting
Not Halting
Classes of Problems
GCD
Poly:
Problems P computable in polynomial time.
∃A, ∀I, A(I) = P(I) and Time(A,I) ≤ |I|^c
Polynomial Time
Each instance I has an exponential number of possible solutions.
Each solution S is either valid or not
(sufficiently high value or not)
The output is
Yes, it has a valid solution.
No, it does not
the solution need not be returned
There is a poly-time alg Valid(I,S) to test
given an instance I and a solution S,
whether or not S is a valid solution of I.
Eg: Network Flows
Given I = ⟨G, k⟩:
does network G have flow of value k?
Non-Deterministic Polynomial Time
Network Flows
Non-Deterministic Polynomial Time
Example: Satisfiability:
Instance: A circuit I.
Solution: A satisfying assignment S.
I is a yes instance iff there is such an assignment.
Given an instance I and a solution S, there is a poly-time alg Valid(I,S) to test whether or not S is a valid solution of I.
Circuit I:
Assignment S: 0 1 1 0 1 0 1
Sat(I) = Yes iff ∃S Valid(I,S)
Formal definition:
Prob ∈ NP iff there is a poly-time Valid
such that Prob(I) = Yes iff ∃S Valid(I,S)
Non-Deterministic Polynomial Time
SAT
Sat(I) = Yes iff ∃S Valid(I,S)
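A sketch of such a Valid(I, S) test in Python, with the circuit simplified to a CNF formula; the encoding (clauses as lists of signed variable numbers) is an assumption of the sketch:

def valid(clauses, assignment):
    # clauses: e.g. [[1, -2, 3], [-1, 2]] means (x1 or not x2 or x3) and (not x1 or x2).
    # assignment: dict mapping each variable number to True or False.
    # Runs in time linear in |I| + |S|, so it is a poly-time verifier.
    def literal_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    return all(any(literal_true(lit) for lit in clause) for clause in clauses)

# I is a "yes" instance iff there EXISTS an assignment S with valid(I, S) == True.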
A Non-Deterministic TM/Java/… is
the same as a deterministic one,
except that its moves are not deterministic.
It has a “choice” as to which step to take next.
Deterministic: δ(qi, c) specifies one next move.
Non-Deterministic: δ(qi, c) allows a set of possible next moves.
A non-deterministic machine M on input I
is said to accept its input
if there exists an accepting computation.
Note problems computed need yes/no answers.
NP:
If I is a “yes” instance:
A fairy god mother could provide you a small witness/solution/proof S with which you can prove to your non-believing poly time boss that I is a “yes” instance.
If I is a “no” instance:
There is likely no such witness/solution/proof.
SAT
CoNP:
If I is a “No” instance:
A fairy god mother could provide you a small witness/solution/proof S with which you can prove to your non-believing poly time boss that I is a “no” instance.
If I is a “Yes” instance:
There is likely no such witness/solution/proof.
not SAT
SAT
SAT
Network Flows
Network Flows:
Is there a flow in G with value k?
A witness/solution/proof S of Yes is
a flow
A witness/solution/proof S of No is
a Cut.
not SAT
Network Flows
Non-Deterministic Polynomial Time
GCD
If P ∈ Polynomial Time, you don't even need the fairy godmother. ⇒ P ∈ NP.
If P ∈ NP, then |S| ≤ n^1000. In 2^(n^1000) time you can try all witnesses S. ⇒ P ∈ Exp.
SAT
Games
GCD
SAT
Halting
Games
NP vs Other Classes
GCD
SAT
If I is a "yes" instance:
M halts on I
the computation is a witness
arbitrarily long but finite.
If I is a “no” instance
M runs forever
there is no witness
Halting
Not Halting
not SAT
Games
NP vs Other Classes
GCD
NP-Complete Problems
Problem Pnew ∈ NP
Pnew not too hard.
Pnew
Test in poly-time
if a given solution
is valid
GCD
NP-Complete Problems
Pnew
Sat
Problem Pnew is NP-Hard
If being able to solve
this problem fast
means that you can
solve every problem
in the class fast.
∀P ∈ NP, P ≤poly Pnew
Easier: Sat ≤poly Pnew
Cook: ∀P ∈ NP, P ≤poly Sat
GCD
NP-Complete Problems
Problem Pnew is NP-Complete
Pnew ∈ NP
Pnew is NP-Hard
GCD
NP-Complete Problems
Problem Pnew is NP-Complete
Pnew ∈ NP
Pnew is NP-Hard
Sat
Poly
GCD
Matching
SAT
not SAT
Halting
Games
Most “natural” problems in NP
are either
in PolyTime
or NP-Complete
except?
Poly
GCD
Matching
SAT
not SAT
Factoring
Graph
Isomorphism
Halting
Games
Most “natural” problems in NP
are either
in PolyTime
or NP-Complete
except
NP-Complete Problems
Reductions
Reduction: Design a fast algorithm for one computational problem, using a supposedly fast algorithm for another problem as a subroutine.
Palg ≤poly Poracle
Reductions
Reduction: Design a fast algorithm for one computational problem, using a supposedly fast algorithm for another problem as a subroutine.
Use to create a new algorithm from a known algorithm.
Learn that a new problem is likely hard,
from knowing a known problem is likely hard.
Learn that two problems have a similar “structure.”
Matching ≤poly Network Flows
The Matching algorithm uses the Network Flow oracle as a subroutine.
Who loves who: Fred, John, Bob, Sam and Sue, Beth, Mary, Ann.
GIVEN the matching instance, BUILD a network with source s and sink t: Max Matching = Max Flow.
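A sketch of the construction in Python: build a capacity-1 network from the "who loves who" instance and hand it to a max-flow oracle (the dict-of-capacities representation is illustrative):

def matching_to_network(boys, girls, loves):
    # loves: set of (boy, girl) pairs.
    capacity = {}
    for b in boys:
        capacity[('s', b)] = 1        # source s feeds every boy
    for b, g in loves:
        capacity[(b, g)] = 1          # one unit of flow per possible couple
    for g in girls:
        capacity[(g, 't')] = 1        # every girl feeds the sink t
    return capacity                   # max s-t flow in this network = max matching

An integral max flow uses each boy and each girl at most once, so its boy-to-girl edges carrying flow form a maximum matching.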
Matching ≤poly Network Flows
Learn that a new problem
is likely hard, from knowing
a known problem is likely hard.
Reductions
Is there a fast algorithm for Poracle?
Is there a fast algorithm for Palg?
?
We give a fast algorithm for Palg using a
supposed fast algorithm for Poracle as a subroutine.
Reductions
We give a fast algorithm for Palg using a supposed fast algorithm for Poracle as a subroutine. Hence:
If there is a fast algorithm for Poracle, then there is a fast algorithm for Palg.
If there is not a fast algorithm for Palg, then there is not a fast algorithm for Poracle.
If there is a fast algorithm for Palg? If there is not a fast algorithm for Poracle? These tell us nothing.
Reductions
Notation: Palg ≤poly Poracle
Palg is “at least as easy as” Poracle
(Modulo polynomial terms.)
Poracle is “at least as hard as” Palg
(Modulo polynomial terms.)
Conclusions:
We give a fast algorithm for Palg using a
supposed fast algorithm for Poracle as a subroutine.
Reductions
Twas a pleasure
Be Well
Thinking about Algorithms Abstractly
Jeff Edmonds
York University
100