
Stack of Stack Frames

Review
Thinking about Algorithms Abstractly
Jeff Edmonds
York University

COSC 3101


Some Math
Recurrence Relations
T(n) = a T(n/b) + f(n)

Input Size
Time
Classifying Functions
f(i) = n^Θ(n)
Adding Made Easy
∑_{i=1}^{n} f(i)

Time Complexity

t(n) = Θ(n²)
Logic Quantifiers
∃g ∀b Loves(b,g)
∀b ∃g Loves(b,g)
Logs and Exps
2^a × 2^b = 2^(a+b)
2^(log n) = n


Understand Quantifiers!!!
Ann
Fred
Marilyn Monroe
John
Beth
Bob
Mary
Sam

We read this left to right.
Proof:
∃g, ∀b, Loves(b, g)

I produce the object when it is an ∃.

I produce the object when it is a ∀.

I can always win
if and only if
the statement is true.


Understand Quantifiers!!!

Let y = 5-x.
Proof:
∀x, ∃y, x+y=5

Let x be an arbitrary real number.
The relation is true.
x+y = x + (5-x) = 5
I can always win.
Hence, the statement is true.


Problem P is computable in polynomial time.
∃A, ∃c, ∃n0, ∀I, A(I)=P(I) & (|I| < n0 or Time(A,I) ≤ |I|^c)
The Time Complexity

Two Example Algorithms
Sum N array entries: A(1) + A(2) + A(3) + …
Factor value N: Input N=5917 & Output N=97*61.
Algorithm tries N/2, N/3, N/4, ….
Time as a function of input size:
the time for you to solve the problem instance as a function of the time for me to give you the problem instance.
Summing: size n = N, so Time = N = n.
Factoring: size n = log N, so Time = N = 2^n.

Adding Made Easy

Evaluating: T(n) = aT(n/b)+f(n)

Meta Algorithms
I try to provide a unified way to think about each algorithm type.
Follow my fixed set of steps.
Even if you don’t get the details of the algorithm correct, at least get the right structure.
Try to match the parts of the new problem with the parts of the meta algorithm.
eg instances, solutions, bird question, …

Iterative Algorithms & Loop Invariants
Jeff Edmonds
York University

COSC 3101
Lecture 1
Understanding Algorithms Abstractly
Home to School Problem
Steps in an Iterative Algorithm
Assertions
The Sum of Objects
Insertion Sort
Selection Sort
Designing an Algorithm
Finding Errors
Typical Loop Invariants
Binary Search Like Examples
The Partition Problem
Greatest Common Divisor
Bucket (Quick) Sort for Humans
Radix/Counting Sort
Lower Bound for Sort

Iterative Algorithms
Define Problem
Define Loop Invariants
Define Measure of Progress
Define Step
Define Exit Condition
Maintain Loop Inv
Make Progress
Initial Conditions
Ending
(Figure: 79 km to school … 75 km … 0 km: Exit)

Partial Correctness
Proves that IF the program terminates then it works.
Establishing Loop Invariant: ⟨preCond⟩ & codeA ⇒ ⟨loop-invariant⟩
Maintaining Loop Invariant: ⟨loop-invariant⟩ & ¬⟨exit-cond⟩ & codeB ⇒ ⟨loop-invariant⟩
Clean up loose ends: ⟨loop-invariant⟩ & ⟨exit-cond⟩ & codeC ⇒ ⟨postCond⟩
‹#›
12

The Sum Problem
Precondition: The input is an integer I.
Postcondition: The output is S = 1² + 2² + 3² + … + I²
Loop Invariant: s = 1² + 2² + 3² + … + i² (1 ≤ i ≤ I)
Picture, not an action

Maintaining Loop Invariant: ⟨loop-invariant⟩ & ¬⟨exit-cond⟩ & codeB ⇒ ⟨loop-invariant⟩

Let i_t and s_t be the values of i and s at the beginning of the iteration, and let i_{t+1} and s_{t+1} be their values after going around the loop again.

The Sum Problem
Make progress: i_{t+1} = i_t + 1
Maintain the loop invariant: s_{t+1} = s_t + (i_{t+1})²
Precondition: The input is an integer I.
Postcondition: The output is S = 1² + 2² + 3² + … + I²
Loop Invariant: s = 1² + 2² + 3² + … + i² (1 ≤ i ≤ I)
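The Sum Problem can be coded directly, with the loop invariant checked as an assertion at the bottom of the loop; a minimal Python sketch (the function name is mine):

```python
def sum_of_squares(I):
    """Postcondition: returns S = 1^2 + 2^2 + ... + I^2 (assumes I >= 1)."""
    i, s = 1, 1              # establish loop invariant: s = 1^2 + ... + i^2
    while i < I:             # exit condition: i = I
        i = i + 1            # make progress
        s = s + i * i        # maintain the loop invariant
        assert s == sum(k * k for k in range(1, i + 1))
    return s                 # invariant + exit condition => postcondition
```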


No Assumptions:
Suppose a Martian jumped
into the top of the loop.
All you would know is that the loop invariant was true.
It alone must provide enough information
so that, after going around the loop, you can establish that the loop invariant is again true.


Typical Types of Loop Invariants
Fairy God Mother
I have arrived from Mars, some amount of progress has been made already and my Fairy God Mother has set the state of the computation to be just what I want so that I feel as comfortable as I could be given the amount of progress that has been made.
More of the input
More of the output
Narrowed the search space
GCD Like


Fairy God Mother
Loop Invariants

My progress is the distance
swum from the center.

What is your wish?
It is my job to establish it, maintain it, and get the postcondition from this loop invariant.


Extra
More of the Input
Loop Invariant
The input consists of an array of objects

We have read in the first i objects.
We will pretend that this prefix
is the entire input.
We have a solution for this prefix
plus some additional information

Solution


More of the Output
Loop Invariant
The output consists of an array of objects

I have produced the first i objects.
Produce the (i+1)st output object.

Make progress: go from outputting i objects to i+1.
Exit: done when we have output all n objects.
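Selection sort (from the topic list) is the classic "more of the output" loop invariant; a minimal Python sketch:

```python
def selection_sort(a):
    """Loop invariant: a[0:i] holds the i smallest elements, in sorted order."""
    n = len(a)
    for i in range(n):
        m = min(range(i, n), key=lambda j: a[j])  # smallest remaining object
        a[i], a[m] = a[m], a[i]                   # produce the (i+1)st output
    return a                                      # done: output all n objects
```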


Restricted Search Space
Loop Invariant
Maintain a sublist.
If the key is contained in the original list, then the key is contained in the sublist.

key 25

3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95
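Binary search maintains exactly this narrowed-search-space invariant; a Python sketch (returning −1 when the key is absent is my choice, since the slide does not fix the interface):

```python
def binary_search(a, key):
    """Invariant: if key is in the original sorted list a, it is in a[lo:hi]."""
    if not a:
        return -1
    lo, hi = 0, len(a)             # establish invariant: the whole list
    while hi - lo > 1:             # exit when one candidate remains
        mid = (lo + hi) // 2
        if a[mid] <= key:
            lo = mid               # key, if present, lies in a[mid:hi]
        else:
            hi = mid               # key, if present, lies in a[lo:mid]
    return lo if a[lo] == key else -1
```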


For all major algorithms covered: learn the pre- and postconditions, learn the loop invariant, and learn how to make progress while maintaining the LI.
Iterative Algorithms


Recursion
Jeff Edmonds
York University

COSC 3101
Lecture 3
Thinking about Algorithms Abstractly
Multiplying
Recurrence Relations
Code
Stack of Stack Frames
Tree of Stack Frames
Friends and Strong Induction
Towers of Hanoi
Check List
Merge & Quick Sort
Simple Recursion on Trees
Generalizing the Problem
Things not to do
Heap Sort & Priority Queues
Trees Representing Equations
Pretty Print
Parsing
Recursive Images
Ackermann's Function


Multiplication of 2 n-bit numbers
X =
Y =

X = a·2^(n/2) + b    Y = c·2^(n/2) + d

XY = ac·2^n + (ad+bc)·2^(n/2) + bd
a
b
c
d


X = 2133
Y = 2312
ac = 483
bd = 396
(a+b)(c+d) = 1890
XY = 4931496

X = 21
Y = 23
ac = 4
bd = 3
(a+b)(c+d) = 15
XY = 483

X = 33
Y = 12
ac = 3
bd = 6
(a+b)(c+d) = 18
XY = 396

X = 54
Y = 35
ac = 15
bd = 20
(a+b)(c+d) = 72
XY = 1890

X = 2
Y = 2
XY=4

X = 1
Y = 3
XY=3

X = 3
Y = 5
XY=15

X = 3
Y = 2
XY=6

X = 6
Y = 3
XY=18

X = 5
Y = 3
XY=15

X = 4
Y = 5
XY=20

X = 9
Y = 8
XY=72
Friends & Strong Induction

Worry about one step at a time.
Imagine that you are one specific friend.

MULT(X,Y):
If |X| = |Y| = 1 then RETURN XY
Break X into a,b and Y into c,d
e = MULT(a,c) and f = MULT(b,d)
RETURN
e·10^n + (MULT(a+b, c+d) − e − f)·10^(n/2) + f

X = 3
Y = 1
XY=3
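The MULT pseudocode can be made runnable; a Python sketch of the same three-multiplication (Karatsuba) idea, with the digit bookkeeping made explicit:

```python
def mult(x, y):
    """Karatsuba: 3 recursive multiplications instead of 4."""
    if x < 10 or y < 10:                  # base case: a single-digit factor
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    a, b = divmod(x, 10 ** half)          # x = a*10^half + b
    c, d = divmod(y, 10 ** half)          # y = c*10^half + d
    e = mult(a, c)
    f = mult(b, d)
    g = mult(a + b, c + d)                # = ac + ad + bc + bd
    return e * 10 ** (2 * half) + (g - e - f) * 10 ** half + f
```

On the slide's example, mult(21, 23) recurses with e = 4, f = 3, g = 15 and returns 483.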


X = 33
Y = 12
ac = 3
bd = 6
(a+b)(c+d) = 18
XY = 396
Friends & Strong Induction
Consider your input instance
Allocate work
Construct one or more sub-instances
Assume by magic your friends give you the answer for these.
Use this help to solve your own instance.
Do not worry about anything else.
Who your boss is…
How your friends solve their instance…

X = 3
Y = 1
XY = 3
X = 3
Y = 2
XY = 6
X = 6
Y = 3
XY = 18

MULT(X,Y):
If |X| = |Y| = 1 then RETURN XY
Break X into a,b and Y into c,d
e = MULT(a,c) and f = MULT(b,d)
RETURN
e·10^n + (MULT(a+b, c+d) − e − f)·10^(n/2) + f


Induction Hypothesis:
"The recursive algorithm works
for every instance of size n."
"The algorithm works for every instance of any size."

Base case
size i

Friends & Strong Induction


Friends & Strong Induction

Give and solve the recurrence relation
for the Running Time of this algorithm.

T(n) = aT(n/b)+f(n)
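For MULT, a = 3, b = 2 and f(n) = Θ(n), so T(n) = 3T(n/2) + n = Θ(n^log₂3). A quick sketch that unrolls the recurrence (taking T(1) = 1 as an assumed base case):

```python
def T(n):
    """T(n) = 3 T(n/2) + n with assumed base case T(1) = 1, n a power of 2.
    Closed form: T(2^k) = 3^(k+1) - 2^(k+1), i.e. T(n) = Theta(n^log2(3))."""
    if n == 1:
        return 1
    return 3 * T(n // 2) + n
```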


Check Lists for Recursive Programs
This is the format of “all” recursive programs.


Recursion on Trees

(Figure: a tree with values 3, 8, 1, 3, 2, 2, 7, 6, 5, 9, 4, 1 at its nodes.)

Being lazy, I will only consider
my root node and my communication with my friends.
I will never look at my children subtrees
but will trust them to my friends.



Recursion on Trees

(Figure: a tree with values 3, 8, 1, 3, 2, 2, 7, 6, 5, 9, 4, 1 at its nodes.)

I know only what you tell me
via the pre-condition.
I tell you only what the post-condition
instructs me to.


Searching in a Graph
Jeff Edmonds
York University

COSC 3101
Lecture 5
Thinking about Algorithms Abstractly

Generic Search
Breadth First Search
Dijkstra's Shortest Paths Algorithm
Depth First Search
Linear Order



We know found nodes are reachable from s because we have traced out a path.
If a node has been handled,
then all of its neighbors
have been found.
Graph Search



Graph Search
Which foundNotHandled node do we handle?

Queue: Handle in order found.
Breadth-First Search
Stack: Handle most recently found
Depth-First Search
Priority Queue:
Handle node that seems to be closest to s.
Shortest (Weighted) Paths:
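The choices above can be sketched as one generic search where only the container changes (a sketch; `adj` maps each node to its neighbor list):

```python
from collections import deque

def search(adj, s, order="bfs"):
    """Generic graph search: the container holding the foundNotHandled
    nodes determines the algorithm (queue -> BFS, stack -> DFS)."""
    found = {s}
    found_not_handled = deque([s])
    handled = []
    while found_not_handled:
        u = (found_not_handled.popleft() if order == "bfs"
             else found_not_handled.pop())
        handled.append(u)               # handle u: find all its neighbors
        for v in adj.get(u, []):
            if v not in found:
                found.add(v)
                found_not_handled.append(v)
    return handled
```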


Dijkstra's
Handled Nodes

u

v

s

Found Nodes
Handled paths go through any number of handled nodes,
followed by a last edge to an unhandled node.

For handled w,
d(w) is the length of the
shortest paths to w.
Handle node with smallest d(u).

d(v) is the length
of the shortest handled
path to v.
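A minimal Python sketch of Dijkstra's algorithm with the priority queue keyed by d(v) (lazy deletion of stale queue entries is my implementation choice, not from the slides):

```python
import heapq

def dijkstra(adj, s):
    """adj[u] = list of (v, weight). Returns d(v) for every reachable v."""
    d = {s: 0}
    handled = set()
    pq = [(0, s)]                        # foundNotHandled, keyed by d(v)
    while pq:
        dist, u = heapq.heappop(pq)      # handle node with smallest d(u)
        if u in handled:
            continue                     # stale entry: u already handled
        handled.add(u)
        for v, w in adj.get(u, []):
            if v not in handled and dist + w < d.get(v, float("inf")):
                d[v] = dist + w          # found a shorter handled path to v
                heapq.heappush(pq, (d[v], v))
    return d
```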


Dijkstra's

(Figure: d-values 0, 13, 16, 17, 20 on the found nodes; handling c relaxes its edges, updating two d-values to 19.)


DFS

(Figure: DFS from s on the graph of nodes s, a–m.)
Found-Not-Handled Stack: ⟨s,1⟩, ⟨a,1⟩, ⟨c,2⟩, ⟨i,0⟩


Linear Order
(Figure: a DAG on nodes a–l.)
Found-Not-Handled Stack
Alg: DFS
When a node is taken off the stack,
add it to the front of the (backwards) order:
i,j,k,d,e,g,l,f
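This "add to the backwards order when taken off the stack" rule is topological sort; a recursive DFS sketch:

```python
def topological_order(adj):
    """DFS: when a node finishes (comes off the stack), append it to the
    backwards order; reversing that list gives a linear order of the DAG."""
    order, visited = [], set()

    def dfs(u):
        visited.add(u)
        for v in adj.get(u, []):
            if v not in visited:
                dfs(v)
        order.append(u)          # taken off stack: add to backwards order

    for u in adj:
        if u not in visited:
            dfs(u)
    return order[::-1]
```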


Network Flow &
Linear Programming
Jeff Edmonds York University

COSC 3101
Lecture 5
Thinking about Algorithms Abstractly
Optimization Problems
Network Flow Defn
Simple Example
Application: Matching
Flow Strategy
Hill Climbing
Augmenting Flow
Primal-Dual Hill Climbing
Min Cut
Max Flow = Min Cut
Running Time
Linear Programming


Ingredients:
Instances: The possible inputs to the problem.
Solutions for Instance: Each instance has an exponentially large set of solutions.
Cost of Solution: Each solution has an easy to compute cost or value.
Specification
Preconditions: The input is one instance.
Postconditions: A valid solution with optimal cost (minimum or maximum).
Optimization Problems


Instance:
A Network is a directed graph G
Edges represent pipes that carry flow
Each edge has a maximum capacity c
A source node s out of which flow leaves
A sink node t into which flow arrives
Network Flow


Solution:
The amount of flow F⟨u,v⟩ through each edge.
Flow can't exceed capacity: F⟨u,v⟩ ≤ c⟨u,v⟩.
No leaks, no extra flow.
For each node v: flow in = flow out:
∑_u F⟨u,v⟩ = ∑_w F⟨v,w⟩

Network Flow


Value of Solution:
Flow from s into the network
minus flow from the network back into s:
rate(F) = ∑_v F⟨s,v⟩ − ∑_u F⟨u,s⟩

Goal:
Max Flow
Network Flow


Value of Solution C = ⟨U,V⟩:
cap(C) = how much can flow from U to V
= ∑_{u∈U, v∈V} c⟨u,v⟩

Min Cut
s
t

U
V

u
v
Goal:
Min Cut


(Figure: edge ⟨u,v⟩ carries F/c in the flow graph. In the augmentation graph the edge has forward residual capacity c−F and backward capacity F; walking an augmenting path of value w raises the flow on the edge to F+w.)

Network Flow

(Figure: the same edge when the augmenting path walks it backwards: pushing w against the flow lowers it to F−w.)

Network Flow

Theorem:
For all Networks, max_F rate(F) = min_C cap(C).
Prove: ∀F,C rate(F) ≤ cap(C)
Max Flow = Min Cut

U
V

u
v
Prove:  flow F, alg either
finds a better flow F
or finds cut C such that rate(F) = cap(C)
Alg stops with an F and C for which rate(F) = cap(C)
F witnesses that the optimal flow cant be less
C witnesses that it cant be more.
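The augment-until-cut loop above is Ford–Fulkerson; a sketch using BFS for the augmenting path (the Edmonds–Karp variant, my choice to guarantee termination):

```python
from collections import deque

def max_flow(cap, s, t):
    """cap[(u, v)] = capacity. Repeatedly find an augmenting path in the
    residual graph by BFS and push the bottleneck flow along it."""
    res = dict(cap)                          # residual capacities
    for (u, v) in cap:
        res.setdefault((v, u), 0)            # reverse edges let us undo flow
    nodes = {u for u, v in res} | {v for u, v in res}
    total = 0
    while True:
        parent = {s: None}                   # BFS tree over the residual graph
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in nodes:
                if v not in parent and res.get((u, v), 0) > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:                  # s cut off from t: done;
            return total                     # the reachable side is a min cut
        path, v = [], t                      # walk back from t to s
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        w = min(res[e] for e in path)        # bottleneck
        for u, v in path:
            res[(u, v)] -= w
            res[(v, u)] += w
        total += w
```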


Linear Programming
Ingredients: pork, grain, water, sawdust
Cost per unit: 29, 8, 1, 2
Amount to add: x1, x2, x3, x4
Constraints (protein, moisture, …):
3x1 + 4x2 − 7x3 + 8x4 ≥ 12
2x1 − 8x2 + 4x3 − 3x4 ≥ 24
−8x1 + 2x2 − 3x3 − 9x4 ≥ 8
x1 + 2x2 + 9x3 − 3x4 ≥ 31
Cost of Hotdog:
29x1 + 8x2 + 1x3 + 2x4
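The hotdog LP is easy to state in code; this sketch only encodes the constraints A·x ≥ b and evaluates the cost of a candidate mix (which constraint is protein vs. moisture is my guess, and solving the LP is left to a real solver):

```python
# One row per constraint (A x >= b); columns: pork, grain, water, sawdust.
A = [[ 3,  4, -7,  8],    # e.g. protein
     [ 2, -8,  4, -3],    # e.g. moisture
     [-8,  2, -3, -9],
     [ 1,  2,  9, -3]]
b = [12, 24, 8, 31]
cost = [29, 8, 1, 2]

def feasible(x):
    """Does the mix x satisfy every constraint row of A x >= b?"""
    return all(sum(aij * xj for aij, xj in zip(row, x)) >= bi
               for row, bi in zip(A, b))

def hotdog_cost(x):
    return sum(cj * xj for cj, xj in zip(cost, x))
```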


Greedy Algorithms
Jeff Edmonds York University
COSC 3101
Lecture 6
Thinking about Algorithms Abstractly
Greedy Algorithm for Optimization Problems
Proving with Loop Invariants
Three Players making Change
Review and Don'ts
Event Scheduling
Minimum Spanning Tree
Adaptive Greedy
Interval Point Cover
Forced Horn Clauses


Optimization Problems
with a Greedy Algorithm

Instances: A set of objects
and a relationship between them.
Solutions for Instance: A subset of the objects.
Or some other choice about each object.

Cost of Solution: The number of objects in solution or the sum of the costs of objects


Fixed vs Adaptive Priority
Fixed Priority:
Sort the objects from best to worst
and loop through them.

Adaptive Priority:
The greedy criterion depends on which objects have been committed to so far.
At each step, the next "best" object is chosen according to the current greedy criterion.
Searching or re-sorting at each step takes too much time.
Use a priority queue.
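As a concrete fixed-priority greedy (event scheduling, from the topic list): sort the events once by finish time and commit to each one that does not conflict; a sketch:

```python
def schedule(events):
    """events: list of (start, finish). Greedy criterion: earliest finish.
    Loop invariant: some optimal schedule extends the commitments so far."""
    committed, free_from = [], float("-inf")
    for start, finish in sorted(events, key=lambda e: e[1]):
        if start >= free_from:        # does not conflict with commitments
            committed.append((start, finish))
            free_from = finish
    return committed
```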


Fixed vs Adaptive Priority


Designing the Loop Invariant

At:
Let At denote the decisions made by the algorithm
during the first t time steps.
St:
A solution must make a decision for every object.

St extends At:
If At makes a decision about an object,
then St makes the same decision.


Designing the Loop Invariant

At:
Let At denote the decisions made by the algorithm
during the first t time steps.
St:
A solution must make a decision for every object.

St extends At:
Even though the Algorithm has committed to At,
it may still produce St.


Designing the Loop Invariant

At:
Let At denote the decisions made by the algorithm
during the first t time steps.
St:
A solution must make a decision for every object.

We have not gone wrong.
There is at least one optimal solution St that extends the choices At made so far.
Loop Invariant


Designing the Loop Invariant

At+1:
In the (t+1)st time step, the algorithm decides
about the next “best” object.
St:
A solution must make a decision for every object.

St extends At+1:

The Algorithm can no longer produce this solution.

This “bridge” has been burned.


Designing the Loop Invariant

At+1:
In the (t+1)st time step, the algorithm decides
about the next “best” object.
St:
A solution must make a decision for every object.

St+1 extends At+1:

Made consistent with
Algorithm’s tth decision
Other decisions
may change


Designing the Loop Invariant

At+1:
In the (t+1)st time step, the algorithm decides
about the next “best” object.

St+1 extends At+1:

We have not gone wrong.
There is at least one optimal solution St+1 that extends the choices At+1 made so far.
Loop Invariant


Changing St-1 into St


In order to make progress,
we may head in one direction.

79 km

75 km

Exit

My little brother made a decision about Objt.
I must first tell the Fairy Godmother to make this same decision about Objt.
S’t = St-1 + At’s decision made about Objt.
Changing St-1 into St


In order to maintain the loop invariant,
we must adjust to come back to the path.

Exit

Changing St-1 into St

Dude! My solution St-1 had been a valid optimal solution,
and you messed it up.
OK, I will try to fix it.


Changing St-1 into St
In order to maintain the loop invariant,
we must adjust to come back to the path.

Exit

I will find some object Obj' in St-1
(but not in At) that "conflicts" with this changed decision about Objt, and describe
a way of changing the decision made by the fairy godmother about Obj' so that the solution St is again valid.


Changing St-1 into St
In order to maintain the loop invariant,
we must adjust to come back to the path.

Exit

St = St-1
+ At’s decision made about Objt
+ the prover’s new decision about Obj’.


St-1 extended previous choices At-1
and we made it extend the new choices At.
St
Proving St extends At

Proving St is optimal
St-1 was optimal and
St is not worse
Prove St is valid
St-1 was valid
and we introduced no new conflicts.

Changing St-1 into St


Maintaining Loop Invariant: ⟨loop-invariant⟩ & ¬⟨exit-cond⟩ & codeB ⇒ ⟨loop-invariant⟩


Proof Greedy Algorithm Works
Looks hard,
but they all have the same structure.
Take my proof structure.
Change what the instances, solutions, & costs are.
Change what the greedy criterion is.
Change the change step.
Change the proof that the changed solution is
valid
consistent
optimal
The rest is the same.


Optimization Problems
A Sequence of Decisions
The Little Bird & Friend
Optimal Substructure
Memoization
Set of Sub-Instances
Tracing Dyn. Prog. Alg
Reversing
Code
Speeding Up Running Time
Multiple Opt Solutions
Review
Question for Little Bird
Review & Don'ts
Best Path
Printing Neatly
Longest Common Subsequence
Knapsack Problem
The Event Scheduling Problem
Parsing
Satisfiability
Techniques
Problems
Recursive Back Tracking &
Dynamic Programming


Designing Recursive Back Tracking Algorithm
What are instances, solutions, and costs?
Given an instance I,
What question do you ask the little bird?
Given a bird answer k ∈ [K],
What instance subI do you give your friend?
Assume he gives you optSubSol for subI.
How do you produce an optSol for I from
the bird’s k and
the friend’s optSubSol?
How do you determine the cost of optSol from
the bird’s k and
the cost of the friend’s optSubSol?
Try all bird’s answers and take best of best.

Review


Bird/Friend - Best of the Best
A sequence of questions to
a little bird about a solution
forms a tree of possible
answers.
Consider our instance I.
Consider the set of solutions


Bird/Friend - Best of the Best
But we only care about
the first bird answer.
Consider our instance I.
Consider the set of solutions

The answers classify
the possible solutions:
Solutions consistent
with the kth bird's answer.


Bird/Friend - Best of the Best
Consider our instance I.
Consider the set of solutions

Solutions consistent
with the kth bird's answer.
Define optS_k to be:
the optimum solution
for instance I
consistent with the kth bird's answer.
Do this for each k.


Bird/Friend - Best of the Best
Consider our instance I.
Consider the set of solutions

Let k_max be the bird's answer
giving the best optS_k.
Define optS_k to be:
the optimum solution
for instance I
consistent with the kth bird's answer.
Do this for each k.

optS[I] = optS_{k_max} = Best_k optS_k


Bird/Friend - Best of the Best
Constructing optS_k:
the optimum solution
for instance I
consistent with the kth bird's answer.

Given my instance I.

I ask my little bird for
an answer k.
I ask my friend for his solution.
I combine them.


Recursive Back Tracking Algorithm

Dynamic Programming Algorithm
Given an instance I,
Imagine running the recursive alg on it.
Determine the complete set of subI
ever given to you, your friends, their friends, …
Build a table indexed by these subI.
Fill in the table in order so that nobody waits;
each entry stores the cost of its optimal solution
and the advice given by the bird.
Run the recursive alg with the bird's advice to find
the solution to your instance.
Review


Dynamic Programs
do not recurse making
the instance smaller
and smaller.
Instead, they determine
up front the set S
of all sub-instances
that ever need
to be solved,
and solve them
in an order so that
nobody waits.
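Longest Common Subsequence (from the topic list) illustrates this: the sub-instances are all pairs of prefixes, and the table is filled in an order so that nobody waits. A sketch:

```python
def lcs(x, y):
    """table[i][j] = length of the LCS of the prefixes x[:i] and y[:j]."""
    n, m = len(x), len(y)
    table = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if x[i - 1] == y[j - 1]:      # bird's advice: last chars match
                table[i][j] = table[i - 1][j - 1] + 1
            else:                         # best of the two sub-instances
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[n][m]
```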


Reductions &
NP-Completeness
Jeff Edmonds
York University

Classes of Problems
Non-Deterministic Polynomial Time
NP vs Other Classes
NP-Complete
Reductions

Thinking about Algorithms Abstractly
Intro & COSC 3101/4111/6111
Definitions Needed
For 3101


CoNP
not SAT

Computable

Exp

Poly

NP
GCD
SAT
Halting

Acceptable

CoAcceptable
Not Halting
Classes of Problems


Poly
GCD
Poly:
Problems P computable in polynomial time.
∃A ∀I A(I) = P(I) and Time(A,I) ≤ |I|^c
Polynomial Time


Each instance I
has an exponential number of possible solutions.
Each solution S is either valid or not
(sufficiently high value or not)
The output is
Yes, it has a valid solution.
No, it does not
the solution need not be returned
There is a poly-time alg Valid(I,S) to test
given an instance I and a solution S,
whether or not S is a valid solution of I.
Eg: Network Flows
Given I = ⟨G,k⟩,
does network G have a flow of value ≥ k?

NP
Non-Deterministic Polynomial Time

Poly
Network Flows


NP
Non-Deterministic Polynomial Time
Example: Satisfiability:
Instance: A circuit I.
Solution: A satisfying assignment S.
I is a yes instance iff there is such an assignment.
Given an instance I and a solution S,
there is a poly-time alg Valid(I,S) to test
whether or not S is a valid solution of I.
Assignment S: 0 1 1 0 1 0 1
Circuit I evaluates to 1 on S.
Sat(I) = Yes iff ∃S Valid(I,S)

SAT

Poly
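The poly-time check Valid(I,S) is just evaluation; a sketch for CNF formulas (a special case of circuits; encoding literals as signed integers is my assumption, not from the slides):

```python
def valid(clauses, assignment):
    """clauses: list of clauses, each a list of ints where +i means
    variable i and -i means its negation. assignment: dict var -> bool.
    Runs in time linear in |I|: polynomial, as NP requires."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )
```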


NP
Formal definition:
Prob  NP iff  poly time Valid
such that Prob(I) = Yes iff  S Valid(I,S)
Non-Deterministic Polynomial Time
SAT

Poly
Sat(I) = Yes iff ∃S Valid(I,S)


Non-Deterministic Polynomial Time
Sat(I) = Yes iff ∃S Valid(I,S)
A Non-Deterministic TM/Java/… is
the same as a deterministic one,
except that its moves are not deterministic.
It has a “choice” as to which step to take next.
Deterministic: δ(q_i, c) specifies the single next move.
Non-Deterministic: δ(q_i, c) is a set of possible moves; the machine chooses one.
A non-deterministic machine M on input I
is said to accept its input
if there exists an accepting computation.
Note problems computed need yes/no answers.


Poly
NP:

NP
If I is a “yes” instance:
A fairy god mother could provide you a small witness/solution/proof S with which you can prove to your non-believing poly time boss that I is a “yes” instance.
If I is a “no” instance:
There is likely no such witness/solution/proof.

Non-Deterministic Polynomial Time
SAT


Poly
CoNP:

NP
If I is a “No” instance:
A fairy god mother could provide you a small witness/solution/proof S with which you can prove to your non-believing poly time boss that I is a “no” instance.
If I is a “Yes” instance:
There is likely no such witness/solution/proof.

Non-Deterministic Polynomial Time

CoNP
not SAT
SAT


Poly

NP
SAT

CoNP
Network Flows
Network Flows:
Is there a flow in G with value ≥ k?
A witness/solution/proof S of Yes is
a flow
A witness/solution/proof S of No is
a Cut.
not SAT

Network Flows
Network Flows
Non-Deterministic Polynomial Time


Poly
GCD

NP

NP vs Other Classes
If P Polynomial Time,
You don’t even need
the fairy good mother.
 P NP
If P NP,
then |S|  n1000
In time you can
try all witnesses S.
 P Exp

Exp
n1000
2
SAT
Games


NP

Exp

Poly

Computable
GCD
SAT
Halting

Turing 1936
Games
NP vs Other Classes


NP

Exp

Poly

Computable

Acceptable
GCD
SAT
If ⟨M,I⟩ is a "yes" instance:
M halts on I
the computation is a witness
arbitrarily long but finite.
If ⟨M,I⟩ is a "no" instance:
M runs forever
there is no witness
Halting

CoAcceptable
Not Halting

CoNP
not SAT
Games
NP vs Other Classes


Computable

Exp

Poly

Known
GCD

NP
NP-Complete Problems
Problem Pnew  NP
Pnew not too hard.
Pnew
Test in poly-time
if a given solution
is valid


Computable

Exp

Poly

Known
GCD

NP
NP-Complete Problems
Pnew
Sat
Problem Pnew is NP-Hard
if being able to solve
this problem fast
means that you can
solve every problem
in the class fast.
PNP, P ≤poly Pnew
Easier: Sat ≤poly Pnew
Cook: P ≤poly Sat

Steve Cook

P


Computable

Exp

Poly

Known
GCD

NP
NP-Complete Problems
Problem Pnew is NP-Complete
Pnew  NP
Pnew is NP-Hard

Pnew


Computable

Exp

Poly

Known
GCD

NP

complete
NP-Complete Problems
Problem Pnew is NP-Complete
Pnew  NP
Pnew is NP-Hard

Pnew
Sat

Steve Cook


Computable

Exp
Poly

Known

NP

complete
GCD
Matching
SAT

CoNP

complete
not SAT
Halting
Games
Most “natural” problems in NP
are either
in PolyTime
or NP-Complete
except?

NP-Complete Problems


Computable

Exp
Poly

Known

NP

complete
GCD
Matching
SAT

CoNP

complete
not SAT
Factoring
Graph
Isomorphism
Halting
Games
Most “natural” problems in NP
are either
in PolyTime
or NP-Complete
except
NP-Complete Problems

Reductions
Reduction: Design a fast algorithm for one computational problem, using a supposedly fast algorithm for another problem as a subroutine.
Palg ≤poly Poracle

Reductions
Reduction: Design a fast algorithm for one computational problem, using a supposedly fast algorithm for another problem as a subroutine.
Use to create a new algorithm from a known algorithm.
Learn that a new problem is likely hard,
from knowing a known problem is likely hard.
Learn that two problems have a similar “structure.”

Palg ≤poly Poracle


GIVEN:
Network
Flow Oracle

BUILD:
Matching
Oracle
Matching ≤poly Network Flows
Who loves who

Ann
Fred
Sue
John
Beth
Bob
Mary
Sam

Max matching

A network

s

t

Max Flow

s
t
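The reduction is just the construction of the network: unit-capacity edges s→boy, boy→girl for each "loves" pair, and girl→t; a max flow of value k then yields a matching of size k. A sketch of the construction:

```python
def matching_network(loves):
    """loves: list of (boy, girl) pairs. Returns a unit-capacity network
    cap[(u, v)] with source 's' and sink 't' whose max flow value equals
    the size of a maximum matching."""
    cap = {}
    for boy, girl in loves:
        cap[("s", boy)] = 1        # each boy matched at most once
        cap[(boy, girl)] = 1
        cap[(girl, "t")] = 1       # each girl matched at most once
    return cap
```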

Reductions


Exp

Poly

Steve Cook 1971

NP
SAT

complete
Matching ≤poly Network Flows
Learn that a new problem
is likely hard, from knowing
a known problem is likely hard.
Reductions

Is there a fast algorithm for Poracle?
Is there a fast algorithm for Palg?
?
We give a fast algorithm for Palg using a
supposed fast algorithm for Poracle as a subroutine.
Reductions

We give a fast algorithm for Palg using a
supposed fast algorithm for Poracle as a subroutine.
If there is a fast algorithm for Poracle,
then there is a fast algorithm for Palg.
If there is not a fast algorithm for Palg,
then there is not a fast algorithm for Poracle.
If there is not a fast algorithm for Poracle? We learn nothing about Palg.
If there is a fast algorithm for Palg? We learn nothing about Poracle.
Reductions

Notation: Palg ≤poly Poracle
Palg is “at least as easy as” Poracle
(Modulo polynomial terms.)
Poracle is “at least as hard as” Palg
(Modulo polynomial terms.)
Conclusions:
We give a fast algorithm for Palg using a
supposed fast algorithm for Poracle as a subroutine.
Reductions

Twas a pleasure
Be Well
Thinking about Algorithms Abstractly
Jeff Edmonds
York University

COSC 3101


km
¥

Þ
Þ
"
-
Þ
ü
ý
ï
ï
þ
ï
ï
Þ
"
Þ
S
i
S
S
S
S
i
S
i
nS
n
(
)
,
[
(
),
(
),
(
),
.
.
.
,
(
)
(
)]
(
)
0
0
1
2
1

S
n
(
)
=

/docProps/thumbnail.jpeg