
Lecture 3:
Greedy algorithms
The University of Sydney
Page 1

General techniques in this course
– Greedy algorithms [today]
– Divide & Conquer algorithms [21 Mar]
– Geometric algorithms [28 Mar]
– Dynamic programming algorithms [4 and 11 Apr]
– Network flow algorithms [18 Apr and 2 May]
The University of Sydney
Page 2

Greedy algorithms
A greedy algorithm is an algorithm that follows the problem solving approach of making a locally optimal choice at each stage with the hope of finding a global optimum.
The University of Sydney
Page 3

Greedy algorithms
Greedy algorithms can be some of the simplest algorithms to implement, but they’re often among the hardest algorithms to design and analyse.
The University of Sydney Page 4

Greedy: Overview
Consider problems that can be solved using a greedy approach:
– Interval scheduling/partitioning
– Scheduling to minimize lateness
– Shortest path
– Minimum spanning trees
The University of Sydney
Page 5

How to design algorithms
Step 1: Understand the problem.
Step 2: Start with a simple algorithm.
Step 3: Analyse it — does it work? Is it fast? If yes: done. If no:
Step 4: Develop a better understanding of the problem.
Step 5: Design an improved algorithm, and repeat the analysis.
[Flowchart: Problem → Algorithm → Analysis, looping back until the answer is "yes".]
Useful strategy:
– Try some simple examples to get a feel for the algorithm.
– If none of them break the algorithm, see if there is an underlying structural property we can use to prove correctness.
The University of Sydney
Page 6

Interval Scheduling
The University of Sydney Page 7

Interval Scheduling
– Interval scheduling.
– Input: Set of n jobs. Each job j starts at time sj and finishes at time fj.
– Two jobs are compatible if they don’t overlap in time.
– Goal: find maximum subset of mutually compatible jobs.
b
a
c
d
e
f
g
h
0 1 2 3 4 5 6 7 8 9 10 11 The University of Sydney
Time
Page 8

Interval Scheduling
– Interval scheduling.
– Input: Set of n jobs. Each job j starts at time sj and finishes at time fj.
– Two jobs are compatible if they don’t overlap in time.
– Goal: find maximum subset of mutually compatible jobs.
b
a
c
d
e
f
g
h
0 1 2 3 4 5 6 7 8 9 10 11 The University of Sydney
Time
Page 9

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some order. Take each job provided it is compatible with the ones already taken.
– [Earliest start time] Consider jobs in ascending order of start time sj.
The University of Sydney Page 10

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 11

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 12

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 13

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 14

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 15

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 16

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 17

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 18

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 19

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
G
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 20

Interval Scheduling – [Earliest start time]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
A
G
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 21

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some order. Take each job provided it is compatible with the ones already taken.
– [Earliest start time]: does not work — there are counterexamples.
The University of Sydney Page 22

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some order. Take each job provided it is compatible with the ones already taken.
– [Earliest start time] Consider jobs in ascending order of start time sj.
– [Shortest interval] Consider jobs in ascending order of interval length fj – sj.
The University of Sydney Page 23

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 24

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 25

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 26

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 27

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 28

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 29

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 30

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 31

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 32

Interval Scheduling – [Shortest interval]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
C
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 33

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some order. Take each job provided it is compatible with the ones already taken.
– [Earliest start time]: does not work.
– [Shortest interval]: does not work — there are counterexamples.
The University of Sydney Page 34

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some order. Take each job provided it is compatible with the ones already taken.
– [Earliest start time] Consider jobs in ascending order of start time sj.
– [Shortest interval] Consider jobs in ascending order of interval length fj – sj.
– [Fewest conflicts] For each job, count the number of conflicting jobs cj. Schedule in ascending order of conflicts cj.
The University of Sydney Page 35

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 36

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 37

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 38

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 39

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 40

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 41

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 42

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 43

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 44

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
E
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 45

Interval Scheduling – [Fewest Conflicts]
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
E
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 46

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some order. Take each job provided it is compatible with the ones already taken.
– [Earliest start time]: does not work.
– [Shortest interval]: does not work.
– [Fewest conflicts]: does not work — there are counterexamples.
The University of Sydney Page 47

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some order. Take each job provided it is compatible with the ones already taken.
– [Earliest start time] Consider jobs in ascending order of start time sj.
– [Shortest interval] Consider jobs in ascending order of interval length fj – sj.
– [Fewest conflicts] For each job, count the number of conflicting jobs cj. Schedule in ascending order of conflicts cj.
– [Earliest finish time] Consider jobs in ascending order of finish time fj.
The University of Sydney Page 48
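All four rules above are instances of the same template and differ only in the sort key. The following Python sketch (my own formulation, with jobs given as (start, finish) pairs) makes this explicit; plugging in the earliest-finish-time key gives the algorithm analysed later in the lecture:

def greedy_schedule(jobs, key):
    # Greedy template: consider jobs in the order given by `key`,
    # take each job that is compatible with the ones already taken.
    taken = []
    for s, f in sorted(jobs, key=key):
        if all(f2 <= s or f <= s2 for s2, f2 in taken):   # no overlap?
            taken.append((s, f))
    return taken

# Three of the four orderings from this slide:
earliest_start  = lambda job: job[0]
shortest_length = lambda job: job[1] - job[0]
earliest_finish = lambda job: job[1]
# [Fewest conflicts] needs a precomputed conflict count per job,
# so it does not fit a per-job key on (start, finish) alone.

The compatibility check here is deliberately naive (O(n) per job); the sketch is only meant to show that the rules share one template.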

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 49

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 50

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
C
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 51

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
AB
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 52

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
E
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 53

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
DE
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 54

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
E
F
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 55

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
E
G
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 56

Interval Scheduling
A
B
C
D
E
F
G
H
0 1 2 3 4 5 6 7 8 9 10 11
Time
B
E
H
0 1 2 3 4 5 6 7 8 9 10 11
The University of Sydney Page 57

Interval Scheduling: Greedy Algorithm
Only [Earliest finish time] remains to be tested.
– Greedy algorithm. Consider jobs in increasing order of finish time.
Take each job provided it is compatible with the ones already taken.
Sort jobs by finish times so that f1 ≤ f2 ≤ … ≤ fn.
A ← ∅                                (A = set of jobs selected)
for j = 1 to n {
    if (job j compatible with A)
        A ← A ∪ {j}
}
return A
– Implementation. O(n log n).
    – Remember the job j* that was added last to A.
    – Job j is compatible with A if sj ≥ fj*.
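As a concrete illustration of the algorithm above, here is a short Python sketch of the earliest-finish-time rule (the function name and the (start, finish)-pair input format are my own):

def interval_scheduling(jobs):
    # jobs: list of (start, finish) pairs
    # Sort by finish time, then greedily take every compatible job.
    selected = []
    last_finish = float("-inf")
    for s, f in sorted(jobs, key=lambda job: job[1]):
        if s >= last_finish:          # compatible with the last job taken
            selected.append((s, f))
            last_finish = f
    return selected

Sorting dominates, so the running time is O(n log n), matching the bound above.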
The University of Sydney Page 58

Interval Scheduling: Analysis
High-Level General Idea:
At each step of Greedy, there is an optimal solution consistent with Greedy’s choices so far
One way to do this is by using an exchange argument.
1. Define your greedy solution.
2. Compare solutions. If Xgreedy ≠ Xopt, then they must differ in some specific way.
3. Exchange Pieces. Transform Xopt to a solution that is “closer” to Xgreedy and prove cost doesn’t increase.
4. Iterate. By iteratively exchanging pieces one can turn Xopt into Xgreedy without impacting the quality of the solution.
The University of Sydney Page 59

Interval Scheduling: Analysis
– Theorem: Greedy algorithm [Earliest finish time] is optimal.
– Proof: (by contradiction)
– Assume greedy is not optimal, and let’s see what happens.
– Let i1, i2, … ik denote the set of jobs selected by greedy.
– Claim There exists an optimal solution containing i1, i2, … ik
– Given claim, let J1, J2, … Jm be an optimal solution with
i1 = J1, i2 = J2, …, ik = Jk
Greedy: OPT:
i1
i2
ik
J1
J2
Jk
Jk+1

The University of Sydney
Page 60
Why did Greedy not take Jk+1?
Contradiction!

Interval Scheduling: Analysis
– Claim: There exists an optimal solution containing i1, i2, … ik
– Need a stronger claim so that we can use induction
– Claim++: For every 1 ≤ r ≤ k, there exists an optimal solution containing i1, i2, … ir.
– Inductive proof by exchange argument
The University of Sydney Page 61

Interval Scheduling: Analysis
– Claim++: For every 1 ≤ r ≤ k, there exists an optimal solution containing i1, i2, … ir.
– Proof: (by exchange argument)
– Base case (r =1):
– Let J1, J2, … Jm be an optimal solution
– Every job of OPT other than J1 starts no earlier than the finish time of J1.
– By definition of Greedy, i1 finishes no later than J1.
– Thus, we can swap J1 for i1 and solution still feasible and optimal.
Greedy: OPT:
Can swap J1 for i1 The University of Sydney
Page 62
i1
i2
ik
J1
J2
Jk
Jk+1

Interval Scheduling: Analysis
– Claim++: For every 1 ≤ r ≤ k, there exists an optimal solution containing i1, i2, … ir.
– Proof: (by exchange argument)
– Inductive case (r > 1):
– Assume true for r – 1. Let J1, J2, … Jm be an optimal solution with i1 = J1, i2 = J2, …, ir-1 = Jr-1
– By definition of Greedy, ir finishes no later than Jr.
– Thus, we can swap Jr for ir and solution still feasible and optimal.
Greedy: OPT:
i1
i2
ir – 1
ir
J1
J2
Jr – 1
Jr


The University of Sydney
Page 63
Can swap Jr for ir

Interval Scheduling: Recap of Exchange Argument
Greedy: OPT:
– We have an optimal solution that is “closer” to the greedy solution.
– Start the argument over again, but now the first (r+1) elements of the greedy solution and the optimal solution are identical.
– Continue iteratively until the optimal solution is transformed into the greedy solution without increasing the cost.
i1
i1
ir
ir+1

J1
J2
Jr
Jr+1

The University of Sydney Page 66

Interval Scheduling
There exists a greedy algorithm [Earliest finish time] that computes an optimal solution in O(n log n) time.
The University of Sydney Page 67

Interval Partitioning
The University of Sydney Page 68

Interval Partitioning
– Interval partitioning.
– Lecture i starts at si and finishes at fi. Assume all start and finish times are integers.
– Goal: find minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room.
– Ex: This schedule uses 4 classrooms to schedule 10 lectures.
c
e
j
d
g
b
h
a
f
i
9 9:30 10 10:30 11 11:30 12 12:30 1 1:30 2 2:30 3 3:30 4 4:30
Time
The University of Sydney Page 69

Interval Partitioning
– Interval partitioning.
– Lecture i starts at si and finishes at fi.
– Goal: find minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room.
– Ex: This schedule uses only 3.
c
d
f
j
b
g
i
a
e
h
9 9:30 10 10:30 11 11:30 12 12:30 1 1:30 2 2:30 3 3:30 4 4:30
Time
The University of Sydney Page 70

Interval Partitioning: Lower bound
– Definition: The depth of a set of open intervals is the maximum number that contain any given time.
– Observation: Number of classrooms needed ≥ depth.
– Example: Depth of the schedule below is 3 ⇒ the schedule below is optimal.
(a, b, c all contain 9:30)
– Question: Does there always exist a schedule equal to depth of intervals?
c
d
f
j
b
g
i
a
e
h
The University of Sydney
Page 71
9 9:30 10 10:30 11 11:30 12 12:30 1 1:30 2 2:30 3 3:30 4 4:30
Time
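The depth lower bound can be checked directly. Here is a small Python sketch (my own helper, not from the lecture) that computes the depth of a set of intervals by sweeping over their start and finish events:

def depth(intervals):
    # intervals: list of (start, finish) pairs of open intervals
    events = []
    for s, f in intervals:
        events.append((s, 1))     # an interval opens
        events.append((f, -1))    # an interval closes
    # At equal times, process closings before openings (open intervals).
    events.sort(key=lambda e: (e[0], e[1]))
    best = current = 0
    for _, delta in events:
        current += delta
        best = max(best, current)
    return best

On the schedule above, lectures a, b and c all overlap just after 9:30, so the depth is 3 and at least 3 classrooms are needed.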

Interval Partitioning: Greedy Algorithm
– Greedy algorithm. Consider lectures in increasing order of start time: assign lecture to any compatible classroom.
Sort intervals by starting time so that s1 ≤ s2 ≤ … ≤ sn.
d ← 0                                (number of allocated classrooms)
for i = 1 to n {
    if (lecture i is compatible with some classroom k)
        schedule lecture i in classroom k
    else
        allocate a new classroom d + 1
        schedule lecture i in classroom d + 1
        d ← d + 1
}
The University of Sydney Page 72

Interval Partitioning: Greedy Algorithm
– Greedy algorithm. Consider lectures in increasing order of start time: assign lecture to any compatible classroom.
Sort intervals by starting time so that s1 ≤ s2 ≤ … ≤ sn.
d ← 0                                (number of allocated classrooms)
for i = 1 to n {
    if (lecture i is compatible with some classroom k)
        schedule lecture i in classroom k
    else
        allocate a new classroom d + 1
        schedule lecture i in classroom d + 1
        d ← d + 1
}
– Implementation. O(n log n).
– For each classroom k, maintain the finish time of the last job added.
– Keep the classrooms in a priority queue.
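A possible Python sketch of this priority-queue implementation (names and input format are my own; each lecture is a (start, finish) pair):

import heapq

def interval_partitioning(lectures):
    # rooms is a min-heap holding, for each classroom, the finish time
    # of the last lecture scheduled in it.
    rooms = []
    for s, f in sorted(lectures):            # increasing start time
        if rooms and rooms[0] <= s:          # some classroom is free by time s
            heapq.heapreplace(rooms, f)      # reuse it
        else:
            heapq.heappush(rooms, f)         # allocate a new classroom
    return len(rooms)                        # number of classrooms used

Each lecture triggers one heap operation, so the total running time is O(n log n).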
The University of Sydney
Page 73

Interval Partitioning: Greedy Analysis
– Observation: Greedy algorithm never schedules two incompatible lectures in the same classroom.
– Theorem: Greedy algorithm is optimal.
– Proof:
– d = number of classrooms that the greedy algorithm allocates.
– Classroom d is opened because we needed to schedule a job, say i, that is incompatible with all d-1 other classrooms.
– Since we sorted by start time, all these incompatibilities are caused by lectures that start no later than si.
– Thus, we have d lectures overlapping during [si, si + 1], i.e. just after time si.
– Key observation ⇒ all schedules use ≥ d classrooms.
The University of Sydney
Page 74
[Greedy algorithm stays ahead]

Interval Partitioning
There exists a greedy algorithm [Earliest starting time] that computes the optimal solution in O(n log n) time.
The University of Sydney Page 75

Scheduling to Minimize Lateness
The University of Sydney Page 76

Scheduling to Minimizing Lateness
– Minimizing lateness problem. [No fixed start time]
– Single resource processes one job at a time.
– Job i requires ti units of processing time and is due at time di.
– Due times are unique
– If job i starts at time si, it finishes at time fi = si + ti.
– Lateness: ℓi = max{0, fi − di}.
– Goal: schedule all jobs to minimize the maximum lateness L = max ℓi.
jobs:                 1   2   3   4   5   6
processing time ti:   3   2   1   4   3   2
due time di:          6   8   9  10  14  15

– Ex: the schedule shown runs the jobs in the order 3, 2, 6, 1, 5, 4 on the time line from 0 to 15; the latenesses are 0, 0, 0, 2, 0 and 5, so the max lateness = 5.
The University of Sydney
Page 77

Minimizing Lateness: Greedy Algorithms
Greedy template. Consider jobs in some order.
– [Shortest processing time first] Consider jobs in ascending order of processing time ti.
– [Smallest slack] Consider jobs in ascending order of slack di – ti.
– [Earliest deadline first] Consider jobs in ascending order of deadline di.
The University of Sydney Page 78

Minimizing Lateness: Greedy Algorithms
– Greedy template. Consider jobs in some order.
– [Shortest processing time first] Consider jobs in ascending order of processing time ti.
  Counterexample:
      jobs:                 1    2
      processing time ti:   1   10
      due time di:        100   10
– [Smallest slack] Consider jobs in ascending order of slack di – ti.
  Counterexample:
      jobs:                 1    2
      processing time ti:   1   10
      due time di:          2   10
The University of Sydney
Page 79
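To see concretely why these two rules fail, here is a small Python check (my own helper) that computes the max lateness of a given job order on the two instances above:

def max_lateness(order, t, d):
    # order: sequence of job indices; t, d: processing times and deadlines
    time, worst = 0, 0
    for j in order:
        time += t[j]
        worst = max(worst, time - d[j])
    return worst

# Shortest processing time first: t = (1, 10), d = (100, 10)
print(max_lateness([0, 1], (1, 10), (100, 10)))   # SPT order: max lateness 1
print(max_lateness([1, 0], (1, 10), (100, 10)))   # other order: max lateness 0

# Smallest slack: t = (1, 10), d = (2, 10), slacks 1 and 0
print(max_lateness([1, 0], (1, 10), (2, 10)))     # smallest slack first: max lateness 9
print(max_lateness([0, 1], (1, 10), (2, 10)))     # other order: max lateness 1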

Minimizing Lateness: Greedy Algorithm
– Greedy algorithm. [Earliest deadline first]
Sort n jobs by deadline so that d1 ≤ d2 ≤ … ≤ dn
t ← 0
for j = 1 to n
    Assign job j to interval [t, t + tj]:  sj ← t, fj ← t + tj
    t ← t + tj
output intervals [sj, fj]
jobs
processing time due time
1
2
3
4
5
6
ti
3
2
1
4
3
2
di
6
8
9
10
14
15
The University of Sydney
Page 80
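A possible Python sketch of this earliest-deadline-first rule (names are my own; each job is a (processing time, deadline) pair):

def edf_schedule(jobs):
    # Schedule jobs back to back in order of deadline; return the
    # intervals [sj, fj] and the maximum lateness.
    intervals, time, max_late = [], 0, 0
    for t, d in sorted(jobs, key=lambda job: job[1]):
        s, f = time, time + t
        intervals.append((s, f))
        max_late = max(max_late, f - d)
        time = f
    return intervals, max_late

Sorting by deadline dominates, so the running time is O(n log n); the table of jobs above can be passed in as [(3, 6), (2, 8), (1, 9), (4, 10), (3, 14), (2, 15)].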

Minimizing Lateness: Greedy Algorithm
– Greedy algorithm. [Earliest deadline first]
Sortnjobsbydeadlinesothatd1 £ d2 £ … £ dn
t¬0
for j = 1 to n
Assign job j to interval [t, t + tj] sj ¬ t, fj ¬ t + tj
t ¬ t + tj
output intervals [sj, fj]
jobs
processing time due time
d1 = 6
0 1 2 3 4 5 6 7 8 9 10 11
12 13 14 15
1
2
3
4
5
6
ti
3
2
1
4
3
2
di
6
8
9
10
14
15
The University of Sydney
Page 81

Minimizing Lateness: Greedy Algorithm
– Greedy algorithm. [Earliest deadline first]
Sortnjobsbydeadlinesothatd1 £ d2 £ … £ dn
t¬0
for j = 1 to n
Assign job j to interval [t, t + tj] sj ¬ t, fj ¬ t + tj
t ¬ t + tj
output intervals [sj, fj]
jobs
processing time due time
0 1 2 3 4 5 6 7 8 9 10 11
12 13 14 15
1
2
3
4
5
6
ti
3
2
1
4
3
2
di
6
8
9
10
14
15
d1 = 6
d2 = 8
The University of Sydney
Page 82

Minimizing Lateness: Greedy Algorithm
– Greedy algorithm. [Earliest deadline first]
Sortnjobsbydeadlinesothatd1 £ d2 £ … £ dn
t¬0
for j = 1 to n
Assign job j to interval [t, t + tj] sj ¬ t, fj ¬ t + tj
t ¬ t + tj
output intervals [sj, fj]
jobs
processing time due time
0 1 2 3 4 5 6 7 8 9 10 11
12 13 14 15
1
2
3
4
5
6
ti
3
2
1
4
3
2
di
6
8
9
10
14
15
d1 = 6
d2 = 8
d3 = 9
The University of Sydney
Page 83

Minimizing Lateness: Greedy Algorithm
– Greedy algorithm. [Earliest deadline first]
Sortnjobsbydeadlinesothatd1 £ d2 £ … £ dn
t¬0
for j = 1 to n
Assign job j to interval [t, t + tj] sj ¬ t, fj ¬ t + tj
t ¬ t + tj
output intervals [sj, fj]
jobs
processing time due time
max lateness = 1
1
2
3
4
5
6
ti
3
2
1
4
3
2
di
6
8
9
10
14
15
d1 = 6
d2 = 8
d3 = 9
d4 = 10
0 1 2 3 4 5 6 7 8 9 10 11
12 13 14 15
The University of Sydney
Page 84

Minimizing Lateness: Greedy Algorithm
– Greedy algorithm. [Earliest deadline first]
Sortnjobsbydeadlinesothatd1 £ d2 £ … £ dn
t¬0
for j = 1 to n
Assign job j to interval [t, t + tj] sj ¬ t, fj ¬ t + tj
t ¬ t + tj
output intervals [sj, fj]
jobs
processing time due time
max lateness = 1
1
2
3
4
5
6
ti
3
2
1
4
3
2
di
6
8
9
10
14
15
d1 = 6
d2 = 8
d3 = 9
d4 = 10
d5 = 14
0 1 2 3 4 5 6 7 8 9 10 11
12 13 14 15
The University of Sydney
Page 85

Minimizing Lateness: Greedy Algorithm
– Greedy algorithm. [Earliest deadline first]
Sortnjobsbydeadlinesothatd1 £ d2 £ … £ dn
t¬0
for j = 1 to n
Assign job j to interval [t, t + tj] sj ¬ t, fj ¬ t + tj
t ¬ t + tj
output intervals [sj, fj]
jobs
processing time due time
max lateness = 1
1
2
3
4
5
6
ti
3
2
1
4
3
2
di
6
8
9
10
14
15
d1 = 6
d2 = 8
d3 = 9
d4 = 10
d5 = 14
d6 = 15
0 1 2 3 4 5 6 7 8 9 10 11
12 13 14 15
The University of Sydney
Page 86
Algorithm ignores processing time!

Minimizing Lateness: No Idle Time
– Observation: There exists an optimal schedule with no idle time.
d = 4 d = 6 d = 12
0 1 2 3 4 5 6 7 8 9 10 11
0 1 2 3 4 5 6 7 8 9 10 11
– Observation: The greedy schedule has no idle time. The University of Sydney
Page 87
d=4
d=6
d = 12

Minimizing Lateness: Inversions
– Definition: An inversion in schedule S is a pair of jobs i and k such that i < k (by deadline) but k is scheduled before i.
– Observation: Greedy schedule has no inversions. Moreover, Greedy is the only such schedule (by uniqueness of deadlines).
– Observation: If a schedule (with no idle time) has an inversion, it has one with a pair of inverted jobs scheduled consecutively.
The University of Sydney Page 88

Minimizing Lateness: Inversions
– Definition: An inversion in schedule S is a pair of jobs i and k such that i < k (by deadline) but k is scheduled before i.
[Figure: before the swap, k is scheduled immediately before i and i finishes at time fi; after the swap, i is scheduled immediately before k.]
– Claim: Swapping two adjacent, inverted jobs reduces the number of inversions by one and does not increase the max lateness.
The University of Sydney Page 89

Minimizing Lateness: Inversions
– Definition: An inversion in schedule S is a pair of jobs i and k such that i < k (by deadline) but k is scheduled before i.
– Claim: Swapping two adjacent, inverted jobs reduces the number of inversions by one and does not increase the max lateness.
– Proof: Let ℓ be the lateness before the swap, and let ℓ' be the lateness after the swap.
    – ℓ'x = ℓx for all x ≠ i, k
    – ℓ'i ≤ ℓi
    – If job k is late:
          ℓ'k = f'k − dk     (definition)
              = fi − dk      (k now finishes at time fi)
              ≤ fi − di      (i < k, so di ≤ dk)
              ≤ ℓi           (definition)

…

    … (cost(ei) > cost(ej)) return false
    else if (i < j) return true
    else return false
}
Implementation. Can handle arbitrarily small perturbations implicitly by breaking ties lexicographically, according to index.
The University of Sydney Page 136

Shortest Paths in a Graph
Shortest path from SIT to the Sydney Opera house
The University of Sydney Page 137

Shortest Path Problem
– Shortest path network.
    – Directed graph G = (V, E).
    – Source s, destination t.
    – Length ℓe = length of edge e.
– Shortest path problem: find shortest directed path from s to t.
  (cost of path = sum of edge costs in path)
[Figure: weighted directed graph on vertices s, 2, 3, 4, 5, 6, 7, t.]
Ex: Cost of path s-2-3-5-t = 9 + 23 + 2 + 16 = 50
The University of Sydney Page 138

Recall: How to design algorithms
Step 1: Understand the problem.
Step 2: Start with a simple algorithm.
Step 3: Analyse it — does it work? Is it fast? If yes: done. If no:
Step 4: Develop a better understanding of the problem.
Step 5: Design an improved algorithm, and repeat the analysis.
The University of Sydney Page 139

What is a simple algorithm for the Shortest Paths Problem?
– Previous problems have easy-to-guess simple algorithms.
– For the shortest paths problem, the first try is to transform it into a different problem that we already know how to solve.
The University of Sydney Page 140

Shortest Path Problem (Unit-Length Edges, Undirected)
– Shortest path network.
    – Undirected graph G = (V, E).
    – Source s, destination t.
    – Length ℓe = length of edge e = 1.
– Shortest path problem: find shortest path from s to t.
  (cost of path = number of edges in path)
Can solve this with BFS!
Ex: Cost of path s-7-t = 1 + 1 = 2
The University of Sydney Page 141

Shortest Path Problem (Unit-Length Edges)
– Shortest path network.
    – Directed graph G = (V, E).
    – Source s, destination t.
    – Length ℓe = length of edge e = 1.
– Shortest path problem: find shortest directed path from s to t.
  (cost of path = number of edges in path)
Can solve this with directed BFS!
Ex: Cost of path s-7-t = 1 + 1 = 2
The University of Sydney Page 142

Shortest Path Problem (First try)
– Shortest path network.
    – Directed graph G = (V, E).
    – Source s, destination t.
    – Length ℓe = length of edge e.
– Algo:
    – Subdivide each edge e into ℓe edges of unit length.
    – Run directed BFS.
The University of Sydney Page 143

Shortest Path Problem (First try)
– Shortest path network.
    – Directed graph G = (V, E).
    – Source s, destination t.
    – Length ℓe = length of edge e.
– Algo:
    – Subdivide each edge e into ℓe edges of unit length.
    – Run directed BFS.
– Exercise: Shortest path in subdivided graph ↔ Shortest path in original graph.
Input size = O(n + m log L).
Issue: Running time = O(L(m + n)), where L is the max edge length.
The University of Sydney Page 144

Dijkstra's Algorithm
– Compute shortest path distances to every vertex.
– Dijkstra's algorithm.
    – Maintain a set of explored nodes S for which we have determined the shortest path distance d(u) from s to u.
    – Initialize S = {s}, d(s) = 0.
    – Repeatedly choose the unexplored node v which minimizes
          p(v) = min { d(u) + ℓe : e = (u, v), u ∈ S }
      (the shortest path to some u in the explored part, followed by a single edge (u, v)),
      add v to S, and set d(v) = p(v).
The University of Sydney Pages 145–146

Dijkstra's Shortest Path Algorithm
In each iteration, choose v not in S that minimizes
    p(v) = min { d(u) + ℓe : e = (u, v), u ∈ S },
add v to S, and set d(v) = p(v).
[Figures, pages 147–160: step-by-step run of Dijkstra's algorithm on the example graph from page 138; tentative distances start at ∞ and are repeatedly tightened as vertices are added to S.]

Dijkstra's Algorithm: Proof of Correctness
– Invariant: For each node u ∈ S, d(u) is the length of the shortest s-u path.
– Proof: (by induction on |S|)
  Base case: |S| = 1 is trivial.
  Inductive hypothesis: Assume true for |S| = k ≥ 1.
– Let v be next node added to S, and let (u,v) be the chosen edge.
– Claim The shortest s-u path plus (u, v) is an s-v path of length p(v).
– Pf Consider any s-v path P. We’ll show that it’s no shorter than p(v).
– Let x-y be the first edge in P that leaves S, and let P’ be the subpath to x.
– P is already too long as soon as it leaves S:
      ℓ(P) ≥ ℓ(P') + ℓ(x,y)       (edge lengths are non-negative)
            ≥ d(x) + ℓ(x,y)        (inductive hypothesis)
            ≥ p(y)                 (defn of p(y))
            ≥ p(v)                 (Dijkstra chose v instead of y)
[Figure: the path P leaves the explored set S along its first such edge (x, y); P' is the subpath of P from s to x; v is reached from u inside S.]
The University of Sydney
Page 161

Dijkstra’s Algorithm: Implementation
– For each unexplored node v, explicitly maintain p(v) = min { d(u) + ℓe : e = (u, v), u ∈ S }.
– Next node to explore = node with minimum p(v).
– When exploring v, for each incident edge e = (v, w), update p(w) = min { p(w), p(v) + ℓe }.
– Efficient implementation. Maintain a priority queue of unexplored nodes, prioritized by p(v).
PQ Operation   # calls in Dijkstra   Array     Binary heap   d-way heap      Fib heap †
Insert         n                     n         log n         d log_d n       1
ExtractMin     n                     n         log n         d log_d n       log n
ChangeKey      m                     1         log n         log_d n         1
IsEmpty        n                     1         1             1               1
Total                                n^2       m log n       m log_{m/n} n   m + n log n

† Individual ops are amortized bounds
The University of Sydney Page 162
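For concreteness, here is a short Python sketch of Dijkstra's algorithm using the standard-library heap as the priority queue (a lazy-deletion variant, since heapq has no ChangeKey operation; the adjacency-list format and names are my own):

import heapq

def dijkstra(graph, s):
    # graph: dict mapping each vertex to a list of (neighbour, edge_length) pairs
    # Returns a dict of shortest-path distances d(u) from s.
    d = {}
    pq = [(0, s)]                      # (tentative distance p(v), vertex v)
    while pq:
        dist, v = heapq.heappop(pq)
        if v in d:                     # stale entry: v already explored
            continue
        d[v] = dist                    # add v to the explored set S
        for w, length in graph.get(v, []):
            if w not in d:
                heapq.heappush(pq, (dist + length, w))
    return d

Instead of ChangeKey, this variant pushes a new entry whenever a shorter tentative distance is found and discards stale entries on extraction; with non-negative edge lengths it computes the same distances in O(m log n) time.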

Shortest Path
The shortest path between two vertices in a graph G with n vertices and m edges can be computed in O(m + n log n) time.
The University of Sydney Page 182

Summary: Greedy algorithms
A greedy algorithm is an algorithm that follows the problem solving heuristic of making the locally optimal choice at each stage with the hope of finding a global optimum.
Problems
– Interval scheduling/partitioning
– Scheduling: minimize lateness
– Shortest path in graphs (Dijkstra’s algorithm) – Minimum spanning tree (Prim’s algorithm) –…
The University of Sydney
Page 183

This Week
– Tutorial 3:
– Posted tonight
– Quiz 2:
– Posted tonight
– Due Wednesday 20 March 23:59:00
    • One try
    • 20 minutes from the time the quiz is opened
    • No late submissions accepted
– Assignment 1:
– Due Wednesday 20 March 23:59:00
– No submissions later than Wednesday 27 March 23:59:00 accepted
– Assignment 2:
– Posted Wednesday 21 March
– Due Wednesday 10 April 23:59:00
The University of Sydney Page 184