
Algorithms and Data
17: Mergeable Heaps
Professor Kevin Gold


Mergeable Heaps
The binary heaps we saw in covering greedy algorithms are very nice except when it comes to merging them: an O(N) operation.
We'll see two heaps today that improve this time to O(lg N) and O(1), respectively: binomial heaps and Fibonacci heaps.
Fibonacci heaps have the best amortized time, constant for many operations, but are challenging to implement.




Binomial Trees
Binomial trees are defined recursively:
Binomial tree B0 is a tree with just one node.
Binomial tree Bk consists of two binomial trees Bk-1, with the root of one added as a child to the root of the other.
The name comes from the number of nodes at depth i of tree Bk: it's (k choose i).
[Figure: the binomial trees B0, B1, B2, B3]

Properties of Binomial Trees
Bk has 2^k nodes.
Clear by induction: Bk-1 has 2^(k-1) nodes and we combine 2 of them.
The height of Bk is k.
Also clear by induction, because each Bk hangs lower by one than the last.
The root has the most children, and its children's subtrees are Bk-1 … B0.
Also straightforward by induction: at each step a Bk-1 gets added.
Max degree of any node is lg n.
The root has max degree, and if its children are Bk-1 … B0 then there are at most lg n of them.
The first property means there are at most floor(lg n) + 1 trees in the heap, since each Bk contributes 2^k nodes for a different k.
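The recursive definition translates directly into code. Below is a minimal Python sketch (the names BinomialNode and make_b_k are illustrative, not from the lecture) that builds a Bk and checks the first two properties:

    # Minimal sketch of the recursive structure of binomial trees.
    class BinomialNode:
        def __init__(self, key):
            self.key = key
            self.children = []   # for a B_k root: roots of B_{k-1}, ..., B_0

    def make_b_k(k):
        # B_0 is a single node; B_k is two copies of B_{k-1},
        # with the root of one hung under the root of the other.
        if k == 0:
            return BinomialNode(0)   # keys are dummies; only structure matters here
        a, b = make_b_k(k - 1), make_b_k(k - 1)
        a.children.append(b)
        return a

    def size(t):
        return 1 + sum(size(c) for c in t.children)

    # Checking the first two properties: 2^k nodes, and the root has k children.
    b3 = make_b_k(3)
    assert size(b3) == 2 ** 3 and len(b3.children) == 3
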
Binomial Heaps
A binomial heap is actually a collection of binomial trees, with no more than one binomial tree Bk per value of k.
Each binomial tree is a heap obeying the "min-heap" property from earlier in the course: parents no greater than children.
The trees that exist correspond to the 1's in the binary representation of n: 5 nodes => 0101 => B2 and B0 are in the heap.
[Figure: a 5-node example heap consisting of a B0 and a B2]

Finding the Minimum in the Binomial Heap
All the binomial trees obey the heap property, so to find the min, just take the min of all of the roots.
Since there are floor(lg n) + 1 binomial trees in the heap, that's how many operations this takes.
Removing this node is a different story; we'll get to that.


Unioning Two Binomial Heaps
First, "merge" the two lists of trees to create a single list of trees, with at most two trees Bk for a given k.
Next, step through this list of trees and perform "addition" of the trees for each k:
If there is a single Bk for a given k, leave it alone.
If there are two Bk for a given k, make the tree with the smaller root the parent of the other, forming a Bk+1 tree.
If there are three Bk (only possible if we "carried" from the previous k), leave one alone and combine the other two to make a Bk+1 tree.
[Figure: (B0, B1) + (B0, B1) = (B1, B2)]
This is O(lg N) since that's the number of roots.
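A rough, self-contained Python sketch of this union (the names Node, link, union, and find_min are mine, and a dictionary keyed by degree stands in for the slide's left-to-right scan, but the carry behavior is the same):

    class Node:
        def __init__(self, key):
            self.key = key
            self.degree = 0      # k for a B_k root
            self.children = []

    def link(a, b):
        # B_k + B_k -> B_{k+1}: the smaller root becomes the parent (min-heap order).
        if b.key < a.key:
            a, b = b, a
        a.children.append(b)
        a.degree += 1
        return a

    def union(roots1, roots2):
        # "Binary addition" of two root lists: whenever two trees of the same
        # degree meet (possibly via a carry), link them and retry one degree up.
        by_degree = {}
        for tree in roots1 + roots2:
            d = tree.degree
            while d in by_degree:                  # collision: carry into degree d+1
                tree = link(by_degree.pop(d), tree)
                d = tree.degree
            by_degree[d] = tree
        return [by_degree[d] for d in sorted(by_degree)]   # at most floor(lg n)+1 roots

    def find_min(roots):
        # The heap minimum is the smallest root.
        return min(roots, key=lambda r: r.key)

    # Insert (next slide) is just union with a one-node heap:
    heap = []
    for x in [5, 3, 7, 1, 10]:
        heap = union(heap, [Node(x)])
    assert find_min(heap).key == 1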


Insertion is Union With a Single-Node Heap
To insert a node into a binomial heap, create a new heap containing just that element and union, so worst case O(lg n).
[Figure: an example insertion; combine 2 and 7, then the 2 root with the 3 root]

Amortized Cost of Insert is O(1), not O(lg N)
We don't actually do lg N work every time we insert a node; that's only if we need to "carry" on every tree.
Let the potential function Φ be the number of trees t in the binomial heap, and let m be the number of trees merged during the insert.
The actual cost c of an insert is 1 + m (merge until finally inserting a tree with no partner).
The potential change Φ_after − Φ_before is 1 − m (1 for the tree we add, −m for the merges).
Amortized cost: c + Φ_after − Φ_before = (1 + m) + (1 − m) = 2.
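A tiny empirical check of this claim, using the binary-counter view of the heap (this models only which tree degrees are present, with illustrative names, not a full heap):

    def total_merges(n):
        # Insert n nodes one at a time; `degrees` holds the degrees of the
        # current trees, and every collision at degree k is one merge ("carry").
        merges, degrees = 0, set()
        for _ in range(n):
            k = 0
            while k in degrees:
                degrees.remove(k)
                merges += 1
                k += 1
            degrees.add(k)
        return merges

    n = 1000
    assert total_merges(n) < n   # so total work over n inserts is at most about 2n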


Extract-Min is Union of the Min's Children With the Heap
Suppose we remove the min of the heap. How do we fix the now-rootless binomial tree it was the root of?
Recall all the children of Bk's root are themselves binomial trees Bk-1 … B0.
Treat these children as a heap unto themselves and union with the rest of the heap.
Because it's a union, O(lg N).
[Figure: extracting the min 1; the removed root's children form smaller binomial trees that are unioned back into the rest of the heap]


Decrease-Key and Delete in Binomial Heaps
To decrease a key (if we're using these for Dijkstra's algorithm, for example), just change the key and heapify-up as with normal heaps.
To delete a node, decrease-key to -infinity to bring it to the surface, then extract-min.
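A short sketch of that heapify-up, assuming each node stores a parent pointer (the names TreeNode and decrease_key are illustrative; for Dijkstra's you would also have to swap any satellite data or update a handle table, which is omitted here):

    class TreeNode:
        def __init__(self, key, parent=None):
            self.key = key
            self.parent = parent
            self.children = []

    def decrease_key(node, new_key):
        assert new_key <= node.key, "decrease-key must not increase the key"
        node.key = new_key
        # Bubble the key up while it violates the min-heap property.  Swapping
        # keys keeps the tree shape (and hence the binomial structure) intact.
        while node.parent is not None and node.key < node.parent.key:
            node.key, node.parent.key = node.parent.key, node.key
            node = node.parent

    root = TreeNode(2)
    child = TreeNode(9, parent=root); root.children.append(child)
    decrease_key(child, 1)
    assert root.key == 1 and child.key == 2
    # delete(node) is then decrease_key(node, -infinity) followed by extract-min.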

Sample Application: Different MST Algorithm
BinomialHeapMST(G):
    Initialize sets of vertices Vi to individual vertices vi
    For each vi
        For each edge eij incident to vi
            Add this edge to binomial heap Ei
    While there is more than one set Vi
        Choose an arbitrary Vi
        eij = ExtractMin(Ei)
        if i and j are in different vertex sets Vi and Vj
            Add eij to MST T
            Merge Vj into Vi
            Merge Ej into Ei
    Return T
Slide annotations: the work on different Vi could be done in parallel (if we have synchronization for the checks and merges), and merging Ej into Ei is the step that needs a mergeable heap to be efficient.

Fibonacci Heaps
Fibonacci heaps are mergeable heaps that are faster in amortized time than binary heaps.
O(|E| + |V| log |V|) time is achievable for Dijkstra's.
The high constants involved and the difficulty of programming them make them a little impractical: "predominantly of theoretical interest," says CLRS.
But they are an example of clever amortized design and achieve the best asymptotic running times.
Overall idea: "messy" binomial heaps that don't get cleaned up until they have to be.
"Fibonacci heap" because the minimum size sk of a tree with root degree k scales like the Fibonacci numbers: sk ≥ Fk+2.

Structure of a Fibonacci Heap
Before deletions, Fibonacci heaps are like binomial heaps, but the binomial trees are unordered: children of the root are Bk-1 … B0 in any order.
They still obey the min-heap property: smaller elements are parents of larger ones.
With deletions, the trees are not even necessarily binomial trees, and particular nodes are marked (purple in the slide figure) to indicate a child was lost there.
Siblings are linked to each other in doubly-linked lists, and a "min" pointer points to the overall min.
[Figure: a Fibonacci heap; the min pointer points into the root list, the linked lists wrap around, tree sizes may be temporarily repeated, and marked nodes are shown in purple]

Inserting a Node Into a Fibonacci Heap
Unlike with binomial heaps, the Fibonacci heap is lazy, leaving cleanup for extract-min.
Add a node to the root list, and if it's smaller than the min, make min point to it. Clearly O(1).
[Figure: inserting 1; it joins the root list and becomes the new min]

Unioning the Fibonacci Heap
We remain lazy: combine the root lists, and make min point to the smaller of the two mins. That's it.
Again O(1), because these are just pointer manipulations.
[Figure: two heaps unioned by splicing their root lists together; min points to the smaller of the two old mins]
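A rough sketch of these two lazy operations (the names FibNode, FibHeap, roots, and min_node are mine; a real implementation splices circular doubly-linked lists in O(1), and the Python list here is only a stand-in for the root list):

    class FibNode:
        def __init__(self, key):
            self.key = key
            self.degree = 0
            self.children = []
            self.parent = None
            self.marked = False

    class FibHeap:
        def __init__(self):
            self.roots = []        # stand-in for the circular root list
            self.min_node = None

        def insert(self, key):
            # O(1): just add to the root list and update the min pointer.
            node = FibNode(key)
            self.roots.append(node)
            if self.min_node is None or key < self.min_node.key:
                self.min_node = node
            return node

        def union(self, other):
            # O(1) with a real linked-list splice; list concatenation stands in.
            self.roots.extend(other.roots)
            if other.min_node is not None and (
                    self.min_node is None or other.min_node.key < self.min_node.key):
                self.min_node = other.min_node
            return self

    h1, h2 = FibHeap(), FibHeap()
    for x in (3, 6, 10):
        h1.insert(x)
    h2.insert(2)
    assert h1.union(h2).min_node.key == 2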

Extracting the Min from a Fibonacci Heap
As with binomial heaps, lop off the min root and add its children to the root list.
But this time, we're not lazy: consolidate roots with equal degree (# children), as in binomial tree union. Keep an array of roots indexed by degree; while there is a collision, merge (smaller root becomes the parent of the larger) and try again at the next degree, until every root has a distinct degree.
[Figure: the min root is removed; its children join the root list and the min pointer is temporarily unset (min=?)]

Extracting the Min from a Fibonacci Heap (consolidation, step by step)
[Figure sequence, one slide per step: the degree array A (here indexed 0 through 4) holds references to roots, not values. Each root is filed into A under its degree; whenever a slot is already occupied, the two roots are linked, with the smaller key becoming the parent, and the result is retried at the next degree. The steps continue until every root has a distinct degree, at which point we are done consolidating.]

Extracting the Min from a Fibonacci Heap
When consolidation is done, iterate through the roots looking for the min.
[Figure: the consolidated heap with the min pointer set to the smallest root]
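A sketch of the consolidation step in Python, written as a standalone function on a root list (the node fields key/degree/children/marked and the name consolidate are the same illustrative names as in the earlier sketches, redefined here so the snippet stands alone):

    class FibNode:
        def __init__(self, key):
            self.key, self.degree = key, 0
            self.children, self.marked = [], False

    def consolidate(roots):
        # Degree-indexed table of root references; collisions get linked
        # (smaller key on top) and retried at the next degree.
        by_degree = {}
        for node in roots:
            while node.degree in by_degree:
                other = by_degree.pop(node.degree)
                if other.key < node.key:
                    node, other = other, node
                node.children.append(other)
                other.marked = False   # a linked node loses its mark (see the side note below)
                node.degree += 1
            by_degree[node.degree] = node
        new_roots = list(by_degree.values())
        return new_roots, min(new_roots, key=lambda r: r.key)

    # After "lopping off" the old min, its children are appended to the root
    # list and consolidate() is run; the returned min becomes the min pointer.
    roots = [FibNode(k) for k in (7, 3, 6, 8, 10, 4, 5)]
    roots, mn = consolidate(roots)
    assert mn.key == 3 and len({r.degree for r in roots}) == len(roots)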

Side Note About Marked Nodes
When nodes are joined in the consolidation step of Extract-Min, if a marked node A becomes a child of a node B, it becomes unmarked.
[Figure: during consolidation, a marked node that is linked under another root loses its mark]

Introducing a Potential Function
We'll use the potential function Φ(H) = t(H) + 2m(H), where t(H) is the number of trees at root level and m(H) is the number of "marked nodes".
We haven't discussed where marked nodes come from yet, but it won't matter for now, since unmarking nodes only decreases the potential, decreasing the amortized cost.
Core ideas: extract-min can't be expensive often, because merging gets rid of trees, and we already count work proportional to this expense with inserts.
Extract-min is exactly the kind of operation that will only occasionally involve a lot of work, suggesting amortized analysis.


Potential Function Analysis of Insert and Union
Φ(H) = t(H) + 2m(H)
We need to ensure our potential function doesn't give strange costs for straightforward operations.
Insert: actual cost O(1), potential change +1 since we gain a tree (no mark change), so amortized cost O(1) + 1 = O(1).
Union: actual cost O(1), potential change 0, because the number of trees and marks doesn't change from the individual heaps.

Potential Function Analysis of Extract-Min
Φ(H) = t(H) + 2m(H)
Let D(n) be the maximum degree of any root for a heap of size n.
Actual cost is O(D(n) + t(H)): one merge per original tree, plus each child of the extracted min.
Original potential: t(H) + 2m(H).
New potential is at most D(n) + 1 + 2m(H): no changes in marks (at worst), and at most D(n) + 1 trees, since after consolidation no two trees have the same degree and D(n) is by definition the maximum.
So the amortized cost is O(D(n) + t(H)) + D(n) + 1 + 2m(H) − (t(H) + 2m(H)) = O(D(n)). We'll see this is actually O(log n).
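Collecting these terms on one line (the m(H) terms cancel, and the potential units are scaled so the t(H) drop pays for the per-tree part of the actual cost):

    \hat{c} \;\le\; O\big(D(n) + t(H)\big) \;+\; \underbrace{\big(D(n) + 1 + 2m(H)\big)}_{\Phi_{\text{after}}} \;-\; \underbrace{\big(t(H) + 2m(H)\big)}_{\Phi_{\text{before}}} \;=\; O\big(D(n)\big)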




Decrease-Key in a Fibonacci Heap
Given a pointer to a node in a Fibonacci heap to decrease, decrease the key and check whether the key is now smaller than the parent's key.
If it is, cut the node (move it to root level) and unmark it. If it's smaller than the heap min, make it the min.
If the parent is not marked, mark it and we're done.
If the parent is already marked, cut and unmark the parent and recur to its parent, doing the same thing; recur up the chain until there's no need to cut a parent.
[Figures: two examples, decreasing 12 to 2 and then 14 to 4, showing nodes being cut to the root list and parents being marked]
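A sketch of decrease-key with its cascading cuts (same illustrative FibNode/FibHeap fields as in the earlier sketches, redefined so the snippet stands alone):

    class FibNode:
        def __init__(self, key):
            self.key, self.degree = key, 0
            self.parent, self.children, self.marked = None, [], False

    class FibHeap:
        def __init__(self):
            self.roots, self.min_node = [], None

    def _cut(heap, node):
        # Move `node` from under its parent to the root list and unmark it.
        parent = node.parent
        parent.children.remove(node)
        parent.degree -= 1
        node.parent, node.marked = None, False
        heap.roots.append(node)
        return parent

    def decrease_key(heap, node, new_key):
        assert new_key <= node.key, "the key can only decrease"
        node.key = new_key
        parent = node.parent
        if parent is not None and node.key < parent.key:
            parent = _cut(heap, node)          # first cut: the decreased node itself
            while parent.parent is not None:   # cascade up the chain of marked ancestors
                if not parent.marked:
                    parent.marked = True       # first child lost here: just mark and stop
                    break
                parent = _cut(heap, parent)    # already marked: cut it too and keep going
        if node.key < heap.min_node.key:
            heap.min_node = node

    # Hypothetical usage: a root 2 with child 5, which has child 9.
    h = FibHeap()
    a, b, c = FibNode(2), FibNode(5), FibNode(9)
    a.children.append(b); b.parent = a; a.degree = 1
    b.children.append(c); c.parent = b; b.degree = 1
    h.roots, h.min_node = [a], a
    decrease_key(h, c, 1)      # 1 < 5: c is cut to the root list and b gets marked
    assert h.min_node.key == 1 and b.marked and c in h.roots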

Amortized Cost of Decrease-Key
The actual cost is the number of cuts that need to be made going up the tree, O(c), plus a small constant.
The change in potential is a gain of c trees from the cuts and a loss of at least c − 2 marks.
(The first cut node may not have been marked, and the final uncut parent may become marked.)
The amortized cost is therefore O(c) + c − 2(c − 2) = O(1).
(Units of potential can be scaled up to equal big-O constants.)


Deleting a Node
Deleting a node consists of decreasing the key to be lower than all other nodes, then calling Extract-Min.
This is O(1) amortized time plus O(D(n)) = O(lg n) amortized time, so O(lg n) overall.


Analysis of the Maximum Degree (Part I)
We showed Extract-Min has a running time of O(D(n)), the maximum degree (# children) in the heap; we'll now show this is O(lg n).
Proposition: For any node x in a Fibonacci heap, let y1 … yk be its children in the order they were added. Then degree(y1) ≥ 0 and degree(yi) ≥ i − 2 for i ≥ 2.
Proof: degree(y1) ≥ 0 is clear. yi was linked to x only if degree(yi) == degree(x) at the time, and degree(x) was at least i − 1 then (y1 … yi-1 were already children). Since then, yi has lost at most one child, or it would have been cut. So degree(yi) ≥ i − 2.

A Diversion into Fibonacci Numbers
Let Fk be the kth Fibonacci number (F0 = 0, F1 = 1, Fk = Fk-1 + Fk-2). We can show via induction that Fk+2 = 1 + Σ_{i=0..k} Fi.
We can also prove via induction that Fk+2 ≥ φ^k, where φ = (1 + √5)/2, the Golden Ratio.
We'll show that the size of the subtree rooted at x is at least Fk+2 ≥ φ^k, where k is the degree of x.
If n ≥ size(x) ≥ φ^k, then taking log_φ of both sides gives k ≤ log_φ n, and D(n) is O(log n) as required.


Analysis of the Maximum Degree (Part II)
Call the minimum size of a tree with root degree k, sk.
sk = 2 + Σ_{i=2..k} s_degree(yi)   (2 for the root and first child)
   ≥ 2 + Σ_{i=2..k} s_{i-2}        (degree(yi) ≥ i − 2)
Prove by induction that sk ≥ Fk+2. Clearly true for the base cases: s0 = 1 ≥ F2 = 1 and s1 = 2 ≥ F3 = 2.
Assume true for i = 0 … k − 1 and plug into the inequality above:
sk ≥ 2 + Σ_{i=2..k} s_{i-2}
   ≥ 2 + Σ_{i=2..k} Fi         (s_{i-2} ≥ Fi by the inductive hypothesis)
   = 1 + Σ_{i=0..k} Fi         (F0 + F1 = 1)
   = Fk+2 by the formula on the last slide, proving the induction.
So n ≥ sk ≥ Fk+2 ≥ φ^k, and k ≤ log_φ n, proving D(n) = O(log n).



Application of Fibonacci Heaps in Dijkstra's Algorithm
Recall that Dijkstra's algorithm performs:
O(|E|) DecreaseKey operations (each log |V| time in a normal min-heap)
O(|V|) ExtractMin operations (each log |V| time in a normal min-heap)
With a normal min-heap: O(|E| log |V| + |V| log |V|) = O(|E| log |V|).
But here DecreaseKey is O(1) amortized and ExtractMin is O(log |V|) amortized.
So Dijkstra's algorithm is O(|E| + |V| log |V|) with a Fibonacci heap.
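A brief Python sketch of where these operation counts come from. Python's standard library has no decrease-key heap, so this uses heapq with the usual lazy decrease-key trick (push a new entry and skip stale ones); it shows the structure of the algorithm but runs in O(|E| log |V|), not the Fibonacci heap bound:

    import heapq

    def dijkstra(adj, source):
        # adj: {u: [(v, weight), ...]}.  One ExtractMin per vertex and one
        # (lazy) DecreaseKey per edge relaxation, matching the counts above.
        dist = {u: float("inf") for u in adj}
        dist[source] = 0
        pq = [(0, source)]
        while pq:
            d, u = heapq.heappop(pq)             # ExtractMin (O(|V|) of these)
            if d > dist[u]:
                continue                         # stale entry left by a "decrease"
            for v, w in adj[u]:
                if d + w < dist[v]:
                    dist[v] = d + w
                    heapq.heappush(pq, (dist[v], v))   # stands in for DecreaseKey
        return dist

    graph = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}
    assert dijkstra(graph, "a") == {"a": 0, "b": 2, "c": 3}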


Application of Fibonacci Heaps in Prim's Algorithm
Similar to Dijkstra's algorithm, Prim's performs O(|E|) DecreaseKey's and O(|V|) ExtractMin's.
This is then O(|E| + |V| log |V|) with a Fibonacci heap.



But are Fibonacci Heaps Practical?
Conventional wisdom is that the constants in Fibonacci heap implementations tend to be too large to make them actually faster than binary heaps for typical values of N.
But constants and "typical values of N" are moving targets.
Who is to say what optimizations people will find to bring down the practical running time?
Who is to say what size of data you will need to process?
At least now you know what you're getting yourself into if you implement one, or what you're "buying" if you use someone else's implementation.




Summary: Mergeable Heaps
Binomial heaps have a few things in common with Fibonacci heaps:
Collections of binomial trees
Min-heap property in each tree
A merge/consolidate operation that combines similar trees
Efficient union: O(log n) for binomial, O(1) for Fibonacci, compared to O(n) time for merging binary heaps.
While binomial heaps keep things neat all the time, Fibonacci heaps leave things messy and only "clean up" on ExtractMin: good if the insertions really outnumber the extractions.
Fibonacci heaps have great amortized asymptotic time, but be careful about using them in practice.