Lecture Note 05 EECS 4101/5101 Instructor: Andy Mirzaian

SKEW HEAPS: Self-Adjusting Heaps
In this handout we describe the skew heap data structure, a self-adjusting form of heap related to the leftist heap of Crane and Knuth. Skew heaps, in contrast to leftist heaps, use less space and are easier to implement, and yet in the amortized sense they are as efficient, to within a constant factor, as leftist heaps.
We will consider the following priority queue operations:
makeheap(h): Create a new, empty heap, named h.
findmin(h): Return a minimum key in h. If h is empty, return the special key “null”. (This operation does not change the set of keys stored in h.)
insert(k, h): Insert key k in heap h. (Duplicate keys are allowed in the heap; that is, the key to be inserted is treated as “new”.)
deletemin(h): Delete a minimum key from heap h and return it. If the heap is initially empty, return null.
union(h1, h2): Return the heap formed by joining (i.e., taking the union of) heaps h1 and h2. This operation destroys h1 and h2.
THE DATA STRUCTURE
There are several ways to implement heaps in a self-adjusting fashion. The one we shall discuss, proposed by Sleator and Tarjan, is called the skew heap and is analogous to the leftist heap. A skew heap is a heap-ordered binary tree. That is, it is a binary tree with one key in each node such that, for each node x other than the root, the key at node x is no less than the key at the parent of x. To represent such a tree we store in each node x its associated key, denoted key(x), and two pointers, left(x) and right(x), to its left child and right child, respectively. If x has no left child we define left(x) = Λ; if x has no right child we define right(x) = Λ. Access to the tree is by a pointer to its root; we represent an empty tree by a pointer to Λ.
With this representation we can carry out the various heap operations as follows. We perform makeheap(h) in O(1) time by initializing h to Λ. Since heap order implies that the root holds a minimum key in the tree, we can carry out findmin(h) in O(1) time by returning the key at the root (or null if the heap is empty). We perform insert and deletemin using union. To carry out insert(k, h), we make k into a one-node heap and Union it with h. To carry out deletemin(h), if h is not empty we replace h by the Union of its left and right subtrees and return the key at the original root. (If h is originally empty we return null.)
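As a concrete illustration, the representation and the two O(1) operations can be sketched as follows. (This is an illustrative Python sketch; the name Node and the use of None for the null pointer Λ are assumptions, not part of the handout.)

```python
# Illustrative sketch of the node representation; names are assumptions,
# not part of the handout's notation.
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key      # key(x)
        self.left = left    # left(x); None plays the role of the null pointer
        self.right = right  # right(x)

def makeheap():
    # An empty heap is represented by the null pointer (here, None); O(1) time.
    return None

def findmin(h):
    # Heap order places a minimum key at the root; O(1) time.
    # Returns None (playing the role of "null") on an empty heap.
    return None if h is None else h.key
```

insert and deletemin are deferred until union is defined, exactly as in the text.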
To perform union( h1 , h2), we form a single tree by traversing down the right paths of h1 and h2, merging them into a single right path with keys in nondecreasing order. First assume the left subtrees of nodes along the merge path do not change. (See Figure 1(a).) The time for the Union operation is bounded by a constant times the length of the merge path. To make Union efficient, we must keep right paths short. In leftist heaps this is done

by maintaining the invariant that, for any node x, the right path descending from x is a shortest path down to a missing node. Maintaining this invariant requires storing at every node the length of a shortest path down to a missing node; after the merge we walk back up the merge path, updating the shortest path lengths and swapping left and right children as necessary to maintain the leftist property. The length of the right path in a leftist heap of n nodes is at most log n, implying an O(log n) worst-case time bound for each of the heap operations, where n is the number of nodes in the heap or heaps involved.
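The leftist-heap bookkeeping just described can be sketched as follows. (Illustrative Python, not from the handout; the field name dist for the stored shortest-path length is an assumption.)

```python
# Sketch (not from the handout) of a leftist heap node carrying a "dist"
# field: the length of the shortest path from the node down to a missing node.
class LNode:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None
        self.dist = 1   # a single node reaches a missing child in one step

def dist(x):
    return 0 if x is None else x.dist

def lunion(h1, h2):
    if h1 is None:
        return h2
    if h2 is None:
        return h1
    if h2.key < h1.key:          # keep the smaller root on top
        h1, h2 = h2, h1
    h1.right = lunion(h1.right, h2)
    # Restore the leftist invariant on the way back up: the right path must
    # be a shortest path to a missing node, so the child with the larger
    # dist goes on the left.
    if dist(h1.left) < dist(h1.right):
        h1.left, h1.right = h1.right, h1.left
    h1.dist = dist(h1.right) + 1
    return h1
```

Note that, unlike the skew heap introduced below, this version must store and update dist at every node.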
Figure 1: A Union of two skew heaps. (a) Merge of the right paths. (b) Swapping of children along the path formed by the merge.
In our self-adjusting version of this data structure, we perform the Union operation by merging the right paths of the two trees and then swapping the left and right children of every node on the merge path except the lowest. (See Figure 1(b).) This makes the potentially long right path formed by the merge into a left path. We call the resulting data structure a skew heap.

Exercise 1: Write a simple recursive algorithm for the Union operation on skew heaps.
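One possible recursive sketch is given below. (Illustrative Python, not the handout’s own code. This common recursive variant swaps children at every node visited during the merge rather than at “all but the lowest” node on the merge path; it yields the same amortized bounds.)

```python
# Sketch of a recursive skew-heap union, with insert and deletemin built on
# top of it as described in the text. Names are illustrative assumptions.
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def union(h1, h2):
    if h1 is None:
        return h2
    if h2 is None:
        return h1
    if h2.key < h1.key:          # keep the smaller root on top
        h1, h2 = h2, h1
    # Merge h2 into h1's right subtree, then swap children (the skew step).
    h1.right = union(h1.right, h2)
    h1.left, h1.right = h1.right, h1.left
    return h1

def insert(k, h):
    # Make k into a one-node heap and Union it with h.
    return union(Node(k), h)

def deletemin(h):
    # Returns (minimum key, remaining heap); (None, None) on an empty heap.
    if h is None:
        return None, None
    return h.key, union(h.left, h.right)
```

Repeatedly calling deletemin extracts the keys in nondecreasing order, which is a convenient correctness check.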
THE AMORTIZED ANALYSIS
In our analysis of skew heaps we shall use the following general approach. We associate with each collection S of skew heaps a real number Φ(S) called the potential of S. For any sequence of m operations with running times t1, t2, …, tm, we define the amortized time ai of the ith operation to be ai = ti + Φi − Φi−1, where Φi, for i = 1, 2, …, m, is the potential of the skew heaps after the ith operation and Φ0 is the potential of the skew heaps before the first operation. The total running time of the sequence of operations is then

    Σ_{i=1}^{m} ti = Σ_{i=1}^{m} ( ai − Φi + Φi−1 ) = Φ0 − Φm + Σ_{i=1}^{m} ai
That is, the total running time equals the total amortized time plus the net decrease in poten- tial from the initial to the final collection of the skew heaps. In our analysis of skew heaps the potential will be zero initially and will remain nonnegative. This implies that the total amortized time is an upper bound on the actual running time over the entire sequence of operations.
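The telescoping identity above can be illustrated numerically. (The running times and potentials below are made-up values, not measurements of any heap.)

```python
# Numeric illustration of the telescoping sum: for arbitrary t_i and Phi_i,
# the total actual time equals Phi_0 - Phi_m plus the total amortized time.
import random

random.seed(1)
m = 10
t = [random.randint(1, 20) for _ in range(m)]          # actual times t_1..t_m
phi = [0] + [random.randint(0, 15) for _ in range(m)]  # Phi_0 = 0, then Phi_1..Phi_m
a = [t[i] + phi[i + 1] - phi[i] for i in range(m)]     # a_i = t_i + Phi_i - Phi_{i-1}

assert sum(t) == phi[0] - phi[m] + sum(a)
```

Since Φ0 = 0 and the potential stays nonnegative, sum(a) upper-bounds sum(t), exactly as the text argues.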
We shall define the potential of a single skew heap; the potential of a collection of skew heaps is the sum of the potentials of its members. Intuitively, a heap with high potential is one subject to unusually time-consuming operations; the extra time spent corresponds to a drop in potential.
Definition 1: For any node x in a binary tree, we define the weight of x, denoted wt(x), to be the number of descendents of x, including x itself.
We use weights to partition nonroot nodes into two classes:
Definition 2: A nonroot node x is heavy if wt(x) > wt( parent(x) ) / 2 and light otherwise. A root node is considered neither heavy nor light.
The following facts immediately follow from the above definitions. (Try to prove them yourself.)
Fact 1: Of the children of any node, at most one is heavy.
Fact 2: On any path from a node x down to a descendent y, there are at most ⌊log( wt(x) / wt(y) )⌋ light nodes, not counting x. In particular, any path in an n-node tree contains at most ⌊log n⌋ light nodes.
Definition 3: A nonroot node is called right if it is the right child of its parent; it is called left otherwise.
Definition 4: We define the potential of a skew heap to be the total number of right heavy nodes it contains.
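Definitions 1, 2, and 4 can be transcribed directly into code. (Illustrative Python sketch, not from the handout; the function names are assumptions.)

```python
# Sketch computing wt(x), the heavy/light classification (Definition 2), and
# the potential = number of right heavy nodes (Definition 4).
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def wt(x):
    # Definition 1: number of descendents of x, including x itself.
    return 0 if x is None else 1 + wt(x.left) + wt(x.right)

def heavy(x, parent):
    # Definition 2; a root is neither heavy nor light.
    return parent is not None and wt(x) > wt(parent) / 2

def potential(h, parent=None, is_right=False):
    # Definition 4: count the right heavy nodes in the tree rooted at h.
    if h is None:
        return 0
    here = 1 if (is_right and heavy(h, parent)) else 0
    return here + potential(h.left, h, False) + potential(h.right, h, True)
```

For example, a right chain of three nodes has potential 1: the middle node is a right child with weight 2 > 3/2, while the bottom node’s weight 1 does not exceed 2/2.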
Suppose we begin with no heaps and carry out a sequence of skew heap operations. The initial potential is zero and the final potential is nonnegative, so the total amortized time is an upper bound on the total actual time of the operations. The amortized time of a makeheap or findmin operation is O(1), since these operations require O(1) time and do not

change the potential. The amortized times of the other operations depend on the amortized time of the union operation.
Consider a Union of two heaps h1 and h2, containing n1 and n2 nodes, respectively. Let n = n1 + n2 be the total number of nodes in the two heaps. As a measure of the time for Union, we shall charge one per node on the merge path. Thus the amortized time of Union is the number of nodes on the merge path plus the change in potential. By Fact 2, the number of light nodes on the right paths of h1 and h2 is at most ⌊log n1⌋ and ⌊log n2⌋, respectively. Thus the total number of light nodes on the two paths is at most 2⌊log n⌋ − 1.
Figure 2: Analysis of right heavy nodes in union.
Let k1 and k2 be the number of heavy nodes on the right paths of h1 and h2, respectively, and let k3 be the number of nodes that become right heavy children of nodes on the merge path. (See Figure 2.) By Fact 1, every node counted by k3 corresponds to a light node on the merge path. Thus Fact 2 implies that k3 ≤ ⌊log n⌋.
The number of nodes on the merge path is at most 2 + ⌊log n1⌋ + k1 + ⌊log n2⌋ + k2 ≤ 1 + 2⌊log n⌋ + k1 + k2. (The first “2” counts the roots of h1 and h2.) Therefore the running time of the Union operation is t ≤ 1 + 2⌊log n⌋ + k1 + k2. The change in potential caused by the Union is ∆Φ = k3 − k1 − k2 ≤ ⌊log n⌋ − k1 − k2. Thus the amortized time of the Union, t + ∆Φ, is at most 3⌊log n⌋ + 1.
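The per-operation bound can also be checked empirically. (Illustrative Python, not from the handout. It uses the recursive union variant that swaps children at every visited node, and charges one per node visited during the merge, which never exceeds the merge-path length used in the analysis.)

```python
# Empirical check of the amortized bound t + delta_Phi <= 3*floor(log2 n) + 1
# over a sequence of insertions into a skew heap.
import random

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def union(h1, h2):
    # Returns (merged heap, number of nodes visited on the merge path).
    if h1 is None:
        return h2, 0
    if h2 is None:
        return h1, 0
    if h2.key < h1.key:
        h1, h2 = h2, h1
    h1.right, steps = union(h1.right, h2)
    h1.left, h1.right = h1.right, h1.left   # the skew swap
    return h1, steps + 1

def wt(x):
    return 0 if x is None else 1 + wt(x.left) + wt(x.right)

def potential(h, parent_wt=None, is_right=False):
    # Number of right heavy nodes (Definition 4); slow but simple.
    if h is None:
        return 0
    w = wt(h)
    here = 1 if (is_right and parent_wt is not None and w > parent_wt / 2) else 0
    return here + potential(h.left, w, False) + potential(h.right, w, True)

random.seed(4101)
heap, n = None, 0
for key in [random.randrange(1000) for _ in range(200)]:
    phi_before = potential(heap)
    heap, t = union(heap, Node(key))
    n += 1
    # n.bit_length() - 1 is floor(log2 n) for n >= 1.
    assert t + potential(heap) - phi_before <= 3 * (n.bit_length() - 1) + 1
```

Every insertion in the run satisfies the per-operation amortized bound derived above.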
We summarize the above result in the following theorem.
Theorem: The amortized time of an insert, deletemin, or union skew heap operation is O( log n), where n is the number of keys in the heap or heaps involved in the operation. The amortized time of a makeheap or findmin operation is O(1).
Proof: The analysis above gives the bound for findmin, makeheap, and union; the bound for insert and deletemin follows immediately from that of union.

Bibliography:
To see further improvements to the basic structure described above, the interested reader is referred to:
Sleator, Tarjan, “Self-Adjusting Heaps,” SIAM J. Computing, 15(1), 1986, pp. 52-69.
Bernard Chazelle’s papers cited below describe a novel self-adjusting priority queue that allows a controlled measure of error in order to accelerate performance. He shows many applications, including the fastest deterministic minimum spanning tree algorithm to date:
“The Soft Heap: An Approximate Priority Queue with Optimal Error Rate,” JACM 47(6), Nov. 2000, pp. 1012-1027.
“A Minimum Spanning Tree Algorithm with Inverse-Ackermann Type Complexity,” JACM 47(6), Nov. 2000, pp. 1028-1047.