
Dynamic Tables
Discussion

Discussion
• What about hash tables with closed addressing (e.g., chaining)?
• Same idea, with a slight change to the definition of α (a load-factor-based sketch follows this list)
• Expand/shrink the table based on a threshold on bucket sizes
• Examples:
• Maintain per-bucket/per-table metadata with each insert/delete
• Sample the sizes of randomly selected buckets
• Possible improvement
• Resize each bucket independently
• Example:
Kevin Williams, Joe Foster, Athicha Srivirote, Ahmed Hassan, Joseph Tassarotti, Lewis Tseng, Roberto Palmieri, "On Building Modular and Elastic Data Structures with Bulk Operations", ICDCN'21
• What about hash tables with open addressing?
• No difference for expansion
• Contraction depends on the way we handle deletes
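A minimal Python sketch of the closed-addressing idea above: keep a per-table element count and expand/contract when the load factor α = n/m crosses a threshold. The class name, thresholds, and doubling/halving policy are illustrative assumptions, not taken from the slides.

class ChainedHashTable:
    def __init__(self, capacity=8):
        self.buckets = [[] for _ in range(capacity)]
        self.n = 0                                   # per-table metadata: element count

    def _alpha(self):                                # average bucket size, alpha = n / m
        return self.n / len(self.buckets)

    def _resize(self, new_capacity):
        old_items = [kv for bucket in self.buckets for kv in bucket]
        self.buckets = [[] for _ in range(new_capacity)]
        self.n = 0
        for k, v in old_items:
            self.insert(k, v)

    def insert(self, key, value):
        b = self.buckets[hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(b):
            if k == key:
                b[i] = (key, value)                  # overwrite existing key
                return
        b.append((key, value))
        self.n += 1
        if self._alpha() > 1.0:                      # expand when alpha crosses the threshold
            self._resize(2 * len(self.buckets))

    def delete(self, key):
        b = self.buckets[hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(b):
            if k == key:
                del b[i]
                self.n -= 1
                break
        if len(self.buckets) > 8 and self._alpha() < 0.25:   # contract when sparse
            self._resize(len(self.buckets) // 2)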

Discussion
• What about concurrent dynamic tables?
• Although the expansion cost is amortized, expansion blocks any concurrent operation until it is finished.
• Solutions?
• Amortize the actual resizing, not just its cost
• Allocate the new table eagerly, move data to the new table lazily (a single-threaded sketch of this idea follows this list)
• Key idea: resizing is a structural change, not a semantic change.
• Depends on implementation details
• Example: Yujie Liu, Kunlong Zhang, and Michael Spear, "Dynamic-Sized Nonblocking Hash Tables", PODC'14
• Freeze objects in the old hash table
• A new operation (insert/delete/lookup) on a frozen object x first copies x to the new hash table
• Another resize first completes the previous resize (hopefully, by then most elements will already have been moved)
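A single-threaded Python sketch of "allocate eagerly, migrate lazily": every operation first moves a couple of old buckets, so no single insert pays for the whole resize. This only illustrates spreading out the actual moving; it is not the nonblocking PODC'14 algorithm, and contraction/deletes are omitted. All names and the MIGRATE_PER_OP constant are assumptions for illustration.

class LazyResizeTable:
    MIGRATE_PER_OP = 2              # old buckets moved per operation (illustrative)

    def __init__(self, capacity=8):
        self.table = [[] for _ in range(capacity)]
        self.old = None             # old table still being drained, if any
        self.next_bucket = 0        # next old bucket to migrate
        self.n = 0

    def _start_resize(self):        # allocate the new table eagerly ...
        self.old = self.table
        self.table = [[] for _ in range(2 * len(self.old))]
        self.next_bucket = 0

    def _help_migrate(self):        # ... but move the data lazily
        if self.old is None:
            return
        for _ in range(self.MIGRATE_PER_OP):
            if self.next_bucket == len(self.old):
                self.old = None     # previous resize is now complete
                return
            for k, v in self.old[self.next_bucket]:
                self.table[hash(k) % len(self.table)].append((k, v))
            self.old[self.next_bucket] = []
            self.next_bucket += 1

    def _find(self, key):
        """Return (bucket, index) of key, looking in both tables."""
        tables = (self.table, self.old) if self.old is not None else (self.table,)
        for t in tables:
            b = t[hash(key) % len(t)]
            for i, (k, _) in enumerate(b):
                if k == key:
                    return b, i
        return None, None

    def insert(self, key, value):
        self._help_migrate()
        b, i = self._find(key)
        if b is not None:
            b[i] = (key, value)
            return
        self.table[hash(key) % len(self.table)].append((key, value))
        self.n += 1
        if self.old is None and self.n > len(self.table):
            self._start_resize()

    def lookup(self, key):
        self._help_migrate()
        b, i = self._find(key)
        return b[i][1] if b is not None else None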

CSE 440: Advanced Algorithms
Lecture 8: Fibonacci Heaps (CLRS Ch. 19)

Mergeable Heap
• MAKE-HEAP
• Creates and returns a new heap containing no elements.
• INSERT(H, x)
• Inserts element x, whose key has already been filled in, into heap H.
• MINIMUM(H)
• Returns a pointer to the element in heap H whose key is minimum.
• EXTRACT-MIN(H)
• Deletes the element from heap H whose key is minimum, returning a pointer to the element.
• UNION(H1, H2)
• Creates and returns a new heap that contains all the elements of heaps H1 and H2. Heaps H1 and H2 are "destroyed" by this operation.
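The five operations above can be written down as an interface. Here is a minimal Python sketch (method names are lowercase Python analogues of the slide's names; MAKE-HEAP corresponds to constructing a concrete subclass):

from abc import ABC, abstractmethod

class MergeableHeap(ABC):
    """MAKE-HEAP corresponds to constructing a concrete subclass."""

    @abstractmethod
    def insert(self, x):
        """Insert element x, whose key has already been filled in."""

    @abstractmethod
    def minimum(self):
        """Return a pointer to the element whose key is minimum."""

    @abstractmethod
    def extract_min(self):
        """Delete and return the element whose key is minimum."""

    @abstractmethod
    def union(self, other):
        """Return a new heap containing all elements of self and other;
        both inputs are "destroyed" (consumed) by this operation."""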

Fibonacci Heaps
• Mergeable heaps with two extra operations:
• DECREASE-KEY(H, x, k)
• Assigns to element x within heap H the new key value k, which we assume to be no greater than its current key value
• DELETE(H, x)
• Deletes element x from heap H.
We will ignore those operations until the very end.

Fibonacci Vs Binary Heaps
• Fib heap pros
• Asymptotically better INSERT, UNION, and DECREASE-KEY
• Works especially well when most of the heap operations are NOT EXTRACT-MIN or DELETE
• Fib heap cons
• Amortized, not worst-case, cost
• Hidden constants are high.

Structure of Fibonacci Heaps
• Collection of rooted trees (each is min-heap ordered)
• H.min points to the root of a tree containing the minimum key
• H.n is the number of nodes currently in H
• Uses (unordered) circular, doubly linked lists for both the root list and each child list
• This is the key to O(1) insertion/deletion of nodes and O(1) concatenation of two lists

Structure of Fibonacci Heaps
• Each node x has two attributes
• degree: the number of children of x
• Not the same as size(x), the number of nodes in the subtree rooted at x (including x itself)
• mark: indicates whether node x has lost a child since the last time x was made the child of another node
• Until we discuss DECREASE-KEY, it is always false
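A minimal Python sketch of the node record these two slides describe (field names follow CLRS Ch. 19; the splice helper shows why circular, doubly linked lists give O(1) concatenation):

class FibNode:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.child = None           # pointer to any one of the children
        self.degree = 0             # number of children (not subtree size)
        self.mark = False           # lost a child since last becoming a child?
        self.left = self            # circular, doubly linked sibling list
        self.right = self

def splice(a, b):
    """Concatenate two distinct circular sibling lists in O(1);
    a and b are any nodes of the two lists."""
    a_right, b_left = a.right, b.left
    a.right, b.left = b, a
    a_right.left, b_left.right = b_left, a_right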

Structure of Fibonacci Heaps
• Potential Function
• Φ(H) = t(H) + 2·m(H), where t(H) is the number of trees in the root list and m(H) is the number of marked nodes
• Reason will be clear later
• Informally, for now:
• Initially 0, always non-negative afterwards
• Ignore m(H): it is 0 if no DECREASE-KEY is called.
• t(H) will pay for the cost of EXTRACT-MIN
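Spelled out, this is the standard potential-method bookkeeping (a LaTeX restatement for reference; nothing beyond what the slide states):

\[
  \Phi(H) = t(H) + 2\,m(H),
  \qquad
  \hat{c}_i = c_i + \Phi(H_i) - \Phi(H_{i-1}).
\]

Since Φ is 0 for an empty heap and never negative afterwards, the total amortized cost upper-bounds the total actual cost.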

Structure of Fibonacci Heaps
• Maximum Degree
• Assume we know an upper bound D(n) on the maximum degree of any node
• We can prove that D(n) = O(lg n)
• Informal discussion later
• Proof skipped

Operations
• Main idea
• Delay work as long as possible
• When the hard work comes, exploit it to restructure the Fibonacci heap.
• MAKE-HEAP
• INSERT(H, x)
• MINIMUM(H)
• EXTRACT-MIN(H)
• UNION(H1, H2)

MAKE-HEAP
• Allocate and return the Fibonacci heap object H, where H.n = 0 and H.min = NIL;
• O(1) actual cost
• Potential does not change (it is actually 0)
• Therefore, amortized cost is O(1)

INSERT(H, x)
• Insert x into H's root list, increment H.n, and update H.min if necessary.
• O(1) actual cost
• Change in potential = 1 (one new tree in the root list)
• Therefore, amortized cost is O(1)

UNION(H1,H2)
• Concatenate their root lists and set the new H.min to the smaller of the two minimums.
• O(1) actual cost
• Change in potential = 0 (the trees and marked nodes of H1 and H2 are simply combined)
• Therefore, amortized cost is O(1)

MINIMUM(H)
• Return H.min
• O(1) actual cost
• Potential does not change
• Therefore, O(1) amortized cost
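A minimal, self-contained Python sketch of the four O(1) operations just described. The Node here keeps only the circular sibling pointers (the full record was sketched earlier); class and method names are illustrative, and error handling is omitted.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self            # circular, doubly linked root list
        self.right = self

class FibHeap:
    def __init__(self):             # MAKE-HEAP: O(1)
        self.min = None
        self.n = 0

    def insert(self, x):            # INSERT: splice x into the root list, O(1)
        if self.min is None:
            self.min = x
        else:
            x.right = self.min.right
            x.left = self.min
            self.min.right.left = x
            self.min.right = x
            if x.key < self.min.key:
                self.min = x
        self.n += 1

    def minimum(self):              # MINIMUM: O(1)
        return self.min

    def union(self, other):         # UNION: concatenate the root lists, O(1)
        h = FibHeap()
        h.n = self.n + other.n
        a, b = self.min, other.min
        if a is None:
            h.min = b
        elif b is None:
            h.min = a
        else:
            a_right, b_left = a.right, b.left    # O(1) splice of two circular lists
            a.right, b.left = b, a
            a_right.left, b_left.right = b_left, a_right
            h.min = a if a.key <= b.key else b
        return h

# Tiny usage example:
h = FibHeap()
for k in (7, 3, 9):
    h.insert(Node(k))
print(h.minimum().key)              # prints 3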

EXTRACT-MIN
• The real work!
• Lines 1-9 (everything before CONSOLIDATE): remove the minimum root and add its children to the root list
• At this point H.min is not the correct min yet; it will be fixed during consolidation
• Actual cost of those steps is O(D(n))

EXTRACT-MIN:
The CONSOLIDATE procedure
• Objective
• Reach a state at which every root in the root list has a distinct degree value
• Methodology
• Maintains an array A[0 .. D(n)]
• A[k]: root node of degree k (initially NIL)
• Traverses the root list, from H.min to the right
• Merges trees whose roots have the same degree
• Searches for the new min in A[0 .. D(n)]
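A simplified Python sketch of the pairing-by-degree logic. To keep it short, roots and children are plain Python lists and A is a dict keyed by degree; a faithful implementation would use the circular doubly linked lists and the fixed-size array A[0 .. D(n)] described above. Parent pointers and marks are omitted.

class Node:
    def __init__(self, key):
        self.key = key
        self.children = []          # real implementation: circular linked list + parent/mark

    @property
    def degree(self):
        return len(self.children)   # real implementation stores this as a counter

def link(y, x):
    """Make y a child of x (caller guarantees x.key <= y.key)."""
    x.children.append(y)

def consolidate(roots):
    """Pair up roots of equal degree until every degree is distinct.
    Returns (new_root_list, new_min); assumes roots is non-empty."""
    A = {}                          # A[k] = the unique root of degree k seen so far
    for w in roots:
        x = w
        while x.degree in A:        # another root with the same degree exists
            y = A.pop(x.degree)
            if y.key < x.key:
                x, y = y, x         # keep the smaller key as the root
            link(y, x)              # x gains a child, so its degree grows; retry
        A[x.degree] = x
    new_roots = list(A.values())
    new_min = min(new_roots, key=lambda r: r.key)   # "search for the new min in A"
    return new_roots, new_min

# e.g. consolidate([Node(5), Node(3), Node(8), Node(1)]) returns a single
# tree of degree 2 rooted at the key-1 node.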

EXTRACT-MIN:
The CONSOLIDATE procedure

EXTRACT-MIN:
The CONSOLIDATE procedure
• Analysis
• Actual work
• O(D(n)) work
• Adding the children of the extracted min to the root list
• Searching for the new min in A[0 .. D(n)]
• O(D(n) + t(H)) work
• Scanning the root list during consolidation
• Aggregate analysis: each node in the root list is visited once.
• Change in potential
• ((D(n) + 1) + 2m(H)) − (t(H) + 2m(H)) = D(n) − t(H) + 1
• (after consolidation at most D(n) + 1 roots remain, since all degrees are distinct)
• Therefore, amortized cost is O(D(n))
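The same calculation written out; the only subtlety (following CLRS) is that the unit of potential is scaled so that one unit covers the constant work charged to each root, which is what lets the −t(H) in the potential change cancel the t(H) in the actual cost:

\[
  \hat{c} \;=\; O\bigl(D(n) + t(H)\bigr)
  \;+\; \Bigl(\bigl(D(n) + 1\bigr) + 2m(H)\Bigr) - \Bigl(t(H) + 2m(H)\Bigr)
  \;=\; O\bigl(D(n)\bigr).
\]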

Bounding Maximum Degree
• Why is D(n) = O(lg n)?
• In other words: why is the maximum degree of any node O(lg n)?
• This will also explain why it is called a Fibonacci heap!

Bounding Maximum Degree
• Notice the peculiar structure of each tree
• This is not an accident:
• For any node x, if:
• y1, y2, …, yk are the children of x (k = x.degree), and
• y1 was linked first to x, y2 was linked second, …
• Then, at the time of linking:
• y1.degree ≥ 0
• y2.degree ≥ 1
• y3.degree ≥ 2
• …
[Figure: an example tree with each child annotated by its linking order, illustrating the degree lower bounds above]

Bounding Maximum Degree
• Using two lemmas from CLRS Ch. 19 (one bounding the children's degrees as above, one relating Fibonacci numbers to powers of the golden ratio):
• We can then prove that, for any node x:
• size(x) ≥ F(x.degree + 2) ≥ φ^(x.degree), where φ is the golden ratio and F(k) is the k-th Fibonacci number
• Taking base-φ logarithms:
• The maximum degree D(n) of any node is thus O(lg n)
[Figure: a tree whose subtrees are labeled with Fibonacci numbers F0, F1, F2, F3, …, illustrating that subtree sizes grow at least as fast as the Fibonacci sequence]
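Putting the pieces together (a sketch of the chain of inequalities; the heap has n nodes and contains the subtree rooted at x):

\[
  n \;\ge\; \mathrm{size}(x) \;\ge\; F_{x.\mathrm{degree}+2} \;\ge\; \varphi^{\,x.\mathrm{degree}}
  \quad\Longrightarrow\quad
  x.\mathrm{degree} \;\le\; \log_{\varphi} n \;=\; O(\lg n).
\]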

DECREASE-KEY
• Assume we are given a pointer to the node x
• Search itself is not (efficiently) supported in such data structures
• Procedure (a sketch in code follows this slide)
• Decrease the key
• If min-heap order is not violated, we are done
• Otherwise:
• CUT x and add it to the root list
• Adjust metadata: decrement x's parent's degree, and clear x's mark
• (CASCADE-)CUT x's parent if x is the second child cut from its parent
• Change H.min if needed
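A simplified Python sketch of CUT and CASCADE-CUT, in the same list-based model as the CONSOLIDATE sketch (parent pointers and mark bits added; heap state is just a roots list plus a min pointer here). Note that p.children.remove(x) is O(degree) in this model but O(1) with the real circular doubly linked lists.

class Node:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []
        self.mark = False

def cut(x, roots):
    """Remove x from its parent's child list and add it to the root list.
    (Here the parent's degree is implicit in len(children);
    a real implementation decrements p.degree explicitly.)"""
    p = x.parent
    p.children.remove(x)        # O(1) with circular doubly linked lists
    x.parent = None
    x.mark = False
    roots.append(x)

def cascade_cut(p, roots):
    """A node that loses a second child is cut as well, recursively."""
    g = p.parent
    if g is None:
        return                  # p is a root: nothing more to do
    if not p.mark:
        p.mark = True           # first lost child: just mark p
    else:
        cut(p, roots)           # second lost child: cut p too ...
        cascade_cut(g, roots)   # ... and continue at the grandparent

def decrease_key(x, k, roots, min_node):
    """Returns the (possibly new) minimum node; caller reassigns H.min."""
    assert k <= x.key, "new key must not be greater than the current key"
    x.key = k
    p = x.parent
    if p is not None and x.key < p.key:     # min-heap order violated
        cut(x, roots)
        cascade_cut(p, roots)
    return x if x.key < min_node.key else min_node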

DECREASE-KEY
• Example 1: decrease 46 to 15
CUT 15 and mark 24 (its parent)

DECREASE-KEY
• Example 2: decrease 35 to 5
CUT 5 and CASCADE-CUT 26 and 24; then make 5 the new H.min

DECREASE-KEY
• Analysis
• Actual cost:
• O(1) + cost of CASCADE-CUT
• Assume c calls to CASCADE-CUT
• Each call to CASCADE-CUT takes O(1) time plus the recursion
• Therefore, actual cost is O(1) + O(c)
• Change in potential
• At most ((t(H) + c) + 2(m(H) − c + 2)) − (t(H) + 2m(H))
• which is 4 − c
• Amortized cost
• O(1) + O(c) + 4 − c = O(1)

DELETE
• Straightforward: DECREASE-KEY(H, x, −∞), then EXTRACT-MIN(H)
• Final note:
• Our analysis of max degree didn’t consider CUTs
• However, this will not make a difference
• Because each node can lose at most 1 child before being CUT itself.

Thanks!