COSC2123: Algorithms & Analysis
Transform & Conquer

Hoang, RMIT University
Email: sonhoang. .au

Lecture 7

(RMIT University) Transform & Conquer Lecture 7 1 / 84

Overview

Levitin – The design and analysis of algorithms
This week we will be covering the material from Chapter 6.

Learning outcomes:
• Understand the Transform-and-conquer algorithmic approach.
• Understand and apply:

  • Instance simplification: Presorting & balanced search trees (AVL trees)
  • Representation change: Balanced search trees (2-3 trees) & heaps and heapsort
  • Problem reduction

(RMIT University) Transform & Conquer Lecture 7 2 / 84

Outline

1 Overview

2 Presorting

3 Balanced Search Trees: AVL Trees

4 Balanced Search Trees: 2-3 Trees

5 Heaps and Heapsort

6 Problem Reduction

7 Summary

(RMIT University) Transform & Conquer Lecture 7 3 / 84


Transform and Conquer

Idea: Some problems are easier/simpler to solve after they are first
transformed to another form.

This group of techniques can be broken into the following classifications:
• instance simplification – transform to a simpler or more convenient instance of the same problem.
• representation change – transform to a different representation of the same problem instance.
• problem reduction – transform to an instance of a different problem for which an algorithm is already available.

(RMIT University) Transform & Conquer Lecture 7 5 / 84
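As a small illustration of the third category (a standard textbook example, not taken from these slides): computing lcm(m, n) can be reduced to computing gcd(m, n), for which an efficient algorithm already exists. A minimal Python sketch:

    # Problem reduction sketch: lcm reduces to gcd (illustrative example only).
    from math import gcd

    def lcm(m, n):
        # lcm(m, n) * gcd(m, n) = m * n for positive integers m and n.
        return m * n // gcd(m, n)

    print(lcm(24, 60))   # 120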


Instance simplification – Presorting

Presorting : Many problems involving arrays are easier when the
array is sorted.

General approach of presorting-based algorithms:
1 Transform: Sort the array.
2 Conquer: Solve the transformed problem instance, taking advantage of the array being sorted.

Examples:
• Searching
• Computing the median (selection problem).
• Checking if all elements are distinct (element uniqueness).

(RMIT University) Transform & Conquer Lecture 7 7 / 84


Sort efficiency?

Sorting efficiency
Recall that many of the sorting algorithms we have seen so far take
O(n log2 n) comparisons in the worst case to sort an array of size n.

• The efficiency of every presorting-based algorithm is bounded below by the
cost of sorting. If we can amortise this cost over many subsequent operations,
presorting is often worth the effort.

(RMIT University) Transform & Conquer Lecture 7 8 / 84

Search in a sorted list

Problem : Search for a key k in an array A[0 . . . n − 1].

Presorting-based algorithm :
1 Sort the array using an efficient sorting algorithm.
2 Apply binary search.

Efficiency : O(n log n) + O(log n) = O(n log n)
Worse than brute-force sequential search, but if the array is static and
searched many times (so the sorting cost is amortised), the extra effort of
presorting may be worth it.

(RMIT University) Transform & Conquer Lecture 7 9 / 84
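A minimal Python sketch of this presorting-based search (illustrative names only, not code from the lecture): sort once, then answer each query with binary search.

    from bisect import bisect_left

    def presorted_search(a, queries):
        """Sort once (O(n log n)), then answer each query in O(log n)."""
        s = sorted(a)                       # Transform: presort the array
        results = []
        for k in queries:                   # Conquer: binary search per key
            i = bisect_left(s, k)
            results.append(i < len(s) and s[i] == k)
        return results

    print(presorted_search([7, 2, 9, 4], [4, 5, 9]))   # [True, False, True]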


Presorting – Element Uniqueness

Problem : Given an array of unsorted items, determine if all items in
the array are distinct.

Presorting-based algorithm :
1 Sort the array using an efficient sorting algorithm.
2 Scan the array to check all pairs of adjacent elements.

Efficiency : Θ(n log n) + Θ(n) = Θ(n log n)

The brute force algorithm compares all pairs of unsorted elements for
an efficiency of Θ(n2).

(RMIT University) Transform & Conquer Lecture 7 10 / 84
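A minimal sketch of the presorting-based uniqueness check (illustrative only): after sorting, any duplicates must sit next to each other.

    def all_distinct(a):
        s = sorted(a)                       # Transform: Θ(n log n)
        for i in range(len(s) - 1):         # Conquer: scan adjacent pairs, Θ(n)
            if s[i] == s[i + 1]:
                return False
        return True

    print(all_distinct([3, 1, 4, 1, 5]))    # False (1 appears twice)
    print(all_distinct([3, 1, 4, 5]))       # True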


Is presorting worth the effort?

Problem
How many searches in a list are necessary to justify the time spent on
presorting an array of 10³ elements? What about 10⁶ elements?
(Assume merge sort and binary search are used.)

• Presorting:
  • Merge sort uses n log2 n comparisons on average.
  • Binary search uses log2 n comparisons on average.
• Linear search in an unsorted array uses n/2 comparisons on average.
• We want to determine how many searches (k) are needed before presorting
is faster than a non-presort method like linear search.
  • Time for presort (n, k) ≤ Time for linear (n, k)
  • n log2 n + k log2 n ≤ kn/2
  • k ≥ n log2 n / (n/2 − log2 n)
• Substituting n = 10³ gives k ≈ 21.
• Substituting n = 10⁶ gives k ≈ 40. (Is the computation precise?)

(RMIT University) Transform & Conquer Lecture 7 11 / 84
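A quick sanity check of the break-even formula above (a sketch; the function name is mine, not the lecture's):

    from math import ceil, log2

    def break_even_searches(n):
        # Smallest integer k with n*log2(n) + k*log2(n) <= k*n/2.
        return ceil(n * log2(n) / (n / 2 - log2(n)))

    print(break_even_searches(10**3))   # 21
    print(break_even_searches(10**6))   # 40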



Balanced Search Trees

• We study two methods to balance search trees. Why are balanced
trees desirable?
• Recall: The worst-case performance of a plain binary search tree's
operations depends on the height of the tree.
• As a result, a great deal of research effort has been invested in
keeping binary search trees balanced and of minimum height.
• Two approaches:
  • AVL trees: An example of the instance simplification approach,
  whereby rebalancing (transformation) is performed when inserting
  or deleting elements.
  • 2-3 trees: An example of the representation change approach,
  by allowing more than one key per node.

(RMIT University) Transform & Conquer Lecture 7 13 / 84


AVL Trees

Definition
An AVL tree is a binary search tree in which:
• for every node, the difference between the heights of its left and
right subtrees is at most 1. This includes the root node, hence
AVL trees are guaranteed to be almost balanced.

The height difference is called the balance factor:

balance factor(node v) = height(left subtree of v) − height(right subtree of v)

The height of an empty tree is defined as −1.

(RMIT University) Transform & Conquer Lecture 7 14 / 84
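A minimal Python sketch of these definitions, using an illustrative Node class (not the lecture's code); it recomputes heights recursively, which is fine for checking small trees.

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key, self.left, self.right = key, left, right

    def height(v):
        # The height of an empty tree is defined as -1.
        return -1 if v is None else 1 + max(height(v.left), height(v.right))

    def balance_factor(v):
        return height(v.left) - height(v.right)

    def is_avl(v):
        # True if every node's balance factor is in {-1, 0, 1}.
        if v is None:
            return True
        return abs(balance_factor(v)) <= 1 and is_avl(v.left) and is_avl(v.right)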


AVL Trees

[Figure: two binary search trees with the balance factor of each node shown:
(a) an AVL tree; (b) not an AVL tree (one node has balance factor 2).]

(RMIT University) Transform & Conquer Lecture 7 15 / 84


AVL trees – Insertion

Let w denote the new node to insert.
1 Insert w using BST insertion.
2 Update balance factors and check whether the tree needs rebalancing:
  • From w, update the balance factor of each node as we traverse back
  up to the root.
3 If the tree is no longer AVL-balanced, perform rotation transformations
to make it balanced again.
  • We repair the tree at the node where we first detect imbalance (the
  imbalanced node closest to the inserted leaf node).
  • We may need to update the balance factors of the parent (and its
  ancestors) of the subtree where the rotations occur.

(RMIT University) Transform & Conquer Lecture 7 16 / 84


AVL trees – Insertion Example

Let w denote the new node to insert.
1 Insert w using BST insertion.
2 Update balance factors and check whether the tree needs rebalancing:
  • From w, update the balance factor of each node as we traverse back
  up to the root.

Insert 2:

[Figure: the tree with keys 6, 5, 3, 8 before and after inserting 2 as the
left child of 3; the balance factors of 5 and 6 both become 2, so the tree
is no longer AVL-balanced.]

Next: Introduce the rotation operations, then how to apply them to do
rebalancing.

(RMIT University) Transform & Conquer Lecture 7 17 / 84


AVL trees – Rotations

Left Rotation:

[Figure: a single L-rotation applied at an unbalanced node.]

Right Rotation:

[Figure: a single R-rotation applied at an unbalanced node.]

(RMIT University) Transform & Conquer Lecture 7 18 / 84


AVL trees – Rotations

LR (left-right) rotation:

[Figure: an L-rotation on the left child followed by an R-rotation at the
unbalanced node.]

(RMIT University) Transform & Conquer Lecture 7 19 / 84

AVL trees – Rotations

RL (right-left) rotation:

[Figure: an R-rotation on the right child followed by an L-rotation at the
unbalanced node.]

(RMIT University) Transform & Conquer Lecture 7 20 / 84
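A minimal sketch of these rotations on the illustrative Node structure from the earlier sketch (function names are mine, not the lecture's); each helper returns the new root of the rotated subtree, and the double rotations are compositions of the two single ones.

    def rotate_left(r):                # L-rotation at r
        v = r.right                    # v becomes the new subtree root
        r.right = v.left               # v's left subtree becomes r's right subtree
        v.left = r
        return v

    def rotate_right(r):               # R-rotation at r
        c = r.left                     # c becomes the new subtree root
        r.left = c.right               # c's right subtree becomes r's left subtree
        c.right = r
        return c

    def rotate_left_right(r):          # LR-rotation: L on the left child, then R at r
        r.left = rotate_left(r.left)
        return rotate_right(r)

    def rotate_right_left(r):          # RL-rotation: R on the right child, then L at r
        r.right = rotate_right(r.right)
        return rotate_left(r)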

AVL trees – Insertion

Step 3: If the tree is no longer AVL-balanced, perform rotation
transformations to make it balanced again.
• We repair the tree at the node where we first detect imbalance (the
imbalanced node closest to the inserted leaf node).

There are four different cases to consider, each of which uses one of the
rotations to rebalance the tree.

In the following, node r is the first imbalanced node (in terms of balance
factor) detected when traversing up from the inserted node.

(RMIT University) Transform & Conquer Lecture 7 21 / 84


AVL trees – Insertion Scenarios

[Figure: node r with right child v and subtrees T1, T2, T3, before and after
the rotation.]

Case 1: Inserted node is in right subtree of node v, where:
• node v is the right child of node r.

Operation: Single L-rotation, rotating nodes r and v.

(RMIT University) Transform & Conquer Lecture 7 22 / 84

AVL trees – Insertion Scenarios

[Figure: a concrete example, (a) before rotation and (b) after rotation.]

Case 1: Inserted node is in right subtree of node v.

Operation: Single L-rotation, rotating nodes r and v.
(RMIT University) Transform & Conquer Lecture 7 23 / 84

AVL trees – Insertion Scenarios

[Figure: node r with left child c and subtrees T1, T2, T3, before and after
the rotation.]

Case 2: Inserted node is in left subtree of node c, where:
• node c is the left child of node r.

Operation: Single R-rotation, rotating nodes r and c.

(RMIT University) Transform & Conquer Lecture 7 24 / 84

AVL trees – Insertion Scenarios

[Figure: a concrete example, (c) before rotation and (d) after rotation.]

Case 2: Inserted node is in left subtree of node c.

Operation: Single R-rotation, rotating nodes r and c.
(RMIT University) Transform & Conquer Lecture 7 25 / 84

AVL trees – Insertion Scenarios

[Figure: nodes r, c and g with subtrees T1 to T4, before and after the double
rotation; the inserted node may be in either subtree of g.]

Case 3: Inserted node is in one of the subtrees of node g, where:
• node c is the left child of node r, and node g is the right child of node c.

Operation: Double LR-rotation, rotating nodes r, c and g.

(RMIT University) Transform & Conquer Lecture 7 26 / 84

AVL trees – Insertion Scenarios

[Figure: a concrete example, (e) before rotation and (f) after rotation.]

Case 3: Inserted node is in one of the subtrees of node g.

Operation: Double LR-rotation, rotating nodes r, c and g.

(RMIT University) Transform & Conquer Lecture 7 27 / 84

AVL trees – Insertion Scenarios

[Figure: nodes r, w and v with subtrees T1 to T4, before and after the double
rotation; the inserted node may be in either subtree of v.]

Case 4: Inserted node is in one of the subtrees of node v, where:
• node w is the right child of node r, and node v is the left child of node w.

Operation: Double RL-rotation, rotating nodes r, w and v.

(RMIT University) Transform & Conquer Lecture 7 28 / 84

AVL trees – Insertion Scenarios

[Figure: a concrete example, (g) before rotation and (h) after rotation.]

Case 4: Inserted node is in one of the subtrees of node v.

Operation: Double RL-rotation, rotating nodes r, w and v.

(RMIT University) Transform & Conquer Lecture 7 29 / 84
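The four cases can be selected by looking at the balance factor of r and of its taller child. A minimal sketch reusing the balance_factor and rotation sketches above (illustrative, not the lecture's code):

    def rebalance(r):
        """Return the new root of the subtree after fixing any imbalance at r."""
        if balance_factor(r) == 2:              # left-heavy
            if balance_factor(r.left) < 0:      # Case 3: double LR-rotation
                r.left = rotate_left(r.left)
            return rotate_right(r)              # Case 2, or second half of Case 3
        if balance_factor(r) == -2:             # right-heavy
            if balance_factor(r.right) > 0:     # Case 4: double RL-rotation
                r.right = rotate_right(r.right)
            return rotate_left(r)               # Case 1, or second half of Case 4
        return r                                # already AVL-balanced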

AVL trees Insertion Example

Construct an AVL tree for the sequence 5,6,8,3,2,4,7

[Figure: step-by-step construction. Insert 5; insert 6; inserting 8 makes the
balance factor of 5 equal to −2, so an L-rotation at 5, L(5), gives root 6 with
children 5 and 8. Insert 3 (left child of 5); inserting 2 (left child of 3)
makes the balance factors of 5 and 6 equal to 2, so an R-rotation at 5, R(5),
makes 3 the left child of 6, with children 2 and 5.]

(RMIT University) Transform & Conquer Lecture 7 30 / 84


AVL trees

Construct an AVL tree for the sequence 5,6,8,3,2,4,7

[Figure: continuing the construction. Inserting 4 (left child of 5) makes the
balance factor of 6 equal to 2, so a double LR-rotation at 6, LR(6), gives root
5 with children 3 (over 2 and 4) and 6 (over 8). Inserting 7 (left child of 8)
makes the balance factor of 6 equal to −2, so a double RL-rotation at 6, RL(6),
puts 7 in 6's place; the final tree is 5 with children 3 (over 2 and 4) and
7 (over 6 and 8).]

(RMIT University) Transform & Conquer Lecture 7 31 / 84
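A minimal recursive insertion sketch tying the earlier pieces (Node, balance_factor, rotations, rebalance) together; recomputing heights makes it slower than a real implementation, but it reproduces the example above.

    def avl_insert(v, key):
        """Insert key into the subtree rooted at v and return its new root."""
        if v is None:                       # Step 1: ordinary BST insertion
            return Node(key)
        if key < v.key:
            v.left = avl_insert(v.left, key)
        else:
            v.right = avl_insert(v.right, key)
        return rebalance(v)                 # Steps 2-3: repair on the way back up

    root = None
    for k in [5, 6, 8, 3, 2, 4, 7]:
        root = avl_insert(root, k)
    print(root.key, root.left.key, root.right.key)   # 5 3 7, as in the example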


AVL trees – Analysis

• h ≤ 1.4404 log2(n + 2) − 1.3277, where h is the height of an AVL tree with n nodes.
• Average height (found empirically): 1.01 log2 n + 0.1 for large n.
• SEARCH and INSERT are O(log n)
• DELETE is more difficult, but still in O(log n)
• rebalance (rotations) in O(log n)

• Disadvantages: Rotations are frequent, and implementation is
complex.

(RMIT University) Transform & Conquer Lecture 7 32 / 84


AVL trees – Example

https://www.cs.usfca.edu/~galles/visualization/AVLtree.html

(RMIT University) Transform & Conquer Lecture 7 33 / 84



2-3 Trees – Multiway Search Trees

Definition
A multiway search tree is a search tree which allows more than one
element per node.
A node of a search tree is called an n-node if it contains n − 1 ordered
elements k1 < k2 < · · · < kn−1, dividing the entire element range into n
intervals:

< k1, (k1, k2), . . . , (kn−2, kn−1), > kn−1

(RMIT University) Transform & Conquer Lecture 7 35 / 84

2-3 Trees

Definition
A 2-3 tree is a search tree which mixes 2-nodes and 3-nodes to keep
the tree height-balanced (all leaves are on the same level).

[Figure: a 3-node holding keys K1 < K2, whose three subtrees cover the key
ranges < K1, (K1, K2), and > K2.]
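A minimal sketch of searching a 2-3 tree, assuming each node stores its one or two keys and its children in sorted order (illustrative names, not the lecture's code):

    class TwoThreeNode:
        def __init__(self, keys, children=None):
            self.keys = keys                 # [k1] for a 2-node, [k1, k2] for a 3-node
            self.children = children or []   # empty list for a leaf

    def search_23(node, k):
        if node is None:
            return False
        if k in node.keys:
            return True
        if not node.children:                # leaf reached, key not present
            return False
        i = sum(1 for key in node.keys if k > key)   # child whose interval contains k
        return search_23(node.children[i], k)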