
Analysis of Algorithms, I
CSOR W4231

Computer Science Department


Columbia University
More dynamic programming: matrix chain multiplication

1 Matrix chain multiplication
2 A first attempt: brute-force
3 A second attempt: divide and conquer
4 Organizing DP computations

1 Matrix chain multiplication
2 A first attempt: brute-force
3 A second attempt: divide and conquer
4 Organizing DP computations

Matrix chain multiplication example
Example 1.
Input: matrices A1, A2, A3 of dimensions 6×1, 1×5, 5×2
Output:
• a way to compute the product A1A2A3 so that the number of arithmetic operations performed is minimized;
• the minimum number of arithmetic operations required.

Useful observations
• We do not want to compute the actual product.
• Matrix multiplication is associative but not commutative (in general). Hence a solution to our problem corresponds to a parenthesization of the product.
• We want the optimal parenthesization and its cost, that is, the parenthesization that minimizes the number of arithmetic operations, as well as that number.

Estimating #arithmetic operations
• Let A, B be matrices of dimensions m × n, n × p.
• Let C = AB. Then C is an m × p matrix such that

cij = Σ_{k=1}^{n} aik · bkj.

⇒ cij requires n scalar multiplications and n − 1 additions
⇒ #arithmetic operations to compute cij is dominated by #scalar multiplications
• Total #scalar multiplications to fill in C is mnp, since C has mp entries and each requires n scalar multiplications.
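To make the mnp count concrete, here is a minimal sketch (not from the slides; the helper name multiply_and_count is made up) that performs the textbook triple-loop product and counts the scalar multiplications it uses:

# Illustrative only: count the scalar multiplications performed by the
# naive triple-loop matrix product and check the count equals m*n*p.

def multiply_and_count(A, B):
    """Return (C, number of scalar multiplications) for C = A @ B."""
    m, n, p = len(A), len(B), len(B[0])
    assert len(A[0]) == n, "inner dimensions must agree"
    C = [[0] * p for _ in range(m)]
    count = 0
    for i in range(m):
        for j in range(p):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
                count += 1          # one scalar multiplication per term
    return C, count

if __name__ == "__main__":
    m, n, p = 6, 1, 5
    A = [[1] * n for _ in range(m)]
    B = [[1] * p for _ in range(n)]
    _, ops = multiply_and_count(A, B)
    print(ops, m * n * p)           # both print 30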

Minimizing #scalar multiplications for A1A2A3
Input: A1, A2, A3 of dimensions 6 × 1, 1 × 5, 5 × 2 respectively
Given a parenthesization of the input matrices, its cost is the total # scalar multiplications to compute the product.
Two ways of computing A1A2A3:
1. (A1A2)A3: first compute A1A2, then multiply it by A3
• 6 · 1 · 5 scalar multiplications for A1A2
• 6 · 5 · 2 scalar multiplications for (A1A2)A3
⇒ 90 scalar multiplications in total
2. A1(A2A3): first compute A2A3, then multiply A1 by A2A3
• 1 · 5 · 2 scalar multiplications for A2A3
• 6 · 1 · 2 scalar multiplications for A1(A2A3)
⇒ 22 scalar multiplications in total
Parenthesization A1(A2A3) improves over (A1A2)A3 by over 75%.
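The two costs above can be checked with a few lines of arithmetic. The snippet below is purely illustrative; the dimension list p = [6, 1, 5, 2] encodes A1, A2, A3 from Example 1, so matrix Ai is p[i-1] x p[i]:

p = [6, 1, 5, 2]

# (A1 A2) A3: cost of A1 A2, then cost of multiplying the 6x5 result by A3
cost_left = p[0] * p[1] * p[2] + p[0] * p[2] * p[3]

# A1 (A2 A3): cost of A2 A3, then cost of multiplying A1 by the 1x2 result
cost_right = p[1] * p[2] * p[3] + p[0] * p[1] * p[3]

print(cost_left, cost_right)   # 90 22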

(Fully) Parenthesized products of matrices
Definition 2.
A product of matrices is fully parenthesized if it is
1. a single matrix; or
2. the product of two fully parenthesized matrices, surrounded by parentheses.
Examples: ((A1A2)A3) and (A1(A2A3)) are fully parenthesized.
Remark: we will henceforth refer to a full parenthesization simply as a parenthesization.

Matrix chain multiplication
Input: n matrices A1, A2, . . . , An, with dimensions pi−1 × pi, for 1 ≤ i ≤ n.
Output:
1. an optimal parenthesization of the input
(i.e., a way to compute A1 · · · An incurring minimum cost)
2. its cost
(i.e., total # scalar multiplications to compute A1 · · · An)
Example: the optimal parenthesization for Example 1 is (A1(A2A3)) and its cost is 22.

1 Matrix chain multiplication
2 A first attempt: brute-force
3 A second attempt: divide and conquer
4 Organizing DP computations

Brute-force approach
• A1, . . . , An are matrices of dimensions pi−1 × pi for 1 ≤ i ≤ n.
• Consider the product A1 · · · An.
• Let P(n) = # parenthesizations of the product A1 · · · An.
• Then P(0) = 0, P(1) = 1, P(2) = 1.
• By Definition 2, for n > 2, every possible parenthesization of A1 · · · An can be decomposed into the product of two parenthesized subproducts for some 1 ≤ k ≤ n − 1:

((A1A2 · · · Ak)(Ak+1 · · · An))

Computing #possible parenthesizations
• Given k, the number of parenthesizations for the product ((A1A2 · · · Ak)(Ak+1 · · · An)) can be computed recursively: P(k) · P(n − k)
• There are n − 1 possible values for k. Hence

P(n) = Σ_{k=1}^{n−1} P(k) · P(n − k), for n > 1
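As an illustration (not part of the slides), the recurrence for P(n) can be evaluated directly; the values it produces are the Catalan numbers shifted by one. Memoization is used here only so the counting finishes quickly; the brute-force enumeration of parenthesizations itself remains exponential.

from functools import lru_cache

@lru_cache(maxsize=None)
def P(n):
    # Number of full parenthesizations of a product of n matrices.
    if n <= 1:
        return n          # P(0) = 0, P(1) = 1
    return sum(P(k) * P(n - k) for k in range(1, n))

print([P(n) for n in range(1, 9)])   # [1, 1, 2, 5, 14, 42, 132, 429]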

Bounding P(n)
• We may obtain a crude lower bound for P(n), sufficient for our purposes, as follows:

P(n) ≥ P(1) · P(n−1) + P(2) · P(n−2)
     ≥ P(n−1) + P(n−2)    (1)

• By strong induction on n, we can show that P(n) ≥ Fn, the n-th Fibonacci number.
• Hence P(n) = Ω(2^{n/2}).
• In fact, P(n) = Ω(2^{2n}/n^{3/2}) (e.g., see your textbook).
⇒ Brute force requires exponential time.

1 Matrix chain multiplication
2 A first attempt: brute-force
3 A second attempt: divide and conquer
4 Organizing DP computations

A second attempt: divide and conquer
Notation: A1,n is the optimal parenthesization of the product A1 · · · An, that is, the optimal way to compute this product.
By Definition 2, there exists 1 ≤ k∗ ≤ n − 1 such that A1,n may be decomposed as the product of two fully parenthesized subproducts:
A1,n = ((A1 ···Ak∗)(Ak∗+1 ···An))

Optimal substructure
Notation: Ai,j is the optimal parenthesization of the product Ai ···Aj.
Fact 3.
There exists 1 ≤ k∗ ≤ n − 1 such that
A1,n = (A1,k∗ · Ak∗+1,n).
That is, the optimal parenthesization of the input can be decomposed into the optimal parenthesizations of two subproblems.

The cost of multiplying two matrices
• Recall that matrix Ai has dimensions pi−1 × pi.
• Then (A1 · · · Ak) is a p0 × pk matrix;
• (Ak+1 · · · An) is a pk × pn matrix.
⇒ The total #scalar multiplications required for multiplying matrix (A1 · · · Ak) by matrix (Ak+1 · · · An) is p0 · pk · pn.

Proof of Fact 3
• Ai,j is the optimal parenthesization of Ai · · · Aj
• OPT(i, j) = cost of Ai,j = optimal cost to compute Ai · · · Aj

Since A1,n = ((A1 · · · Ak∗)(Ak∗+1 · · · An)), its cost is given by

OPT(1, n) = OPT(1, k∗) + OPT(k∗+1, n) + p0 · pk∗ · pn,

where
• OPT(1, k∗), OPT(k∗+1, n) are the costs for optimally (why?) computing A1 · · · Ak∗ and Ak∗+1 · · · An respectively;
• p0 · pk∗ · pn is the fixed cost for multiplying (A1 · · · Ak∗) by (Ak∗+1 · · · An).

Recursive computation of OPT(1,n)
Notation: OPT(1, n) = optimal cost for computing A1 · · · An
• Issue: we do not know k∗!
• Solution: consider every possible value of k and take the best:

OPT(1, n) = 0, if n = 1
OPT(1, n) = min_{1 ≤ k ≤ n−1} { OPT(1, k) + OPT(k+1, n) + p0 · pk · pn }, otherwise.

This recurrence gives rise to an exponential recursive algorithm. However, we can use dynamic programming to obtain an efficient solution.
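A direct recursive implementation of the recurrence looks as follows. This is a sketch under my own naming and 1-indexing conventions (p = [p0, ..., pn], matrix Ai is p[i-1] x p[i]), not the slides' pseudocode. Without memoization it runs in exponential time, and the recursive calls already need costs for general subchains Ai · · · Aj, which motivates the subproblems introduced next.

def opt_recursive(p, i, j):
    """Minimum #scalar multiplications to compute Ai ... Aj (no memoization)."""
    if i == j:
        return 0
    # Try every split point k and keep the cheapest decomposition.
    return min(opt_recursive(p, i, k) + opt_recursive(p, k + 1, j)
               + p[i - 1] * p[k] * p[j]
               for k in range(i, j))

print(opt_recursive([6, 1, 5, 2], 1, 3))   # 22, matching Example 1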

Introducing subproblems
Notation: OPT(i, j) = optimal cost for computing Ai · · · Aj

OPT(i, j) = 0, if i = j
OPT(i, j) = min_{i ≤ k < j} { OPT(i, k) + OPT(k+1, j) + pi−1 · pk · pj }, if i < j
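One way to organize the computation of the OPT(i, j) table bottom-up, by increasing chain length, is sketched below. This is an illustration under the conventions used above (p = [p0, ..., pn], 1-indexed matrices), not the slides' own pseudocode; it also records the best split point k so the optimal parenthesization can be reconstructed.

def matrix_chain_order(p):
    n = len(p) - 1
    # OPT[i][j] = min #scalar multiplications for Ai ... Aj (1-indexed)
    OPT = [[0] * (n + 1) for _ in range(n + 1)]
    split = [[0] * (n + 1) for _ in range(n + 1)]   # best k for each (i, j)
    for length in range(2, n + 1):                  # subproblem size j - i + 1
        for i in range(1, n - length + 2):
            j = i + length - 1
            OPT[i][j] = float("inf")
            for k in range(i, j):
                cost = OPT[i][k] + OPT[k + 1][j] + p[i - 1] * p[k] * p[j]
                if cost < OPT[i][j]:
                    OPT[i][j], split[i][j] = cost, k
    return OPT, split

def parenthesization(split, i, j):
    """Rebuild the optimal parenthesization as a string from the split table."""
    if i == j:
        return f"A{i}"
    k = split[i][j]
    return "(" + parenthesization(split, i, k) + parenthesization(split, k + 1, j) + ")"

OPT, split = matrix_chain_order([6, 1, 5, 2])
print(OPT[1][3], parenthesization(split, 1, 3))   # 22 (A1(A2A3))

The three nested loops fill in Θ(n^2) table entries at O(n) work each, for O(n^3) time overall, in contrast to the exponential cost of brute force.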