CSE 347 Analysis of Algorithms January 16, 2019 Homework 1
Instructor: Brendan Juba
Due: January 28, 2019, 11:59 PM
Reminder: You may work in groups and use outside sources. But, you must write up solutions in your own words and properly reference your sources for each problem. This includes listing your collaborators and properly citing any sources you use. Solutions to each problem must be electronically typeset and submitted online via Gradescope. Instructions appear in the E-Homework Guide: http://www.cse.wustl.edu/~bjuba/cse347/s19/ehomework/
1. Kleinberg & Tardos Chapter 5, question 3
2. Dimensionality reduction is a widely-used technique for accelerating computations on high-dimensional data, by replacing them with a lower-dimensional “summary” that preserves the key properties of the original data, typically the approximate distances of points. The key step in fast dimensionality reduction is multiplying a vector by a matrix from the following family of special matrices, H_0, H_1, . . . , H_k, . . .:
• H_0 is the 1×1 matrix [1].
• For k > 0, H_k is the 2^k × 2^k matrix

        H_k = [ H_{k-1}    H_{k-1} ]
              [ H_{k-1}   −H_{k-1} ]
What makes the method so fast is that, for a vector v of length n = 2^k, the matrix-vector product H_k v can be computed in time O(n log n). Show how to do this (and as usual, prove the correctness and time bound of your algorithm).
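To make the recursive definition concrete, here is a short sketch (in Python with NumPy; the function name `hadamard` is our own, and this is only an illustration of the definition, not the fast algorithm the problem asks for) that builds H_k explicitly from the recurrence:

```python
import numpy as np

def hadamard(k):
    """Construct the 2^k x 2^k matrix H_k directly from its recursive definition."""
    if k == 0:
        # Base case: H_0 is the 1x1 matrix [1].
        return np.array([[1]])
    H = hadamard(k - 1)
    # H_k stacks four copies of H_{k-1}, negating the bottom-right block.
    return np.block([[H, H], [H, -H]])

# H_1 = [[1, 1], [1, -1]]
print(hadamard(1))
```

Note that this direct construction takes Θ(4^k) = Θ(n^2) time and space, so multiplying by the explicit matrix is far slower than the O(n log n) bound the problem asks you to achieve; the block structure above is the hint for how to do better.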
3. Kleinberg & Tardos Chapter 4, question 4