IV. Materials provided

A zip file is provided for you on Carmen called HW3_Programming.zip. Download and
extract the files onto your computer.

You will use python 3 to run your solution.

You will play with the Swiss Roll data, MNIST (digit data), and other toy datasets. See
Swiss.png and Digits.png in the for_display folder.

V. Preliminaries

You will use NumPy (https://numpy.org/) extensively in this homework. NumPy is a library
for the Python programming language that adds support for large, multi-dimensional arrays
and matrices, along with a large collection of high-level mathematical functions to
operate on these arrays. NumPy has many great functions and operations that will make
your implementation much easier.

If you are not familiar with NumPy, we recommend that you read this tutorial
(https://cs231n.github.io/python-numpy-tutorial/) or some other tutorials online, and then
get familiar with it by looking at code examples.

We have provided some useful NumPy operations that you may want to use in
`numpy_example.py`. You may want to comment out all the lines first, and then execute them
one by one or in groups to see the results and the differences. You can run the command
`python3 numpy_example.py`.
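To give a sense of the operations involved, here are a few illustrative NumPy basics (these examples are our own and are not copied from `numpy_example.py`):

```python
import numpy as np

# Build a 2-by-2 matrix and a 2-by-1 column vector.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
v = np.array([1.0, 1.0]).reshape(-1, 1)

print(A.shape)                        # the array's dimensions: (2, 2)
print(A @ v)                          # matrix-vector product, shape (2, 1)
print(A.T)                            # transpose, shape (2, 2)
print(A.mean(axis=1, keepdims=True))  # per-row mean, kept as a (2, 1) column
```

Note how `keepdims=True` preserves the column shape, which matters when you later subtract a mean vector from a matrix via broadcasting.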

We also provide another python script `feature_normalization.py`, which will guide you
through L2 normalization, covariance matrices, z-score, and whitening. You may find
some code here helpful for your implementation. You can run the command `python3
feature_normalization.py`.
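The ideas that script walks through can be sketched in a few lines. This is only an illustration with our own variable names, not the code from `feature_normalization.py`:

```python
import numpy as np

# Toy D-by-N data: N = 100 instances of dimension D = 5, one per column.
X = np.random.randn(5, 100)

# z-score: zero mean and unit standard deviation per feature (per row here).
mu = X.mean(axis=1, keepdims=True)
sigma = X.std(axis=1, keepdims=True)
X_z = (X - mu) / sigma

# L2 normalization: rescale each column (data instance) to unit Euclidean norm.
norms = np.linalg.norm(X, axis=0, keepdims=True)
X_l2 = X / norms

# Covariance matrix of the centered data: a D-by-D symmetric matrix.
Sigma = (X - mu) @ (X - mu).T / X.shape[1]
```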

* In `DR.py`, we also provide some more instructions and hints for what functions or
operations you may want to use.

* Caution! python and NumPy’s indices start from 0. That is, to get the first element in a
vector, the index is 0 rather than 1.

VI. Task (25 pts)

You are to add your implementation into `DR.py`.

There are many sub-functions in `DR.py`. You can ignore all of them except `def PCA(X,
out_dim)` and `main(args)`. In `main(args)`, you will see a general pipeline for machine
learning:

– Loading data: `X, phi = data_loader(args)`, in which `X` is a D-by-N matrix (numpy
array) and each column is a data instance. You can type `X[:, 0]` to extract the “first” data
instance from `X`. (**Caution! python and numpy’s indices start from 0. That is, to get
the first element in a vector, the index is 0 rather than 1.**) To ensure that `X[:, 0]` is a
column vector, you may do `X[:, 0].reshape(-1, 1)`, which will give you a column vector
of size D-by-1.
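A tiny, self-contained illustration of this indexing behavior (the data here is made up for demonstration):

```python
import numpy as np

X = np.arange(12).reshape(3, 4)    # a toy D-by-N matrix with D = 3, N = 4

x0 = X[:, 0]                       # shape (3,): a flat 1-D array, not a column
x0_col = X[:, 0].reshape(-1, 1)    # shape (3, 1): a proper D-by-1 column vector
```

The distinction matters because a `(D,)` array and a `(D, 1)` array broadcast differently in later matrix operations.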

– Learning patterns: `mu, W = PCA(np.copy(X), out_dim)`, in which the code takes `X`
and the desired output dimension as input and outputs the mean vector `mu` and the
projection matrix (numpy array) `W`.

– Applying the learned patterns to the data, which will be part of your job to implement.

Coding (15/25 pts):

You have two parts to implement in `DR.py`:

* The function `def PCA(X, out_dim)`: please go to the function and read the input
format, output format, and the instructions (for what to do) carefully. You can assume
that the actual inputs will follow the input format, and your goal is to generate the
required numpy arrays (`mu` and `Sigma`), which will be used to compute the outputs.
Please make sure that your results follow the required numpy array shapes. You are to
implement your code between `### Your job 1 starts here ###` and `### Your job 1 ends
here ###`. You are free to create more space between those two lines: we include them
just to explicitly mark where you should implement your code.
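For reference, one common way to obtain `mu`, `Sigma`, and the top principal components is via an eigendecomposition of the covariance matrix. The sketch below uses our own conventions (e.g., normalizing `Sigma` by N); the exact requirements for your submission are the ones specified inside `DR.py`:

```python
import numpy as np

def pca_sketch(X, out_dim):
    """Illustrative PCA via covariance eigendecomposition.

    X is D-by-N with one data instance per column. Returns the D-by-1 mean
    vector and a D-by-out_dim projection matrix of top principal components.
    """
    D, N = X.shape
    mu = X.mean(axis=1, keepdims=True)   # D-by-1 mean vector
    Xc = X - mu                          # centered data (broadcasts over columns)
    Sigma = Xc @ Xc.T / N                # D-by-D covariance matrix
    # eigh handles symmetric matrices and returns eigenvalues in ascending order,
    # so reverse the eigenvector columns to put the largest eigenvalues first.
    vals, vecs = np.linalg.eigh(Sigma)
    W = vecs[:, ::-1][:, :out_dim]
    return mu, W
```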

* Apply the learned patterns: After obtaining the mean vector `mu` and the projection
matrix (numpy array) `W`, you are to apply them to your data `X`. You are to implement
your code between `### Your job 2 starts here ###` and `### Your job 2 ends here ###`.
Again, you are free to create more space between those two lines. You can assume that
`X`, `mu`, and `W` are already defined, and your goal is to create the matrix (numpy
array) `new_X`, which is out_dim-by-N (out_dim and N are both already defined). Each
column (data instance) of `new_X` corresponds to the same column in `X`.
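A minimal sketch of this projection step, assuming the conventional PCA projection of centered data onto the principal components (double-check the instructions in `DR.py` for the exact convention expected; the `X`, `mu`, and `W` below are stand-ins built just to make the example runnable):

```python
import numpy as np

D, N, out_dim = 4, 10, 2
X = np.random.randn(D, N)                          # stand-in D-by-N data
mu = X.mean(axis=1, keepdims=True)                 # D-by-1 mean vector
W = np.linalg.qr(np.random.randn(D, out_dim))[0]   # stand-in orthonormal D-by-out_dim W

# Center the data, then project onto the principal components.
new_X = W.T @ (X - mu)   # out_dim-by-N; column i corresponds to X[:, i]
```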

Auto grader:

* You may run the following command to test your implementation

python3 DR.py --data simple_data --auto_grade

Note that the auto grader checks your implementation's semantics. If you have syntax
errors, you may see python error messages before the auto grader's results.

* Again, the auto grader simply checks your implementation on a toy case. It
will not be used for your final grading.

Play with different datasets (Task 1 – toy data, 3/25 pts):

* Run the following command

python3 DR.py --data toy_data --method PCA --out_dim 2 --display --save

This command will run PCA on simple angle-shaped data in 3D and project it to 2D
(defined by --out_dim 2). You will see the resulting mean vector and projection matrix
displayed in your command line. You will also see a figure showing the data
before and after PCA. Points of similar colors are similar to each other, and PCA does
preserve such similarity.

* The code will generate `toy_data_2.png` and `Results_toy_data_2.npz`, which you will
include in your submission.

* You may play with other commands by (1) removing `--save` (2) changing `--out_dim 2`
to `--out_dim 1`. You may also remove `--display` if you don’t want to display the figure.

Play with different datasets (Task 2 – MNIST, 3/25 pts):

* Run the following command

python3 DR.py --data MNIST --method PCA --out_dim 2 --display --save

This command will run PCA on 1010 digit images of the digit “3”. The size of each image is
28-by-28, or equivalently a 784-dimensional vector. We are to perform PCA to reduce its
dimensionality (e.g., to 2) and then use the two dimensions to reconstruct the 28-by-28
image. You will see a figure showing multiple images. The leftmost image is the mean
image. The two rightmost images are one original “3” image and its
reconstruction. The middle images show you the projections (here there are two
projections). Note that, in doing PCA, we vectorize an image and get a mean vector and a
projection matrix with several principal components. Then, to display them, we
reshape them back into images. See MNIST.png in the for_display folder.
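The reconstruction-and-reshape step described above can be sketched as follows. The variable names here are illustrative, and `mu`, `W`, and the projected vector `z` are stand-ins built only to make the example runnable; `DR.py`'s own display code may differ:

```python
import numpy as np

k = 2                                              # out_dim: number of components
mu = np.zeros((784, 1))                            # stand-in 784-by-1 mean vector
W = np.linalg.qr(np.random.randn(784, k))[0]       # stand-in 784-by-k projection matrix
z = np.random.randn(k, 1)                          # a projected instance, k-by-1

# Reconstruct the vectorized image from its low-dimensional projection,
# then reshape the 784-dimensional vector back into a 28-by-28 image.
x_rec = mu + W @ z
img = x_rec.reshape(28, 28)
```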

* The code will generate `MNIST_2.png` and `Results_MNIST_2.npz`, which you will
include in your submission.

* You may play with other commands by (1) removing `--save` (2) changing `--out_dim 2`
to some other positive integer (e.g., 1, 3, 4, 200). You will see that the
reconstructed images get closer to the original image as out_dim approaches 784.

Play with different datasets (Task 3 – Swiss Roll, 4/25 pts):

* Run the following command

python3 DR.py --data Swiss_Roll --method PCA --out_dim 2 --display --save

This command will run PCA on the 3D Swiss Roll dataset to reduce the dimensionality to
2D. You will see the resulting mean vector and projection matrix displayed in your
command line. You will also see a figure showing the data before and after PCA. Points
of similar colors are similar to each other (following the Swiss Roll shape in and out).
You will see that PCA cannot preserve the similarity. This is because PCA can only
perform linear projections: it simply flattens the roll rather than unfolding it.
See Swiss_Roll.png in the for_display folder.

* The code will generate `Swiss_Roll_2.png` and `Results_Swiss_Roll_2.npz`, which
you will include in your submission.

* You may play with other commands by (1) removing `--save` (2) changing `--out_dim 2`
to `--out_dim 1`. You may also remove `--display` if you don’t want to display the figure.

Extension: nonlinear dimensionality reduction (No credit, but please do it)

We have implemented this for you!

Above you see that PCA cannot preserve the similarity (neighbors) along the Swiss
Roll. Nonlinear dimensionality reduction algorithms aim to preserve the neighbors. Here
you are to play with one such algorithm, Laplacian Eigenmap (LE).

* Run the following command

python3 DR.py --data Swiss_Roll --method LE --out_dim 2 --display

This command will run LE on the 3D Swiss Roll dataset to reduce the dimensionality to
2D. You will also see a figure showing the data before and after LE. Points of similar
colors are similar to each other (following the Swiss Roll shape in and out). You will
see that LE can preserve the similarity, unfolding the roll. See Swiss_LE.png in the
for_display folder.

VII. Carmen submission

Submit:

* Your completed python script `DR.py`

* Your 6 generated results: `MNIST_2.png`, `Swiss_Roll_2.png`, `toy_data_2.png`,
`Results_MNIST_2.npz`, `Results_Swiss_Roll_2.npz`, and
`Results_toy_data_2.npz`.

* collaboration.txt, which lists with whom you have discussed the homework (see
more details at the beginning of this write-up under Collaboration & Piazza).

Submit your solution to this programming portion as one .zip file named
HW3_name_dotnumber.zip (e.g., HW3_shareef_1.zip).