IV. Materials provided
A zip file is provided for you on Carmen called HW3_Programming.zip. Download and
extract the files onto your computer.
You will use the code from the zip file provided with the previous homework (HW #3); please download that zip file again if you need to.
You will also find a file called `LR.py` to download from Carmen, in the same place where you found this assignment description.
You will use python 3 to run your solution.
You will experiment with the linear data and the quadratic data. See `linear_1.png` and the `quadratic_*.png` images in the `for_display` folder.
V. Preliminaries
As you did in HW #3, you will use NumPy (https://numpy.org/) extensively in this homework. NumPy is a library for the Python programming language, adding support for large, multi-dimensional arrays and matrices, along with a large collection of high-level mathematical functions to operate on these arrays. NumPy has many great functions and operations that will make your implementation much easier.
If you are not familiar with NumPy, we recommend that you read this tutorial (https://cs231n.github.io/python-numpy-tutorial/) or some other tutorials online, and then get familiar with it by looking at code examples.
We have provided some useful NumPy operations that you may want to use in `numpy_example.py`. You may want to comment out all the lines first, and then execute them one by one or in groups to see the results and the differences. You can run it with the command `python3 numpy_example.py`.
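As a generic illustration (separate from `numpy_example.py`), the short sketch below shows the kinds of operations this assignment relies on: array shapes, column indexing, matrix multiplication, transposition, and element-wise means.

```python
import numpy as np

# A 2-by-4 "design matrix": each column is one data instance with 2 features.
X = np.array([[1.0, 2.0, 3.0, 4.0],
              [5.0, 6.0, 7.0, 8.0]])
w = np.array([[0.5], [-1.0]])    # a 2-by-1 weight vector
b = 3.0                          # a scalar bias

print(X.shape)                   # (2, 4)
print(X[:, 0])                   # first data instance (index 0!): [1. 5.]

# w^T X + b gives one prediction per column (a 1-by-4 array).
pred = np.matmul(w.transpose(), X) + b
print(pred)

# Mean squared difference against some targets Y (shape 4-by-1 here).
Y = np.array([[1.0], [0.0], [-1.0], [-2.0]])
mse = np.mean((pred - Y.transpose()) ** 2)
print(mse)
```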
* In `LR.py`, we also provide some more instructions and hints for what functions or operations you may want to use.
* Caution! python and NumPy’s indices start from 0. That is, to get the first element in a vector, the index is 0 rather than 1.
VI. Task (25 pts)
You are to add your implementation into `LR.py`.
There are many sub-functions in `LR.py`. You can ignore all of them but `def linear_regression(X, Y)` and `def main(args)`. In `main(args)`, you will see a general pipeline of machine learning:
– Loading data: `X_original, Y = data_loader(args)`, in which `X_original` is a 1-by-N matrix (numpy array) and each column is a data instance. You can type `X_original[:, 0]` to extract the “first” data instance from `X_original`. (Caution! python and numpy’s indices start from 0. That is, to get the first element in a vector, the index is 0 rather than 1.)
– Feature transform: `X = polynomial_transform(np.copy(X_original), args.polynomial)` extends each column of `X_original` to its polynomial representation. For example, given a scalar x, this transform extends it to [x, x^2, …, x^(args.polynomial)]^T (see the illustrative sketch after this list).
– Learning patterns: `w, b = linear_regression(X, Y)`, in which the code takes `X` and the desired labels `Y` as input and outputs the weights `w` and the bias `b`.
– Apply the learned patterns to the data: `training_error = np.mean((np.matmul(w.transpose(), X) + b - Y.transpose()) ** 2)` and `test_error = np.mean((np.matmul(w.transpose(), X_test) + b - Y_test.transpose()) ** 2)` compute the training and test errors.
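To build intuition for the feature-transform step, here is a minimal, hypothetical sketch of what a degree-`p` polynomial transform does to a 1-by-N input. The real `polynomial_transform` is already provided in `LR.py`; you do not need to write it, and its implementation may differ.

```python
import numpy as np

def polynomial_transform_sketch(X, degree):
    """Illustrative only: map a 1-by-N array to a degree-by-N array
    whose rows are x, x^2, ..., x^degree for each column x."""
    # np.vstack stacks the element-wise powers row by row.
    return np.vstack([X ** p for p in range(1, degree + 1)])

X_original = np.array([[1.0, 2.0, 3.0]])        # 1-by-3
print(polynomial_transform_sketch(X_original, 3))
# [[ 1.  2.  3.]
#  [ 1.  4.  9.]
#  [ 1.  8. 27.]]
```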
Coding (15/25 pts):
You have one part to implement in `LR.py`:
* The function `def linear_regression(X, Y)`: please go to the function and read the input format, output format, and the instructions carefully. You can assume that the actual inputs will follow the input format, and your goal is to generate the required numpy arrays (`w` and `b`), the weights and bias of linear regression. Please make sure that your results follow the required numpy array shapes. You are to implement your code between `### Your job starts here ###` and `### Your job ends here ###`. You are free to add more lines between those two markers; we include them only to indicate explicitly where your implementation should go. (A hedged sketch of one common approach follows this bullet.)
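If you are unsure how to begin, one standard approach (not necessarily the only acceptable one) is the closed-form least-squares solution via the normal equations, with a constant-1 row appended so the bias is learned jointly with the weights. The sketch below assumes `X` is d-by-N with one instance per column and `Y` is N-by-1, as in the pipeline above; confirm the exact required shapes of `w` and `b` against the docstring in `LR.py`.

```python
import numpy as np

def linear_regression_sketch(X, Y):
    """Closed-form least squares: minimize ||X_aug^T theta - Y||^2.

    Assumed shapes: X is d-by-N (one instance per column), Y is N-by-1.
    Returns w ((d)-by-1) and b (1-by-1).
    """
    d, N = X.shape
    # Append a row of ones so the bias is learned as the last entry of theta.
    X_aug = np.vstack([X, np.ones((1, N))])                     # (d+1)-by-N
    # Normal equations: theta = (X_aug X_aug^T)^{-1} X_aug Y
    A = np.matmul(X_aug, X_aug.transpose())                     # (d+1)-by-(d+1)
    theta = np.matmul(np.linalg.pinv(A), np.matmul(X_aug, Y))   # (d+1)-by-1
    w, b = theta[:d, :], theta[d:, :]
    return w, b
```

Using `np.linalg.pinv` (or `np.linalg.lstsq`) rather than an explicit matrix inverse keeps the computation numerically stable when the polynomial degree is large.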
Auto grader:
* You may run the following command to test your implementation
python3 LR.py --data simple --auto_grade
Note that the auto grader only checks the semantics of your implementation. If you have syntax errors, you may see Python error messages before the auto grader's results appear.
* Again, the auto grader simply checks your implementation on a toy case; it will not be used for your final grading.
Play with different datasets (Task 1 – linear data, 5/25 pts):
* Run the following command
python3 LR.py --data linear --polynomial 1 --display --save
This command will run linear regression on 1D linear data (the x-axis is the feature and the y-axis is the label). You will see the resulting `w` and `b` displayed in your command line. You will also see the training error (computed on the red points) and the test error (computed on the blue points).
* The code will generate `linear_1.png` and `Results_linear_1.npz`, which you will include in your submission.
* You may play with other commands by (1) removing `--save` and (2) changing `--polynomial 1` to another non-negative integer (e.g., 2, 3, …, 13). You will see that, while larger values lead to smaller training errors, the test error is not necessarily lower. For very large values, the test error can become very large.
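The same effect can be reproduced outside of `LR.py`. The standalone sketch below (purely illustrative, not part of the assignment code) fits polynomials of increasing degree to a small synthetic linear dataset with `np.polyfit` and compares training and held-out errors; typically the training MSE shrinks as the degree grows, while the held-out MSE does not.

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(-1, 1, 10)
x_test = np.linspace(-1, 1, 50)
# Noisy samples from the same underlying linear function y = 2x + 1.
y_train = 2 * x_train + 1 + 0.1 * rng.standard_normal(x_train.shape)
y_test = 2 * x_test + 1 + 0.1 * rng.standard_normal(x_test.shape)

for degree in [1, 3, 9]:
    coeffs = np.polyfit(x_train, y_train, degree)   # least-squares polynomial fit
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```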
Play with different datasets (Task 2 – quadratic, 5/25 pts):
* Run the following command
python3 LR.py --data quadratic --polynomial 2 --display --save
This command will run linear regression on 1D quadratic data (the x-axis is the feature and the y-axis is the label). The code will produce the polynomial = 2 representation of the data (i.e., `X` becomes 2-by-N). You will see the resulting `w` and `b` displayed in your command line. You will also see the training error (computed on the red points) and the test error (computed on the blue points).
* The code will generate `quadratic_2.png` and `Results_quadratic_2.npz`, which you will include in your submission.
* You may play with other commands by (1) removing `--save` and (2) changing `--polynomial 2` to another non-negative integer (e.g., 1, 3, …, 13). You will see that, while larger values lead to smaller training errors, the test error is not necessarily lower. For very large values, the test error can become very large.
Carmen submission
* Your completed python script `LR.py`
* Your 4 generated results for Question 2: `linear_1.png`, `quadratic_2.png`, `Results_linear_1.npz`, and `Results_quadratic_2.npz`
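Before uploading, you may want to sanity-check that the generated `.npz` files actually contain your results. The exact key names stored inside depend on how `LR.py` saves them, so the sketch below simply lists whatever is there rather than guessing names:

```python
import numpy as np

# Open one of the generated result files and list what it stores.
results = np.load("Results_linear_1.npz")
print(results.files)                 # the array names saved by LR.py
for name in results.files:
    print(name, results[name])
```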