You can use the Theano package (a Python library from the University of Montreal, available for Linux/MacOS/Windows), any other package, or your own implementation. Perform the following tasks:
- a) On the miniboone dataset, train a neural network with one hidden layer, with k neurons in the hidden layer, ReLU activation functions for the hidden layer, and no activation function for the output layer. For each of the four values of k, find an appropriate learning rate and minibatch size that give a small final loss value on the training set after 100-300 epochs. Report in a table the misclassification errors of the four models on the training and test sets. Since the miniboone data does not come with a designated training/test split, report results as the average over 10 independent random splits, each using a random subsample of 80% of the data for training and the remaining 20% for testing. A minimal training sketch and a split-evaluation sketch are given after this list.
- b) Repeat point a) with a neural network with two hidden layers, with 128 neurons in the first hidden layer, k neurons in the second hidden layer, and ReLU activation functions for both hidden layers (see the forward-pass sketch after this list).
- c) Repeat point a) on the madelon dataset. For madelon you do not need to do the random splits; just use the provided training and test sets.
- d) Repeat point b) on the madelon dataset.
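
Since the assignment allows your own implementation in place of Theano, below is a minimal NumPy sketch of the model in point a): one ReLU hidden layer of width k, a linear output, and minibatch SGD. The function names, the He-style initialization, and the squared-error loss against ±1 labels are illustrative assumptions, not requirements of the task; the hidden width k, the learning rate, and the minibatch size are exactly the hyperparameters you are asked to tune.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def train_one_hidden(X, y, k, lr=0.01, batch=64, epochs=200, seed=0):
    """One-hidden-layer ReLU net with a linear output, trained by minibatch SGD.
    Assumes labels y in {-1, +1}; squared error is one reasonable loss choice
    for a linear output layer (an assumption, not mandated by the task)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, np.sqrt(2.0 / d), (d, k))  # He-style init for ReLU
    b1 = np.zeros(k)
    w2 = rng.normal(0.0, np.sqrt(1.0 / k), k)
    b2 = 0.0
    for _ in range(epochs):
        # one pass over the data in shuffled minibatches
        for idx in np.array_split(rng.permutation(n), max(1, n // batch)):
            Xb, yb = X[idx], y[idx]
            H = relu(Xb @ W1 + b1)              # hidden activations
            out = H @ w2 + b2                   # linear output
            g = 2.0 * (out - yb) / len(idx)     # d(mean sq. error)/d(out)
            gH = np.outer(g, w2) * (H > 0)      # backprop through ReLU
            w2 -= lr * (H.T @ g)
            b2 -= lr * g.sum()
            W1 -= lr * (Xb.T @ gH)
            b1 -= lr * gH.sum(axis=0)
    return W1, b1, w2, b2

def misclassification(X, y, params):
    """Fraction of examples where sign(output) disagrees with the +/-1 label."""
    W1, b1, w2, b2 = params
    pred = np.sign(relu(X @ W1 + b1) @ w2 + b2)
    return float(np.mean(pred != y))
```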
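The 10-split evaluation protocol for miniboone can then be a thin loop over the sketch above. Again the helper names are hypothetical, and the hyperparameters passed to train_one_hidden should be the ones you tuned for each k.

```python
def evaluate_splits(X, y, k, n_splits=10, seed=0):
    """Average train/test misclassification over 10 independent random splits,
    each using 80% of the data for training and 20% for testing, as point a)
    specifies for miniboone."""
    rng = np.random.default_rng(seed)
    n = len(y)
    tr_errs, te_errs = [], []
    for s in range(n_splits):
        perm = rng.permutation(n)
        cut = int(0.8 * n)
        tr, te = perm[:cut], perm[cut:]
        params = train_one_hidden(X[tr], y[tr], k, seed=s)  # use your tuned lr/batch
        tr_errs.append(misclassification(X[tr], y[tr], params))
        te_errs.append(misclassification(X[te], y[te], params))
    return np.mean(tr_errs), np.mean(te_errs)
```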
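For point b), only the forward pass (and correspondingly the backward pass) changes. A sketch of the two-hidden-layer forward pass, reusing the relu helper above, with illustrative parameter names:

```python
def forward_two_hidden(X, W1, b1, W2, b2, w3, b3):
    """Forward pass for point b): a 128-unit first hidden layer, a k-unit second
    hidden layer, both ReLU, and a linear output. The training loop is analogous
    to the one-hidden-layer sketch, with one extra backprop step."""
    H1 = relu(X @ W1 + b1)    # first hidden layer, 128 units
    H2 = relu(H1 @ W2 + b2)   # second hidden layer, k units
    return H2 @ w3 + b3       # linear output, no activation
```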
Dataset Details