
COMP3308/3608 Artificial Intelligence
Week 8 Tutorial exercises: Perceptrons. Multilayer Neural Networks
Exercise 1. Perceptron learning (Homework)
The homework for this week consists of 5 multiple-choice questions, each with two possible answers. To complete the homework, go to Canvas -> Quizzes -> w8-homework-submission. Remember to press the “Submit” button at the end. Only 1 attempt is allowed.
For each question below, select the correct answer:
1. Given are the following 2-dimensional examples from two classes (class I and class II).
class I: p1 = [1 1], p2 = [-1 -1]
class II: p3 = [2 2]
Can a perceptron learn to distinguish between them? Yes No
2. (The same question as above but for different examples.) Given are the following 2-dimensional examples from two classes (class I and class II).
class I: p1 = [1 1], p2 = [1 -1]
class II: p3 = [1 0]
Can a perceptron learn to distinguish between them? Yes No
3. If the data is linearly separable, the perceptron will optimize the decision boundary and is guaranteed to find the best possible boundary that separates the examples in a finite number of steps.
True False
4. A single perceptron can solve the XOR problem.
True False
5. The perceptron’s learning rule reflects exactly the operation of human neurons.
True False
COMP3308/3608 Artificial Intelligence, s1 2021

Exercise 2. Perceptron learning rule
Given is the following training set:
a) Train by hand a perceptron with a bias on this training set. Assume that all initial weights (including the bias of the neuron) are 0. Show the set of weights (including the bias) at the end of each iteration. Apply the examples in the given order.
Stopping criterion: all examples are correctly classified, or a maximum of 2 epochs is reached. Reminder: the stopping criterion is checked at the end of each epoch. To check the first part, apply each training example, check whether it is correctly classified, and stop if all examples are correctly classified. Note that when doing this there is no weight change; you are just calculating the actual output and comparing it with the target output.
Use the step function and note that step(0) = 1:
step(n) = 1, if n >= 0
        = 0, otherwise.
b) Which stopping condition was satisfied? If it was the first one, how many epochs were needed?
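If you want to check your hand calculations, the learning rule itself is only a few lines of code. Below is a minimal Python sketch (Python rather than Matlab, since the toolbox is not needed here) of a perceptron with a bias and a step function with step(0) = 1. It is run on the AND function purely as an illustration, not on this exercise's training set:

```python
# Perceptron learning rule with a bias:
#   w <- w + (t - a) * p,   b <- b + (t - a)
# where t is the target output and a the actual output.

def step(n):
    return 1 if n >= 0 else 0   # note: step(0) = 1

def train_perceptron(examples, n_inputs, max_epochs=2):
    w = [0.0] * n_inputs   # all weights start at 0, as in part a)
    b = 0.0                # the bias also starts at 0
    for epoch in range(max_epochs):
        for p, t in examples:               # apply examples in the given order
            a = step(sum(wi * pi for wi, pi in zip(w, p)) + b)
            e = t - a                       # error = target - actual
            w = [wi + e * pi for wi, pi in zip(w, p)]
            b = b + e
        # stopping criterion, checked at the end of each epoch:
        # all examples correctly classified (no weight change here)
        if all(step(sum(wi * pi for wi, pi in zip(w, p)) + b) == t
               for p, t in examples):
            return w, b, epoch + 1
    return w, b, max_epochs

# Illustrative run on the (linearly separable) AND function:
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b, epochs = train_perceptron(and_data, n_inputs=2, max_epochs=10)
```

The same train_perceptron function can be run on the exercise's training set (with max_epochs=2) to verify your answers to a) and b).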
Perceptron – Matlab exercise
Some of the exercises below are based on Matlab’s Neural Network Toolbox. This toolbox allows you to build and test neural network prototypes quickly. The goal of these exercises is to illustrate some important neural network concepts. You are not expected to know how to program in Matlab; we will use Matlab only for demonstration, and only in this week and the next.
How to install Matlab on your computer (it is free for students) – 2 download options:
Option 1. Via the University’s webpage
Step 1: Go to the University’s Matlab page to download Matlab (VPN needed) http://softserv.usyd.edu.au/data/MatLab/
Option 2. Via Matlab’s webpage (no VPN needed)
Step 1: Go to the Matlab’s webpage: https://protect-au.mimecast.com/s/A05wCk81N9t48DnmT255Br?domain=au.mathworks.com
Step 2: Create an account using your student email.
Step 3: Download the latest version of MATLAB and use your username and password to activate it during the installation process.
Note on the neural networks implemented in Weka: Weka includes the multilayer perceptron and other neural networks (e.g. RBF networks – not covered in this course). It does not include the single perceptron, but it does include the “Voted Perceptron”, an algorithm developed by Freund and Schapire which combines the decisions of several perceptrons by voting, i.e. it is an ensemble of classifiers. We will study ensembles of classifiers later in the course.

Training set for Exercise 2:
ex. 1: input = [1 0 0], output = 1
ex. 2: input = [0 1 1], output = 0
ex. 3: input = [1 1 0], output = 1
ex. 4: input = [1 1 1], output = 0
ex. 5: input = [0 0 1], output = 0
ex. 6: input = [1 0 1], output = 1
Exercise 3. Banana/orange classification using perceptron
Start Matlab. Type nnd3pc to run the demo. This is the banana/orange example from the lecture. A perceptron has already been trained to classify the fruits into 2 classes; the demo shows the classification of new fruits. Follow the instructions of the demo and observe the calculations at each step during the classification. Make sure that you switch on the sound for this exercise!
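If you cannot run the demo, the classification stage it animates can be sketched in a few lines of Python. The feature encoding and weights below are illustrative assumptions, not necessarily the values used in nnd3pc: each fruit is a vector of sensor readings (shape, texture, weight), and the trained perceptron classifies it with one dot product and one step:

```python
def step(n):
    return 1 if n >= 0 else 0

# Hypothetical trained parameters (illustrative, not the demo's exact values).
# Features: [shape, texture, weight], each sensor reading is +1 or -1.
w = [0, 1, 0]   # this perceptron happens to look only at the texture sensor
b = 0

def classify(p):
    """Return 1 for a banana-like fruit, 0 for an orange-like fruit."""
    return step(sum(wi * pi for wi, pi in zip(w, p)) + b)

banana_like = [-1, 1, -1]   # illustrative sensor readings for a banana
orange_like = [1, -1, -1]   # illustrative sensor readings for an orange
```

Classification of a new fruit is just this one weighted sum followed by the step function, which is what the demo walks through step by step.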
Exercise 4. Perceptron decision boundaries
In Matlab, type nnd4db to run the demo.
Move the perceptron decision boundary by dragging its handles and try to separate the examples (white and black circles) into 2 classes, i.e. so that none of them has red edges. You can insert more examples by dragging the white and black circles from the left part. The weights and bias will take the values associated with the new boundary. What is the relation between the decision boundary and the weight vector?
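To check your answer to the last question numerically, here is a small Python sketch (with an arbitrary weight vector and bias, not values taken from the demo). It picks two points on the decision boundary w1*x1 + w2*x2 + b = 0 and computes the dot product between the direction along the boundary and the weight vector:

```python
# Arbitrary illustrative perceptron parameters.
w = (2.0, 3.0)
b = -6.0

def on_boundary(x1):
    # solve w1*x1 + w2*x2 + b = 0 for x2 (assumes w2 != 0)
    return (-b - w[0] * x1) / w[1]

# two points on the decision boundary
p = (0.0, on_boundary(0.0))      # (0, 2)
q = (3.0, on_boundary(3.0))      # (3, 0)

# direction along the boundary, and its dot product with w
direction = (q[0] - p[0], q[1] - p[1])
dot = w[0] * direction[0] + w[1] * direction[1]
```

The dot product comes out as 0, i.e. the weight vector is perpendicular to the decision boundary, which is the geometric relation the question asks about.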
Exercise 5. Perceptron learning
In Matlab, type nnd4pr to run the demo.
1) Press Train to apply the perceptron rule 5 times. Each time an example is applied, the weights are adapted (their values are shown) and the new decision boundary is plotted.
2) Learning linearly separable and linearly non-separable patterns
a) Add by dragging some new white and black points to define a linearly separable problem. Train the perceptron. Will the perceptron be able to separate the patterns into 2 classes?
b) Now create a linearly non-separable problem. The same question as above.
3) Try learning with and without a bias. What is the influence of the bias? What happens if there is no bias, i.e. if b = 0?
Exercise 6. Perceptron – the importance of the bias weight
A follow-up from the previous exercise, here we again focus on the meaning and importance of the bias weight.
Suppose that you need to solve the classification problem shown below using a perceptron. The target of 0 is represented with a white circle and the target of 1 with a black circle.
a) Draw a possible decision boundary.

b) Can this problem be solved with a perceptron with a step transfer function without a bias? Why?
Reminder:
step(n) = 1, if n >= 0
        = 0, otherwise.
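A quick numerical way to explore part b), as a Python sketch (assuming the step function from the reminder, with step(0) = 1): with no bias, the net input at the origin is always 0, so the origin is classified as 1 for every possible weight vector, and the decision boundary w1*x1 + w2*x2 = 0 always passes through the origin.

```python
import random

def step(n):
    return 1 if n >= 0 else 0   # step(0) = 1, as in the reminder

random.seed(0)
for _ in range(100):
    # random weights, no bias term
    w = [random.uniform(-5, 5), random.uniform(-5, 5)]
    net_at_origin = w[0] * 0 + w[1] * 0      # always 0, whatever w is
    assert step(net_at_origin) == 1          # origin is always classified as 1
```

Whether this restriction matters for the problem above depends on where the two classes sit relative to the origin.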
Exercise 7. Perceptron and the XOR problem (Advanced students only)
Show analytically that the perceptron cannot solve the XOR problem. Hint: Use a system of inequalities.
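As a starting point for the hint (a sketch of the setup, not the full solution): a perceptron with weights w1, w2 and bias b, using step(0) = 1, would have to satisfy one inequality per XOR pattern:

(0,0) -> 0:  b < 0
(0,1) -> 1:  w2 + b >= 0
(1,0) -> 1:  w1 + b >= 0
(1,1) -> 0:  w1 + w2 + b < 0

Adding the second and third inequalities and comparing the result with the first and fourth leads to a contradiction, which is the shape of the argument asked for.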
Multilayer perceptron trained with the backpropagation algorithm – Matlab exercises
Exercise 8. Backpropagation calculation for 2-layer network
In Matlab, type nnd11bc to run the demo.
Press the first button and observe the basic stages of the backpropagation algorithm: applying an example, output calculation, error calculation, error propagation, and weight update.
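The same stages can be written out for a tiny 1-2-1 network with logsig units. This is a Python sketch with illustrative initial weights and learning rate (not the values shown in nnd11bc), performing exactly one backpropagation step for the squared error (t - a)^2:

```python
import math

def logsig(n):
    return 1.0 / (1.0 + math.exp(-n))

# illustrative initial weights and biases
w1 = [[0.5], [-0.5]]; b1 = [0.1, 0.2]   # hidden layer: 2 neurons, 1 input
w2 = [0.3, -0.2];     b2 = 0.0          # output layer: 1 neuron
p, t = 1.0, 0.5                          # training example and target
lr = 0.1                                 # learning rate

# 1) apply the example: forward pass
n1 = [w1[i][0] * p + b1[i] for i in range(2)]
a1 = [logsig(n) for n in n1]
n2 = sum(w2[i] * a1[i] for i in range(2)) + b2
a2 = logsig(n2)                          # 2) output calculation

e = t - a2                               # 3) error calculation

# 4) error propagation: sensitivities s = dE/dn, using logsig' = a*(1-a)
s2 = -2 * e * a2 * (1 - a2)
s1 = [a1[i] * (1 - a1[i]) * w2[i] * s2 for i in range(2)]

# 5) weight update (steepest descent)
w2 = [w2[i] - lr * s2 * a1[i] for i in range(2)]
b2 = b2 - lr * s2
w1 = [[w1[i][0] - lr * s1[i] * p] for i in range(2)]
b1 = [b1[i] - lr * s1[i] for i in range(2)]
```

Running the forward pass again after the update gives an output slightly closer to the target, which is what repeating the demo's step also shows.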
Exercise 9: Error space and convergence; local and global minima
In Matlab, type nnd12sd1, or select “Steepest Descent Backpropagation” from Demos -> Toolboxes -> Neural Networks (upper left window).
Use the radio buttons to select two parameters. The corresponding error surface will be plotted. Select a starting point and observe the convergence process. Is the backpropagation algorithm guaranteed to find the global minimum?
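The behaviour you should observe can be reproduced with a few lines of Python: steepest descent on a simple 1-D "error surface" with two minima ends up in different minima depending on the starting point. The function below is illustrative, not the surface plotted by nnd12sd1:

```python
def f(x):
    # illustrative error surface with a global minimum (x < 0)
    # and a shallower local minimum (x > 0)
    return x**4 - 4 * x**2 + x

def df(x):
    # gradient of f
    return 4 * x**3 - 8 * x + 1

def descend(x, lr=0.01, steps=2000):
    # steepest descent from starting point x
    for _ in range(steps):
        x = x - lr * df(x)
    return x

x_left = descend(-2.0)    # starting on the left: ends near the global minimum
x_right = descend(2.0)    # starting on the right: ends near the local minimum
```

Both runs converge to a minimum, but f(x_left) < f(x_right): gradient descent only finds the minimum of the basin it starts in.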
step(n)=1 if n0 0 if n0
4