
CSE 3521: Neural Networks


Perceptron
Multi-layer Perceptron (MLP)

Introduction to Neural Networks
Researchers in this area are also called “connectionists”

Introduction to Neural Networks

Feed-forward neural network


Perceptron
The brain classifies exceptionally well … let’s model it!!
A neuron collects impulses from other neurons
When a threshold is reached, it generates an impulse of its own

Perceptron (a single neuron for classification)

Step (sign)
Learnable weights


Perceptron
A linear classifier
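
To make “a linear classifier” concrete, here is a minimal sketch of the perceptron’s prediction rule (the function name and the use of NumPy are illustrative, not from the slides): the output is a step function of a weighted sum, so the decision boundary w·x − T = 0 is a straight line (a hyperplane in higher dimensions).

import numpy as np

def perceptron_predict(x, w, threshold):
    # Step-activation perceptron: output 1 iff the weighted sum exceeds the threshold.
    return 1 if np.dot(w, x) - threshold > 0 else 0

# With weights (1, 1) and threshold 1.5 this is the AND gate from a later slide.
print(perceptron_predict(np.array([1.0, 0.0]), np.array([1.0, 1.0]), 1.5))  # prints 0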

Perceptron: The NOT operator
“Active” when the correct input is given
“Inactive” when the wrong input is given

The Not Operator!

Threshold: The Bias Weight

W0·x0 + W1·x1 > T
⟺  W0·x0 + W1·x1 − T > 0   (the threshold T becomes a bias weight on a constant input)
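
A short illustrative sketch of the rewrite above: appending a constant input of 1 with weight −T (equivalently, an input of −1 with weight T) turns the threshold into an ordinary learnable bias weight.

import numpy as np

x = np.array([0.7, 0.2])     # original inputs x0, x1
w = np.array([1.0, 1.0])     # original weights W0, W1
T = 1.5                      # threshold

x_aug = np.append(x, 1.0)    # [x0, x1, 1]: add a constant input
w_aug = np.append(w, -T)     # [W0, W1, -T]: the threshold becomes a weight

# The two decision rules agree on every input.
assert (np.dot(w, x) - T > 0) == (np.dot(w_aug, x_aug) > 0)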

Perceptron: The AND operator

y(x0, x1) = 1 if x0 + x1 − 1.5 > 0
          = 0 otherwise

[Plot: the four Boolean inputs in the (x0, x1) plane, marked Output 0 or Output 1, separated by the line x0 + x1 = 1.5]

Perceptron: The OR operator

y(x0, x1) = 1 if x0 + x1 − 0.5 > 0
          = 0 otherwise

[Plot: the four Boolean inputs in the (x0, x1) plane, marked Output 0 or Output 1, separated by the line x0 + x1 = 0.5]
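
The NOT, AND, and OR slides all use the same recipe: fixed weights plus a threshold. A small sketch with hand-set weights matching the formulas above (the NOT weights are my own choice; the slide gives them only in a figure):

def step(z):
    return 1 if z > 0 else 0

def NOT(x0):
    # Active (output 1) only when x0 = 0: -x0 + 0.5 > 0 fails for x0 = 1.
    return step(-x0 + 0.5)

def AND(x0, x1):
    # Matches the slide: 1 iff x0 + x1 - 1.5 > 0.
    return step(x0 + x1 - 1.5)

def OR(x0, x1):
    # Matches the slide: 1 iff x0 + x1 - 0.5 > 0.
    return step(x0 + x1 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, NOT(a), AND(a, b), OR(a, b))   # truth tables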

Perceptron: The XOR operator

y(x0, x1) = 1 if ?? > 0
          = 0 otherwise

[Plot: the four Boolean inputs in the (x0, x1) plane, marked Output 0 or Output 1; no single line separates the two classes]

Perceptron: The XOR operator

[Plot: the XOR truth table in the (x0, x1) plane; the Output 0 and Output 1 points cannot be separated by one line]

A single perceptron with inputs (−1, x0, x1) and weights (W, W0, W1) outputs 1 iff W0·x0 + W1·x1 − W > 0. XOR would require:
(0, 0) → 0:  −W ≤ 0, i.e., W ≥ 0
(1, 0) → 1:  W0 − W > 0
(0, 1) → 1:  W1 − W > 0
(1, 1) → 0:  W0 + W1 − W ≤ 0

Adding the two middle inequalities gives W0 + W1 > 2W ≥ W, which contradicts the last one. There is no assignment of values to W, W0 and W1 that satisfies these inequalities: XOR cannot be represented!

Single Perceptron
Representation theorem (Minsky & Papert, 1969)

A single-layer perceptron cannot learn the simple XOR function (among other functions that are not linearly separable)

Widely misinterpreted as showing that artificial neural networks were inherently limited

The reputation of neural network research declined through the 70s and 80s

Perceptron Algorithm
Setup: let x ← [x, 1] and w ← [w, b], and ignore b from now on (the bias is folded into the weights, as on the bias-weight slide)

Initialize the weight vector w
Loop for T iterations
Loop over all training examples (in random order!)
If an example (x, y) is misclassified, update w ← w + y·x (see the sketch below)
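
A minimal sketch of the whole loop, assuming labels y in {−1, +1}, the bias folded into w as above, and the standard mistake-driven update w ← w + y·x (the update equation on the original slide did not survive extraction, so this reconstruction is an assumption):

import numpy as np

def perceptron_train(X, y, T=100, seed=0):
    # X: (n, d) inputs with a constant column appended; y: labels in {-1, +1}.
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])                 # initialize the weight vector
    for _ in range(T):                       # loop for T iterations
        for i in rng.permutation(len(X)):    # training examples in random order
            if y[i] * np.dot(w, X[i]) <= 0:  # example i is misclassified
                w = w + y[i] * X[i]          # nudge w toward that example
    return w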

Perceptron Algorithm
Why does it work? Intuition: when (x, y) is misclassified, adding y·x to w increases y·(w·x), so the decision boundary moves toward classifying that example correctly.

Perceptron Algorithm
Setup: let x ← [x, 1] and w ← [w, b], and ignore b from now on

Initialize weight vector
Online version: receive one training example at a time and update w immediately after each mistake

Perceptron Algorithm
Convergence:
If the data are linearly separable, the algorithm is guaranteed to converge to a separating weight vector (a standard bound is given below)

No convergence:
If the data are not linearly separable, the algorithm never converges; the weights keep being updated
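
For reference, a standard form of the convergence guarantee (the classic perceptron mistake bound; this statement is added context and does not appear on the slide):

% Perceptron convergence (Novikoff): with a unit-norm separator w* of margin gamma
% and inputs bounded by R, the number of updates (mistakes) is bounded.
\[
\text{If } \exists\, w^{*},\ \|w^{*}\|=1,\ \gamma>0 \text{ s.t. } y_i\,(w^{*})^{\top}x_i \ge \gamma
\ \text{ and } \|x_i\| \le R \ \ \forall i,
\ \text{ then } \#\text{mistakes} \le (R/\gamma)^{2}.
\]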

Perceptron
Multi-layer Perceptron (MLP)

“Artificial” Neuron
A basic “computational unit” of neural networks
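
A minimal sketch of one such unit (illustrative; the sigmoid here is a stand-in for whichever nonlinearity the slide’s figure uses): a weighted sum plus a bias, followed by a nonlinear activation.

import numpy as np

def neuron(x, w, b):
    # One artificial neuron: nonlinearity applied to a weighted sum plus a bias.
    z = np.dot(w, x) + b              # pre-activation
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid activation (a smooth version of the step)

print(neuron(np.array([1.0, 0.0]), np.array([2.0, -1.0]), -0.5))  # about 0.82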

Multi-layer Perceptron
XOR Solved!
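
One concrete way to see “XOR solved” (a sketch with hand-set weights, not necessarily the exact network on the slide): a hidden layer of two perceptrons computes OR and AND of the inputs, and the output perceptron fires when OR is on but AND is off.

def step(z):
    return 1 if z > 0 else 0

def xor_mlp(x0, x1):
    h_or  = step(x0 + x1 - 0.5)      # hidden unit 1: x0 OR x1
    h_and = step(x0 + x1 - 1.5)      # hidden unit 2: x0 AND x1
    return step(h_or - h_and - 0.5)  # output: OR but not AND = XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_mlp(a, b))   # prints the XOR truth table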

Multi-layer Perceptron
Complex Boolean Functions: y = (x0 ⊕ x1) ∧ x2

But remember, x0 XOR x1 = (x0 ∨ x1) ∧ ¬(x0 ∧ x1)

[Diagram: a layered network of perceptron gates computing y(x0, x1, x2)]

Multi-layer perceptron
Capable of learning nonlinear functions

How do we learn the weights?
The perceptron algorithm? (It only covers a single unit.)
Back-propagation
Mini-batch gradient descent (a minimal training sketch follows below)
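
As a concrete companion to these bullets, a minimal training sketch (illustrative only: NumPy, a sigmoid 2–4–1 MLP, squared-error loss, and a learning rate chosen by hand; none of these choices come from the slides) of back-propagation with gradient descent on the XOR data, where the four examples form a single mini-batch:

import numpy as np

rng = np.random.default_rng(0)

# XOR data: with only 4 examples, one mini-batch is the whole set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2 -> 4 -> 1 network with small random weights.
W1, b1 = rng.normal(0.0, 1.0, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0.0, 1.0, (4, 1)), np.zeros(1)
lr = 0.5

for epoch in range(10000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)          # hidden activations, shape (4, 4)
    P = sigmoid(H @ W2 + b2)          # predictions, shape (4, 1)

    # Backward pass (back-propagation of the squared-error loss).
    dZ2 = (P - Y) * P * (1 - P)       # gradient at the output pre-activation
    dW2, db2 = H.T @ dZ2, dZ2.sum(axis=0)
    dZ1 = (dZ2 @ W2.T) * H * (1 - H)  # gradient at the hidden pre-activation
    dW1, db1 = X.T @ dZ1, dZ1.sum(axis=0)

    # Mini-batch gradient descent step.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
# Typically close to [[0], [1], [1], [0]]; exact values depend on the random init.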

