2012/10/25

Introduction to MATLAB Neural Network Toolbox
Static Neural Networks
2012/10/22

How to Use Neural Network Toolbox
- M-file Editor or Command Window
- Graphical User Interface (GUI): >> nntool
- Simulink: >> Neural Network Toolbox
Basic Concept
- Data collection: training data, testing data
- Network creation: static networks, dynamic networks
- No. of parameters: hidden layers, neurons, ...
- Training parameters: epochs, learning rate, ...

[Figures: a single-neuron model — inputs x1 ... xn, synaptic weights w1 ... wn, summation in the cell body, an activation function, and the axon output — and a training/testing scheme in which the network output y is compared with the desired output of a system driven by input u.]
Data Preprocessing & Postprocessing
Normalization formula:
    Y = Ymin + (Ymax - Ymin) * (X - Xmin) / (Xmax - Xmin)
Process matrices by mapping row minimum and maximum values to [Ymin, Ymax].
Syntax: mapminmax
>> [Y, PS] = mapminmax(X);
   (PS: process settings that allow consistent processing of further values)
>> [Y, PS] = mapminmax(X, Ymin, Ymax);
>> X = mapminmax('reverse', Y, PS);
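The row-wise mapping that mapminmax performs can be sketched in NumPy for readers without MATLAB. The function names below are illustrative, not part of any library; the "process settings" are just the stored row minima/maxima:

```python
import numpy as np

def mapminmax_apply(X, ymin=-1.0, ymax=1.0):
    """Map each row of X linearly so its min/max land on [ymin, ymax]."""
    xmin = X.min(axis=1, keepdims=True)
    xmax = X.max(axis=1, keepdims=True)
    Y = ymin + (ymax - ymin) * (X - xmin) / (xmax - xmin)
    ps = {"xmin": xmin, "xmax": xmax, "ymin": ymin, "ymax": ymax}
    return Y, ps

def mapminmax_reverse(Y, ps):
    """Invert the mapping using the stored process settings (cf. 'reverse')."""
    return ps["xmin"] + (Y - ps["ymin"]) * (ps["xmax"] - ps["xmin"]) \
           / (ps["ymax"] - ps["ymin"])

X = np.array([[0., 5., 10.],
              [2., 3., 4.]])
Y, ps = mapminmax_apply(X)        # each row mapped to [-1, 1]
Xr = mapminmax_reverse(Y, ps)     # recovers the original X
print(Y)
print(Xr)
```

Keeping PS around is what makes the preprocessing consistent: test data must be mapped with the settings computed from the training data, not its own min/max.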
Neural Network Models
1. Perceptron
2. Linear Filters
3. Backpropagation
4. Radial Basis Networks
5. Competitive Networks
6. Learning Vector Quantization Networks
7. Recurrent Networks
8. NARX Networks
Transfer Function Graphs
Learning Algorithms
Feedforward Neural Networks
Notation:
- Scalars: a, b, c (lowercase italic)
- Vectors: a, b, c (lowercase bold)
- Matrices: A, B, C (uppercase bold)
P (r×N): input data (r: no. of input elements; N: no. of input patterns)
T (m×N): output data (m: no. of output elements)
n: number of layers, including the hidden and output layers
IW(k,l): input weight matrix (lth input set to kth layer)
LW(k,l): layer weight matrix (lth layer to kth layer)
Feedforward Neural Networks
Syntax: newff
net = newff(P, T, [S1 ... S(N-1)], {TF1 ... TFN}, BTF)
- Si: size of the ith layer, for the N-1 hidden layers (default = []).
- TFi: transfer function of the ith layer (default = 'tansig' for hidden layers, 'purelin' for the output layer).
- BTF: backpropagation network training function (default = 'trainlm').
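The two-layer network that newff builds by default — a tansig hidden layer feeding a purelin output layer — amounts to the forward pass below, sketched in NumPy. The layer sizes and random weights here are hypothetical placeholders for IW{1,1}, LW{2,1}, b1, b2:

```python
import numpy as np

def tansig(n):
    # MATLAB's tansig is numerically the hyperbolic tangent
    return np.tanh(n)

def purelin(n):
    # identity (linear) transfer function
    return n

# Hypothetical sizes: r inputs, S1 hidden neurons, m outputs
r, S1, m = 3, 5, 2
rng = np.random.default_rng(1)
IW11 = rng.standard_normal((S1, r)); b1 = rng.standard_normal((S1, 1))
LW21 = rng.standard_normal((m, S1)); b2 = rng.standard_normal((m, 1))

P = rng.standard_normal((r, 4))          # 4 input patterns, one per column
A1 = tansig(IW11 @ P + b1)               # hidden-layer output, S1 x 4
Y = purelin(LW21 @ A1 + b2)              # network output, m x 4
print(Y.shape)
```

Note the column-per-pattern convention: P is r×N and the output is m×N, matching the notation on the previous slide.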
Feedforward Neural Networks
Example 1: approximate t = abs(p)
Training input data:  P = [0 -1 2 -3 4 -5 6 -7 8 -9 10];
Training output data: T = [0 1 2 3 4 5 6 7 8 9 10];
>> clc; clear all; close all;
>> % Collect data
>> P = [0 -1 2 -3 4 -5 6 -7 8 -9 10];
>> T = [0 1 2 3 4 5 6 7 8 9 10];
>> % Create network (one hidden layer with 10 neurons)
>> net = newff(P, T, 10);
>> net
>> % Training
>> net = train(net, P, T);
>> % Testing
>> y = sim(net, P);
>> % Error
>> error = mse(y - T)
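For readers without the toolbox, Example 1 can be mimicked in NumPy: a minimal 1-10-1 network fitted by plain batch gradient descent. This is a sketch in spirit only — train() uses trainlm (Levenberg–Marquardt) by default, and the normalization, learning rate, and epoch count below are illustrative choices, not toolbox defaults:

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0., -1., 2., -3., 4., -5., 6., -7., 8., -9., 10.]])  # 1 x 11
T = np.abs(P)                          # target: t = abs(p)

Pn, Tn = P / 10.0, T / 10.0            # crude normalization (cf. mapminmax)
N = P.shape[1]

W1 = 0.5 * rng.standard_normal((10, 1)); b1 = np.zeros((10, 1))
W2 = 0.5 * rng.standard_normal((1, 10)); b2 = np.zeros((1, 1))
lr, epochs = 0.1, 8000                 # illustrative hyperparameters

def forward(X):
    A1 = np.tanh(W1 @ X + b1)          # hidden layer (tansig)
    return W2 @ A1 + b2, A1            # linear output layer (purelin)

mse0 = np.mean((forward(Pn)[0] - Tn) ** 2)
for _ in range(epochs):
    A1 = np.tanh(W1 @ Pn + b1)
    Y = W2 @ A1 + b2
    E = Y - Tn
    # backpropagate the mean-squared-error gradient
    dW2 = E @ A1.T / N
    db2 = E.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ E) * (1.0 - A1 ** 2) # tanh'(z) = 1 - tanh(z)^2
    dW1 = dZ1 @ Pn.T / N
    db1 = dZ1.mean(axis=1, keepdims=True)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

msef = np.mean((forward(Pn)[0] - Tn) ** 2)
print(mse0, msef)                      # training reduces the MSE
```

The abs function has a kink at zero, so the smooth tanh network only approximates it; the MSE drops sharply but never reaches exactly zero.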
Train Feedforward Neural Networks
Set training parameter values via net.trainParam:
>> net.trainParam.epochs = 100;
>> net.trainParam.show = 25;    % or NaN to suppress the training display
>> net.trainParam.lr = 0.01;
>> net.trainParam.goal = 0;
Train the network:
>> [net, tr] = train(net, P, T);
Simulate or test the network:
>> y = sim(net, Pt);
Testing input data:  Pt = [0 1 -2 3 -4 5 -6 7 -8 9 -10];
Testing output data: Tt = [0 1 2 3 4 5 6 7 8 9 10];
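The interplay of epochs, lr, and goal can be illustrated with a hand-rolled loop for a single linear neuron fitting y = 2x (a toy sketch of the stopping logic, not the toolbox's training algorithm): training stops as soon as the MSE drops below goal, or when the epoch budget runs out.

```python
import numpy as np

x = np.arange(0.0, 11.0)          # toy inputs 0..10
t = 2.0 * x                       # toy targets: y = 2x
w = 0.0                           # single weight, no bias

epochs, lr, goal = 100, 0.01, 1e-6   # cf. net.trainParam fields

for epoch in range(1, epochs + 1):
    e = t - w * x                 # per-pattern error
    mse = np.mean(e ** 2)         # performance measure
    if mse <= goal:               # goal reached -> stop early
        break
    w += lr * np.mean(e * x)      # batch LMS update

print(epoch, w, mse)              # converges to w ~= 2 before 100 epochs
```

With lr = 0.01 the error contracts geometrically each epoch, so the goal is hit long before the epoch limit; a larger goal (or smaller lr) changes which criterion fires first.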
Graphical User Interface (GUI)
>> nntool
Practice 1
Create a feedforward neural network to perform the iris classification problem.
- Show the network structure and all learning results, including the learning curves with different user-specified parameters for the training and test data, the training parameters, the resulting membership functions, and/or the weight changes.
- Show the recognition results for the neural network.
The training/testing data are available on the course website (http://140.116.215.51).
'practice1.mat' includes:
- P (4x135): training inputs
- T (1x135): training targets
- Pt (4x15): testing inputs
- Tt (1x15): testing targets
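Since practice1.mat lives on the course site, the overall workflow can be sketched with synthetic stand-in data of the same shapes (4×135 training, 4×15 testing, labels 1–3). A nearest-centroid baseline substitutes for the neural network here purely to show the shape conventions; it is not the classifier the practice asks for:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for practice1.mat (NOT the real iris data):
# three well-separated classes, 4 features, 45/5 patterns per class.
centers = np.array([[0., 0., 0., 0.],
                    [5., 5., 5., 5.],
                    [10., 10., 10., 10.]])
P  = np.hstack([c[:, None] + 0.5 * rng.standard_normal((4, 45))
                for c in centers])               # 4 x 135 training inputs
T  = np.repeat([1, 2, 3], 45)                    # 135 training labels
Pt = np.hstack([c[:, None] + 0.5 * rng.standard_normal((4, 5))
                for c in centers])               # 4 x 15 testing inputs
Tt = np.repeat([1, 2, 3], 5)                     # 15 testing labels

# Nearest-centroid classifier: one mean vector per class
centroids = np.stack([P[:, T == k].mean(axis=1) for k in (1, 2, 3)])  # 3 x 4

# Classify each test column by its closest centroid
d = ((Pt.T[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)     # 15 x 3
pred = d.argmin(axis=1) + 1
acc = np.mean(pred == Tt)
print(acc)
```

For the actual assignment, the same column-per-pattern layout feeds directly into newff/train/sim in MATLAB, with recognition accuracy computed on (Pt, Tt) exactly as acc is computed above.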