MATLAB Neural Networks and Their Applications


6 Prediction and Analysis
Produce outputs with sim; retrain and produce outputs again; plot the two sets of results for comparison (see the sketch below).
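A minimal sketch of this step in MATLAB, assuming data_pre provides a matrix in_pre of new inputs with the same eight columns as in (the variable name in_pre is an assumption, not given in the slides):

p_pre = in_pre';                      % transpose: each column is one new sample
y_pre = sim(net, p_pre);              % outputs predicted by the trained network
net = train(net, p, t);               % retrain on the original samples
y_pre2 = sim(net, p_pre);             % predictions after retraining
figure; plot(y_pre', 'b-o'); hold on; plot(y_pre2', 'r-*');
legend('before retraining', 'after retraining');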
7 Program Implementation
clc; clear all; clear net;
load data;                            % sample data
load data_pre;                        % data for the prediction step
c1=in(:,1); c2=in(:,2); c3=in(:,3); c4=in(:,4);   % the eight input columns
c5=in(:,5); c6=in(:,6); c7=in(:,7); c8=in(:,8);
Description of the input argument data structures
Where:
Ni = net.numInputs
Nl = net.numLayers
Nt = net.numTargets
ID = net.numInputDelays
LD = net.numLayerDelays
TS = number of time steps
Q = batch size
Ri = net.inputs{i}.size
Si = net.layers{i}.size
Vi = net.targets{i}.size
c1_min=min(c1); c5_min=min(c5);
c2_min=min(c2); c6_min=min(c6);
c3_min=min(c3); c7_min=min(c7);
% c2_max=max(c2); % c6_max=max(c6);
% c3_max=max(c3); % c7_max=max(c7);
Determine the input samples, the target (standard) outputs, and the network training parameters (number of epochs): net.trainParam.epochs=100;
Call the network training command: net=train(net,p,t);
Simulate the outputs
Call y=sim(net,p) to simulate the outputs, then plot them against the targets for comparison (see the sketch below).
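A compact sketch of this flow; the extra trainParam fields (goal, show) and their values are assumptions beyond the epochs setting shown in the slides:

net.trainParam.epochs = 100;          % maximum number of training epochs
net.trainParam.goal   = 1e-4;         % performance goal (assumed value)
net.trainParam.show   = 10;           % report progress every 10 epochs
net = train(net, p, t);               % train the network
y = sim(net, p);                      % simulate outputs on the training inputs
figure; plot(t', 'b-o'); hold on; plot(y', 'r-*');
legend('target output', 'network output');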
Inspect the network parameters and weights
Type net at the command line to display the network object; its parameters and weights can then be referenced and inspected field by field, as shown below.
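For example, assuming a two-layer network created with newff, the weights and biases can be read from the standard fields of the network object:

net                                   % display the whole network object
net.IW{1,1}                           % input weights of layer 1
net.LW{2,1}                           % weights from layer 1 to layer 2
net.b{1}                              % biases of layer 1
net.b{2}                              % biases of layer 2
net.trainParam                        % current training parameters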
TRAIN(NET,P,T,Pi,Ai) takes,
NET - Network.
P - Network inputs.
T - Network targets, default = zeros.
Pi - Initial input delay conditions, default = zeros.
Ai - Initial layer delay conditions, default = zeros.
VV - Structure of validation vectors, default = [].
TV - Structure of test vectors, default = [].
Description
Note that T is optional and need only be used for networks that require targets.
Pi and Pf are also optional and need only be used for networks that have input or layer delays.
5 Implementation
Data processing and preparation
Convert the Word data to TXT format, read it with dlmread, and decide whether to normalize it (a sketch follows below).
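A minimal sketch of this step; the file name data.txt and the normalization formula are assumptions, not details given in the slides:

in = dlmread('data.txt');             % read the whitespace-delimited TXT data
% optional: min-max normalization of each column to [0,1]
% in = (in - repmat(min(in),size(in,1),1)) ./ repmat(max(in)-min(in),size(in,1),1);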
Create the network
Prepare everything needed to call the newff command:
form the pr matrix, determine the network structure (number of layers and number of neurons in each layer), and choose the transfer function for each layer (see the example below).
Pay attention to the meaning of each parameter.
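For example, a possible newff call for this eight-input problem, assuming a single output neuron; the hidden-layer size of 10 and the tansig/purelin transfer functions are also assumptions, not values fixed by the slides:

% pr is the 8x2 matrix of input ranges built in the implementation section
net = newff(pr, [10 1], {'tansig','purelin'}, 'trainlm');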
Train the network
Prepare the data for calling the train command.
Output argument description
and returns,
NET - New network.
TR - Training record (epoch and perf).
Y - Network outputs.
E - Network errors.
Pf - Final input delay conditions.
Af - Final layer delay conditions.
The training function BTF can be any of the backprop training functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.
Argument description
*WARNING*: TRAINLM is the default training function because it is very fast, but it requires a lot of memory to run. If you get an "out-of-memory" error when training try doing one of these:
c4_min=min(c4); c8_min=min(c8);
% c4_max=max(c4); % c8_max=max(c8);

pr=[c1_min,c1_max; c2_min,c2_max; c3_min,c3_max; c4_min,c4_max; c5_min,c5_max; c6_min,c6_max; c7_min,c7_max; c8_min,c8_max];
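An equivalent, more compact way to build the same Rx2 range matrix (a sketch; it assumes in holds the eight input columns, as in the code above):

pr = [min(in)' max(in)'];             % each row is [min, max] of one input column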
P - NixTS cell array, each element P{i,ts} is an RixQ matrix.
T - NtxTS cell array, each element T{i,ts} is a VixQ matrix.
Pi - NixID cell array, each element Pi{i,k} is an RixQ matrix.
Ai - NlxLD cell array, each element Ai{i,k} is an SixQ matrix.
Y - NOxTS cell array, each element Y{i,ts} is a UixQ matrix.
E - NtxTS cell array, each element E{i,ts} is a VixQ matrix.
Pf - NixID cell array, each element Pf{i,k} is an RixQ matrix.
Af - NlxLD cell array, each element Af{i,k} is an SixQ matrix.
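For this static, single-input network (Ni = 1, TS = 1) the matrix form and the cell-array form are interchangeable; a small illustration with made-up sizes:

p_mat  = rand(8, 20);                 % RxQ matrix: R = 8 input elements, Q = 20 samples (illustrative)
p_cell = {p_mat};                     % equivalent 1x1 cell-array form (Ni = 1, TS = 1)
y_mat  = sim(net, p_mat);             % returned as a matrix
y_cell = sim(net, p_cell);            % the same values, returned in a cell array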
c1_max=max(c1); c2_max=max(c2); c3_max=max(c3); c4_max=max(c4);
c5_max=max(c5); c6_max=max(c6); c7_max=max(c7); c8_max=max(c8);

% c1=c1/c1_max; % c2=c2/c2_max; % c3=c3/c3_max; % c4=c4/c4_max;
% c5=c5/c5_max; % c6=c6/c6_max; % c7=c7/c7_max; % c8=c8/c8_max;
Syntax
net = newff
net = newff(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF)
Description of the arguments to the newff command
NET = NEWFF creates a new network with a dialog box.
NEWFF(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF) takes,
% in(:,1)=c1; % in(:,2)=c2; % in(:,3)=c3; % in(:,4)=c4;
% in(:,5)=c5; % in(:,6)=c6; % in(:,7)=c7; % in(:,8)=c8;

% c1_max=max(c1); % c5_max=max(c5);
PR - Rx2 matrix of min and max values for R input elements.
Si - Size of ith layer, for Nl layers.
TFi - Transfer function of ith layer, default = 'tansig'.
BTF - Backprop network training function, default = 'trainlm'.
BLF - Backprop weight/bias learning function, default = 'learngdm'.
The performance function can be any of the differentiable performance functions such as MSE or MSEREG.
4 The train Command in MATLAB
TRAIN Train a neural network.
Neural Networks in MATLAB and Their Applications: BP Networks as an Example
Lecturer: Wang Maozhi, Associate Professor (wangmzcdut.edu)
1 A Prediction Problem
Given: a set of standard input and output data (see the attachment). Task: predict the outputs corresponding to another set of inputs. Background: omitted.
2 BP Networks
3 The newff Command in MATLAB
NEWFF Create a feed-forward backpropagation network.
Syntax
[net,tr,Y,E,Pf,Af] = train(NET,P,T,Pi,Ai,VV,TV)
Description
TRAIN trains a network NET according to NET.trainFcn and NET.trainParam.
Input argument description
(3) Use TRAINRP which is slower but more memory efficient than TRAINBFG.
Argument description
The learning function BLF can be either of the backpropagation learning functions such as LEARNGD, or LEARNGDM.
PF - Performance function, default = 'mse'.
and returns an N layer feed-forward backprop network.
Argument description
The transfer functions TFi can be any differentiable transfer function such as TANSIG, LOGSIG, or PURELIN.
% (assumption: the network net has already been created with newff)
p=in'; t=out';                        % transpose: each column is one sample / target
net.trainParam.epochs=100;            % number of training epochs
net=train(net,p,t);                   % train the network
y=sim(net,p);                         % simulate outputs on the training inputs
Description of the input argument data structures
The cell array format is easiest to describe. It is most convenient for networks with multiple inputs and outputs, and allows sequences of inputs to be presented:
(1) Slow TRAINLM training, but reduce memory requirements, by setting NET.trainParam.mem_reduc to 2 or more. (See HELP TRAINLM.)
(2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.
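In code, the three options look roughly like this (a sketch; normally only one of them would be applied):

net.trainParam.mem_reduc = 2;         % option (1): slower TRAINLM, less memory
% net.trainFcn = 'trainbfg';          % option (2): quasi-Newton training
% net.trainFcn = 'trainrp';           % option (3): resilient backpropagation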