
narxnet

Nonlinear autoregressive neural network with external input

Description


narxnet(inputDelays,feedbackDelays,hiddenSizes,feedbackMode,trainFcn) takes these arguments:

  • Row vector of increasing zero or positive input delays, inputDelays

  • Row vector of increasing zero or positive feedback delays, feedbackDelays

  • Row vector of one or more hidden layer sizes, hiddenSizes

  • Type of feedback, feedbackMode

  • Backpropagation training function, trainFcn

and returns a NARX neural network.

NARX (Nonlinear autoregressive with external input) networks can learn to predict one time series given past values of the same time series, the feedback input, and another time series called the external (or exogenous) time series.
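As a quick illustration (a minimal sketch; the delay vectors, hidden layer size, and training function shown here are placeholder values, not taken from this page), you can pass all five arguments explicitly when constructing the network:

% Hypothetical example: open-loop NARX network with input delays 1:2,
% feedback delays 1:2, one hidden layer of 10 neurons, and
% Levenberg-Marquardt training.
net = narxnet(1:2,1:2,10,'open','trainlm');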

Examples


Train a nonlinear autoregressive with external input (NARX) neural network and predict on new time series data. Predicting a sequence of values in a time series is also known as multistep prediction. Closed-loop networks can perform multistep predictions. When external feedback is missing, closed-loop networks can continue to predict by using internal feedback. In NARX prediction, the future values of a time series are predicted from past values of that series, the feedback input, and an external time series.

Load the simple time series prediction data.

[X,T] = simpleseries_dataset;

Partition the data into training data XTrain and TTrain, and data for prediction XPredict. Use XPredict to perform prediction after you create the closed-loop network.

XTrain = X(1:80); TTrain = T(1:80); XPredict = X(81:100);

Create a NARX network. Define the input delays, feedback delays, and size of the hidden layer.

net = narxnet(1:2,1:2,10);

Prepare the time series data using preparets. This function automatically shifts input and target time series by the number of steps needed to fill the initial input and layer delay states.

[Xs,Xi,Ai,Ts] = preparets(net,XTrain,{},TTrain);
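As a rough sanity check (an optional sketch, assuming the 1:2 delay vectors used above), the shifted series returned by preparets are two time steps shorter than the original 80-step training series:

% The maximum delay is 2, so preparets uses the first 2 time steps to
% fill the delay states and the shifted series have 80 - 2 = 78 steps.
size(Xs,2)   % expected: 78
size(Ts,2)   % expected: 78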

The recommended practice is to create the network fully in open-loop form, and then transform it to closed-loop form for multistep prediction. The closed-loop network can then predict as many future values as you want. If you simulate the neural network in closed-loop mode only, the network can perform only as many predictions as the number of time steps in the input series.

Train the NARX network. The train function trains the network in an open loop (series-parallel architecture), including the validation and testing steps.

net = train(net,Xs,Ts,Xi,Ai);

Figure Neural Network Training (26-Feb-2022 11:01:35) contains an object of type uigridlayout.

Display the trained network.

view(net)


Calculate the network output Y, final input states Xf, and final layer states Af of the open-loop network from the network input Xs, initial input states Xi, and initial layer states Ai.

[Y,Xf,Af] = net(Xs,Xi,Ai);

Calculate the network performance.

perf = perform(net,Ts,Y)
perf = 0.0153

To predict the output for the next 20 time steps, first simulate the network in closed-loop mode. The final input states Xf and layer states Af of the open-loop network net become the initial input states Xic and layer states Aic of the closed-loop network netc.

[netc,Xic,Aic] = closeloop(net,Xf,Af);

Display the closed-loop network.

view(netc)

Run the prediction for 20 time steps ahead in closed-loop mode.

Yc = netc(XPredict,Xic,Aic)
Yc = 1×20 cell array
    {[-0.0156]}    {[0.1133]}    {[-0.1472]}    ...    {[0.0757]}
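If you want to inspect the forecast (an optional sketch, not part of the original example), you can convert the cell array of predictions to a numeric vector and plot it:

% Convert the 1-by-20 cell array of scalar predictions to a row vector
% and plot the 20-step closed-loop forecast.
plot(cell2mat(Yc))
xlabel('Time step')
ylabel('Predicted value')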

Input Arguments


inputDelays: Zero or positive input delays, specified as an increasing row vector.

feedbackDelays: Zero or positive feedback delays, specified as an increasing row vector.

hiddenSizes: Sizes of the hidden layers, specified as a row vector of one or more elements.

feedbackMode: Type of feedback, specified as 'open', 'closed', or 'none'.
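For instance (a hedged sketch; the delay vectors and layer size are placeholder values), passing 'closed' creates the network with its feedback loop already closed:

% Hypothetical example: NARX network created directly in closed-loop
% form, so the predicted output is fed back internally.
net = narxnet(1:2,1:2,10,'closed');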

trainFcn: Training function name, specified as one of the following.

Training Function    Algorithm

'trainlm'            Levenberg-Marquardt
'trainbr'            Bayesian Regularization
'trainbfg'           BFGS Quasi-Newton
'trainrp'            Resilient Backpropagation
'trainscg'           Scaled Conjugate Gradient
'traincgb'           Conjugate Gradient with Powell/Beale Restarts
'traincgf'           Fletcher-Powell Conjugate Gradient
'traincgp'           Polak-Ribiére Conjugate Gradient
'trainoss'           One Step Secant
'traingdx'           Variable Learning Rate Gradient Descent
'traingdm'           Gradient Descent with Momentum
'traingd'            Gradient Descent

Example: For example, you can specify the variable learning rate gradient descent algorithm as the training algorithm as follows: 'traingdx'
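As a sketch of that example (the delay vectors and hidden layer size are placeholder values), the training function is passed as the fifth argument:

% Hypothetical example: use variable learning rate gradient descent
% ('traingdx') as the backpropagation training function.
net = narxnet(1:2,1:2,10,'open','traingdx');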

For more information on the training functions, see Train and Apply Multilayer Shallow Neural Networks and Choose a Multilayer Neural Network Training Function.

Data Types: char

Version History

Introduced in R2010b