Main Content

layerGraph

Graph of network layers for deep learning

Description

A layer graph specifies the architecture of a deep learning network with a more complex graph structure in which layers can have inputs from multiple layers and outputs to multiple layers. Networks with this structure are called directed acyclic graph (DAG) networks. After you create a layerGraph object, you can use the object functions to plot the graph and to modify it by adding, removing, connecting, and disconnecting layers. To train the network, use the layer graph as input to the trainNetwork function, or convert it to a dlnetwork object and train it using a custom training loop.
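As a minimal sketch of the second path, a layer graph can be converted to a dlnetwork object for use in a custom training loop. The layer names and sizes below are illustrative, not taken from this page:

```matlab
% Build a small layer graph (names and sizes are illustrative)
lgraph = layerGraph([
    imageInputLayer([28 28 1],'Name','in','Normalization','none')
    convolution2dLayer(3,8,'Padding','same','Name','conv')
    reluLayer('Name','relu')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','sm')]);

% Convert to a dlnetwork for custom training loops.
% Note: dlnetwork does not accept output (loss) layers such as
% classificationLayer, so none are included above.
net = dlnetwork(lgraph);
```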

Creation

Description

example

lgraph = layerGraph creates an empty layer graph that contains no layers. You can add layers to the empty graph by using the addLayers function.

example

lgraph = layerGraph(layers) creates a layer graph from an array of network layers and sets the Layers property. The layers in lgraph are connected in the same sequential order as in layers.

example

lgraph = layerGraph(net) extracts the layer graph of a SeriesNetwork, DAGNetwork, or dlnetwork object. For example, you can extract the layer graph of a pretrained network to perform transfer learning.

Input Arguments


Deep learning network, specified as a SeriesNetwork, DAGNetwork, or dlnetwork object.

Properties


This property is read-only.

Network layers, specified as a Layer array.

This property is read-only.

Layer connections, specified as a table with two columns.

Each table row represents a connection in the layer graph. The first column, Source, specifies the source of each connection. The second column, Destination, specifies the destination of each connection. The connection sources and destinations are either layer names or have the form 'layerName/IOName', where 'IOName' is the name of the layer input or output.

Data Types: table
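A minimal sketch of how this table looks in practice (layer names are illustrative): when a layer with multiple inputs is involved, the Destination entries use the 'layerName/IOName' form.

```matlab
% Build a small graph whose addition layer has two inputs
layers = [
    imageInputLayer([28 28 1],'Name','in')
    convolution2dLayer(3,1,'Padding','same','Name','conv')
    additionLayer(2,'Name','add')];
lgraph = layerGraph(layers);                    % connects in -> conv -> add/in1
lgraph = connectLayers(lgraph,'in','add/in2');  % second input of the addition layer

% Inspect the Connections table; destinations into the addition
% layer appear as 'add/in1' and 'add/in2'
disp(lgraph.Connections)
```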

This property is read-only.

Network input layer names, specified as a cell array of character vectors.

Data Types: cell

Network output layer names, specified as a cell array of character vectors.

Data Types: cell

Object Functions

addLayers Add layers to layer graph
removeLayers Remove layers from layer graph
replaceLayer Replace layer in layer graph
connectLayers Connect layers in layer graph
disconnectLayers Disconnect layers in layer graph
plot Plot neural network layer graph
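The examples below demonstrate addLayers, connectLayers, and plot. As a sketch of the remaining functions, with illustrative layer names not taken from this page, removeLayers, disconnectLayers, and replaceLayer edit an existing graph like this:

```matlab
% Start from a small sequential graph (illustrative names)
lgraph = layerGraph([
    imageInputLayer([28 28 1],'Name','in')
    convolution2dLayer(3,8,'Padding','same','Name','conv')
    reluLayer('Name','relu')]);

% Replace the ReLU layer with a leaky ReLU layer;
% replaceLayer preserves the existing connections
lgraph = replaceLayer(lgraph,'relu',leakyReluLayer('Name','leaky'));

% Disconnect, then remove, the convolutional layer,
% and reconnect the remaining layers
lgraph = disconnectLayers(lgraph,'in','conv');
lgraph = disconnectLayers(lgraph,'conv','leaky');
lgraph = removeLayers(lgraph,'conv');
lgraph = connectLayers(lgraph,'in','leaky');
```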

Examples


Create an empty layer graph and an array of layers. Add the layers to the layer graph and plot the graph. addLayers connects the layers sequentially.

lgraph = layerGraph;
layers = [
    imageInputLayer([32 32 3],'Name','input')
    convolution2dLayer(3,16,'Padding','same','Name','conv_1')
    batchNormalizationLayer('Name','BN_1')
    reluLayer('Name','relu_1')];
lgraph = addLayers(lgraph,layers);
figure
plot(lgraph)


Create an array of layers.

layers = [
    imageInputLayer([28 28 1],'Name','input')
    convolution2dLayer(3,16,'Padding','same','Name','conv_1')
    batchNormalizationLayer('Name','BN_1')
    reluLayer('Name','relu_1')];

Create a layer graph from the layer array. layerGraph connects all the layers in layers sequentially. Plot the layer graph.

lgraph = layerGraph(layers);
figure
plot(lgraph)


Load a pretrained SqueezeNet network. You can use this trained network for classification and prediction.

net = squeezenet;

To modify the network structure, first extract the structure of the DAG network by using layerGraph. You can then use the object functions of LayerGraph to modify the network architecture.

lgraph = layerGraph(net)

lgraph = 
  LayerGraph with properties:

         Layers: [68x1 nnet.cnn.layer.Layer]
    Connections: [75x2 table]
     InputNames: {'data'}
    OutputNames: {'ClassificationLayer_predictions'}

Create a simple directed acyclic graph (DAG) network for deep learning. Train the network to classify images of digits. The simple network in this example consists of:

  • A main branch with layers connected sequentially.

  • A shortcut connection containing a single 1-by-1 convolutional layer. Shortcut connections enable the parameter gradients to flow more easily from the output layer to the earlier layers of the network.

Create the main branch of the network as a layer array. The addition layer sums multiple inputs element-wise. Specify the number of inputs for the addition layer to sum. To easily add connections later, specify names for the first ReLU layer and the addition layer.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(5,16,'Padding','same')
    batchNormalizationLayer
    reluLayer('Name','relu_1')
    convolution2dLayer(3,32,'Padding','same','Stride',2)
    batchNormalizationLayer
    reluLayer
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    additionLayer(2,'Name','add')
    averagePooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

Create a layer graph from the layer array. layerGraph connects all the layers in layers sequentially. Plot the layer graph.

lgraph = layerGraph(layers);
figure
plot(lgraph)


Create the 1-by-1 convolutional layer and add it to the layer graph. Specify the number of convolutional filters and the stride so that the activation size matches the activation size of the third ReLU layer. This arrangement enables the addition layer to add the outputs of the third ReLU layer and the 1-by-1 convolutional layer. To check that the layer is in the graph, plot the layer graph.

skipConv = convolution2dLayer(1,32,'Stride',2,'Name','skipConv');
lgraph = addLayers(lgraph,skipConv);
figure
plot(lgraph)


Create the shortcut connection from the 'relu_1' layer to the 'add' layer. Because you specified two as the number of inputs to the addition layer when you created it, the layer has two inputs named 'in1' and 'in2'. The third ReLU layer is already connected to the 'in1' input. Connect the 'relu_1' layer to the 'skipConv' layer and the 'skipConv' layer to the 'in2' input of the 'add' layer. The addition layer now sums the outputs of the third ReLU layer and the 'skipConv' layer. To check that the layers are connected correctly, plot the layer graph.

lgraph = connectLayers(lgraph,'relu_1','skipConv');
lgraph = connectLayers(lgraph,'skipConv','add/in2');
figure
plot(lgraph);


Load the training and validation data, which consists of 28-by-28 grayscale images of digits.

[XTrain,YTrain] = digitTrain4DArrayData;
[XValidation,YValidation] = digitTest4DArrayData;

Specify training options and train the network. trainNetwork validates the network using the validation data every ValidationFrequency iterations.

options = trainingOptions('sgdm', ...
    'MaxEpochs',8, ...
    'Shuffle','every-epoch', ...
    'ValidationData',{XValidation,YValidation}, ...
    'ValidationFrequency',30, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,lgraph,options);


Display the properties of the trained network. The network is a DAGNetwork object.

net

net = 
  DAGNetwork with properties:

         Layers: [16x1 nnet.cnn.layer.Layer]
    Connections: [16x2 table]
     InputNames: {'imageinput'}
    OutputNames: {'classoutput'}

Classify the validation images and calculate the accuracy. The network is very accurate.

YPredicted = classify(net,XValidation);
accuracy = mean(YPredicted == YValidation)

accuracy = 0.9934

Version History

Introduced in R2017b