
classificationLayer

Classification output layer

Description

A classification layer computes the cross-entropy loss for classification and weighted classification tasks with mutually exclusive classes.

The layer infers the number of classes from the output size of the previous layer. For example, to specify the number of classes K of the network, you can include a fully connected layer with output size K and a softmax layer before the classification layer.

layer = classificationLayer creates a classification layer.


layer = classificationLayer(Name,Value) sets the optional Name, ClassWeights, and Classes properties using one or more name-value pairs. For example, classificationLayer('Name','output') creates a classification layer with the name 'output'.

Examples


Create a classification layer with the name 'output'.

layer = classificationLayer('Name','output')
layer = 
  ClassificationOutputLayer with properties:

            Name: 'output'
         Classes: 'auto'
    ClassWeights: 'none'
      OutputSize: 'auto'

   Hyperparameters
    LossFunction: 'crossentropyex'

Include a classification output layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  7x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             20 5x5 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   ReLU                    ReLU
     4   ''   Max Pooling             2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     5   ''   Fully Connected         10 fully connected layer
     6   ''   Softmax                 softmax
     7   ''   Classification Output   crossentropyex

Create a weighted classification layer for three classes with names "cat", "dog", and "fish", with weights 0.7, 0.2, and 0.1, respectively.

classes = ["cat" "dog" "fish"];
classWeights = [0.7 0.2 0.1];

layer = classificationLayer( ...
    'Classes',classes, ...
    'ClassWeights',classWeights)
layer = 
  ClassificationOutputLayer with properties:

            Name: ''
         Classes: [cat    dog    fish]
    ClassWeights: [3x1 double]
      OutputSize: 3

   Hyperparameters
    LossFunction: 'crossentropyex'

Include a weighted classification output layer in a Layer array.

numClasses = numel(classes);

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer('Classes',classes,'ClassWeights',classWeights)]
layers = 
  7x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             20 5x5 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   ReLU                    ReLU
     4   ''   Max Pooling             2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     5   ''   Fully Connected         3 fully connected layer
     6   ''   Softmax                 softmax
     7   ''   Classification Output   Class weighted crossentropyex with 'cat' and 2 other classes

Input Arguments


Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: classificationLayer('Name','output') creates a classification layer with the name 'output'.

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with the name ''.

Data Types: char | string

Class weights for weighted cross-entropy loss, specified as a vector of positive numbers or 'none'.

For vector class weights, each element represents the weight for the corresponding class in the Classes property. To specify a vector of class weights, you must also specify the classes using 'Classes'.

If the ClassWeights property is 'none', then the layer applies unweighted cross-entropy loss.
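As an illustration only (not part of the original reference page), the following minimal sketch shows one common way to derive a class-weight vector from class frequencies; the label data and weighting scheme here are assumptions, not a prescribed workflow.

% Hypothetical categorical training labels.
YTrain = categorical(["cat" "cat" "cat" "dog" "dog" "fish"])';

classes = categories(YTrain);   % class names, in the order used by countcats
counts  = countcats(YTrain);    % number of observations per class

% Inverse-frequency weighting: rarer classes contribute more to the loss.
classWeights = numel(YTrain) ./ (numel(classes) * counts);

layer = classificationLayer('Classes',classes,'ClassWeights',classWeights);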

Classes of the output layer, specified as a categorical vector, string array, cell array of character vectors, or 'auto'. If Classes is 'auto', then the software automatically sets the classes at training time. If you specify a string array or cell array of character vectors str, then the software sets the classes of the output layer to categorical(str,str).

Data Types: char | categorical | string | cell
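As a small assumed example (class names chosen for illustration), specifying Classes as a string array stores the classes as categorical(str,str), preserving the given order:

str = ["cat" "dog" "fish"];                  % hypothetical class names
layer = classificationLayer('Classes',str);  % classes stored as categorical(str,str)
disp(layer.Classes)                          % cat, dog, fish (categorical)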

Output Arguments


Classification layer, returned as a ClassificationOutputLayer object.

For information on concatenating layers to construct convolutional neural network architecture, see Specify Layers of Convolutional Neural Network.

More About


Classification Layer

A classification layer computes the cross-entropy loss for classification and weighted classification tasks with mutually exclusive classes.

For typical classification networks, the classification layer usually follows a softmax layer. In the classification layer, trainNetwork takes the values from the softmax function and assigns each input to one of the K mutually exclusive classes using the cross-entropy function for a 1-of-K coding scheme [1]:

$$\text{loss} = -\frac{1}{N}\sum_{n=1}^{N}\sum_{i=1}^{K} w_{i}\, t_{ni} \ln y_{ni},$$

where $N$ is the number of samples, $K$ is the number of classes, $w_i$ is the weight for class $i$, $t_{ni}$ is the indicator that the $n$th sample belongs to the $i$th class, and $y_{ni}$ is the output for sample $n$ for class $i$, which in this case is the value from the softmax function. In other words, $y_{ni}$ is the probability that the network associates the $n$th input with class $i$.
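As a sketch only (not library code), the weighted loss above can be written directly in MATLAB; Y, T, and w are hypothetical variables mirroring the symbols in the formula, and the numeric values are made up for illustration.

% Y: N-by-K softmax outputs, T: N-by-K one-hot (1-of-K) targets, w: 1-by-K class weights.
N = 4; K = 3;
Y = [0.7 0.2 0.1; 0.1 0.8 0.1; 0.3 0.3 0.4; 0.2 0.5 0.3];  % example softmax outputs
T = [1 0 0; 0 1 0; 0 0 1; 0 1 0];                          % example 1-of-K targets
w = [0.7 0.2 0.1];                                          % example class weights

% Weighted cross-entropy: -(1/N) * sum over samples and classes of w_i * t_ni * ln y_ni.
loss = -sum(sum(w .* T .* log(Y))) / N;

Setting every element of w to 1 recovers the unweighted cross-entropy loss, which corresponds to ClassWeights being 'none'.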

References

[1] Bishop, C. M. Pattern Recognition and Machine Learning. Springer, New York, NY, 2006.

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2016a