
trainSoftmaxLayer

Train a softmax layer for classification

Description


net = trainSoftmaxLayer(X,T) trains a softmax layer, net, on the input data X and the targets T.

net = trainSoftmaxLayer(X,T,Name,Value) trains a softmax layer, net, with additional options specified by one or more Name,Value pair arguments.

For example, you can specify the loss function.

Examples


Load the sample data.

[X,T] = iris_dataset;

X is a 4x150 matrix of four attributes of iris flowers: sepal length, sepal width, petal length, and petal width.

T is a 3x150 matrix of associated class vectors defining which of the three classes each input is assigned to. Each row corresponds to a dummy variable representing one of the iris species (classes). In each column, a 1 in one of the three rows indicates the class that particular sample (observation) belongs to; the rows for the other classes contain zeros.

Train a softmax layer using the sample data.

net = trainSoftmaxLayer(X,T);

Classify the observations into one of the three classes using the trained softmax layer.

Y = net(X);

Plot the confusion matrix using the targets and the classifications obtained from the softmax layer.

plotconfusion(T,Y);
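As a further check, you can convert the one-hot outputs back to class indices and compute the overall accuracy. The following is a minimal sketch using the toolbox function vec2ind, which returns the row index of the largest entry in each column; the variable names predictedClasses, trueClasses, and accuracy are illustrative.

predictedClasses = vec2ind(Y);   % index of the most likely class for each sample
trueClasses = vec2ind(T);        % index of the actual class for each sample
accuracy = sum(predictedClasses == trueClasses)/numel(trueClasses)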

Input Arguments


Training data, specified as an m-by-n matrix, where m is the number of variables in the training data, and n is the number of observations (examples). Hence, each column of X represents a sample.

Data Types: single | double

Target data, specified as a k-by-n matrix, where k is the number of classes, and n is the number of observations. Each row is a dummy variable representing a particular class. In other words, each column represents a sample, and all entries of a column are zero except for a single 1 in one row. This single entry indicates the class for that sample.

Data Types: single | double
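If your class labels are stored as integer indices rather than as dummy variables, one way to build a target matrix of this form is with ind2vec, which creates a sparse matrix containing a single 1 per column. The sketch below assumes a hypothetical label vector named labels:

labels = [1 2 3 2 1];        % hypothetical class indices, one per observation
T = full(ind2vec(labels));   % 3-by-5 target matrix with a single 1 in each column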

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: 'MaxEpochs',400,'ShowProgressWindow',false specifies the maximum number of training iterations as 400 and hides the training window.

Maximum number of training iterations, specified as the comma-separated pair consisting of 'MaxEpochs' and a positive integer value.

Example: 'MaxEpochs',500

Data Types: single | double
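For example, a call that raises the iteration limit could look like the following sketch, reusing the X and T from the iris example above:

net = trainSoftmaxLayer(X,T,'MaxEpochs',500);   % train for up to 500 iterations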

Loss function for the softmax layer, specified as the comma-separated pair consisting of 'LossFunction' and either 'crossentropy' or 'mse'.

'mse' stands for the mean squared error function, which is given by:

$$E = \frac{1}{n}\sum_{j=1}^{n}\sum_{i=1}^{k}\left(t_{ij}-y_{ij}\right)^{2}$$

where n is the number of training examples and k is the number of classes, $t_{ij}$ is the (i,j)th entry of the target matrix T, and $y_{ij}$ is the ith output of the softmax layer when the input vector is $x_j$.

The cross entropy function is given by:

$$E = -\frac{1}{n}\sum_{j=1}^{n}\sum_{i=1}^{k}\left[t_{ij}\ln y_{ij} + \left(1-t_{ij}\right)\ln\left(1-y_{ij}\right)\right].$$

Example: 'LossFunction','mse'
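To relate these criteria to the matrices used earlier, you can evaluate both losses directly on a target matrix T and a prediction matrix Y. This is only an illustrative sketch of the formulas above; it assumes the entries of Y lie strictly between 0 and 1 so that the logarithms stay finite.

n = size(T,2);                                        % number of observations
mseLoss = sum(sum((T - Y).^2))/n;                     % mean squared error
ceLoss  = -sum(sum(T.*log(Y) + (1-T).*log(1-Y)))/n;   % cross entropy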

Indicator to display the training window during training, specified as the comma-separated pair consisting of 'ShowProgressWindow' and either true or false.

Example: 'ShowProgressWindow',false

Data Types: logical

Training algorithm used to train the softmax layer, specified as the comma-separated pair consisting of 'TrainingAlgorithm' and 'trainscg', which stands for scaled conjugate gradient.

Example: 'TrainingAlgorithm','trainscg'
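Putting the name-value options together, a call that trains silently with the mean squared error loss might look like the following sketch:

net = trainSoftmaxLayer(X,T, ...
    'LossFunction','mse', ...
    'ShowProgressWindow',false, ...
    'TrainingAlgorithm','trainscg');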

Output Arguments


Softmax layer for classification, returned as a network object. The softmax layer, net, is the same size as the target T.

Version History

Introduced in R2015b