SoftmaxLayer - Maple Help


DeepLearning

 SoftmaxLayer
 softmax layer

Calling Sequence

 SoftmaxLayer(dim, opts)

Parameters

 dim - positive integer
 opts - one or more options as specified below

Options

 • inputshape : list of integers or the symbol auto
 Shape of the input Tensor, not including the batch axis.
 With the default value auto, the shape is inferred. If inference is not possible, an error is issued.
 This option need only be specified when this layer is the first in a Sequential model.

Description

 • SoftmaxLayer(dim, opts) creates a softmax neural network layer. The input dim specifies the (zero-based) dimension of the input Tensor along which the softmax normalization is applied.
 • This function is part of the DeepLearning package, so it can be used in the short form SoftmaxLayer(..) only after executing the command with(DeepLearning). However, it can always be accessed through the long form of the command by using DeepLearning[SoftmaxLayer](..).
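The normalization this layer performs can be illustrated outside Maple. The following NumPy sketch (an illustration of the softmax formula, not the SoftmaxLayer implementation) normalizes along a chosen zero-based dimension, matching the role of the dim argument:

```python
import numpy as np

def softmax(x, dim):
    """Softmax normalization along axis `dim` (zero-based).

    Subtracting the maximum along the axis is the standard trick for
    numerical stability; it leaves the result unchanged.
    """
    shifted = x - x.max(axis=dim, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=dim, keepdims=True)

# A 2 x 3 input normalized along dimension 1: each row sums to 1.
x = np.array([[1.0, 2.0, 3.0],
              [1.0, 1.0, 1.0]])
print(softmax(x, 1))
```

With dim = 0 the columns, rather than the rows, would each sum to 1.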

Details

 • The implementation of SoftmaxLayer uses tf.keras.layers.Softmax from the TensorFlow Python API. Consult the TensorFlow Python API documentation for tf.keras.layers.Softmax for more information.

Examples

 > $\mathrm{with}\left(\mathrm{DeepLearning}\right)$
 $\left[{\mathrm{AddMultiple}}{,}{\mathrm{ApplyOperation}}{,}{\mathrm{BatchNormalizationLayer}}{,}{\mathrm{BidirectionalLayer}}{,}{\mathrm{BucketizedColumn}}{,}{\mathrm{CategoricalColumn}}{,}{\mathrm{Classify}}{,}{\mathrm{Concatenate}}{,}{\mathrm{Constant}}{,}{\mathrm{ConvolutionLayer}}{,}{\mathrm{DNNClassifier}}{,}{\mathrm{DNNLinearCombinedClassifier}}{,}{\mathrm{DNNLinearCombinedRegressor}}{,}{\mathrm{DNNRegressor}}{,}{\mathrm{Dataset}}{,}{\mathrm{DenseLayer}}{,}{\mathrm{DropoutLayer}}{,}{\mathrm{EinsteinSummation}}{,}{\mathrm{EmbeddingLayer}}{,}{\mathrm{Estimator}}{,}{\mathrm{FeatureColumn}}{,}{\mathrm{Fill}}{,}{\mathrm{FlattenLayer}}{,}{\mathrm{GRULayer}}{,}{\mathrm{GatedRecurrentUnitLayer}}{,}{\mathrm{GetDefaultGraph}}{,}{\mathrm{GetDefaultSession}}{,}{\mathrm{GetEagerExecution}}{,}{\mathrm{GetVariable}}{,}{\mathrm{GradientTape}}{,}{\mathrm{IdentityMatrix}}{,}{\mathrm{LSTMLayer}}{,}{\mathrm{Layer}}{,}{\mathrm{LinearClassifier}}{,}{\mathrm{LinearRegressor}}{,}{\mathrm{LongShortTermMemoryLayer}}{,}{\mathrm{MaxPoolingLayer}}{,}{\mathrm{Model}}{,}{\mathrm{NumericColumn}}{,}{\mathrm{OneHot}}{,}{\mathrm{Ones}}{,}{\mathrm{Operation}}{,}{\mathrm{Optimizer}}{,}{\mathrm{Placeholder}}{,}{\mathrm{RandomTensor}}{,}{\mathrm{ResetDefaultGraph}}{,}{\mathrm{Restore}}{,}{\mathrm{Save}}{,}{\mathrm{Sequential}}{,}{\mathrm{Session}}{,}{\mathrm{SetEagerExecution}}{,}{\mathrm{SetRandomSeed}}{,}{\mathrm{SoftMaxLayer}}{,}{\mathrm{SoftmaxLayer}}{,}{\mathrm{Tensor}}{,}{\mathrm{Variable}}{,}{\mathrm{Variables}}{,}{\mathrm{VariablesInitializer}}{,}{\mathrm{Zeros}}\right]$ (1)
 > $\mathrm{model}≔\mathrm{Sequential}\left(\left[\mathrm{SoftmaxLayer}\left(1\right)\right]\right)$
 ${\mathrm{model}}{≔}\left[\begin{array}{c}{\mathrm{DeepLearning Model}}\end{array}\right]$ (2)
 > $\mathrm{model}:-\mathrm{Compile}\left(\right)$

Compatibility

 • The DeepLearning[SoftmaxLayer] command was introduced in Maple 2022.
 • For more information on Maple 2022 changes, see Updates in Maple 2022.

 See Also