The softmax activation function calculates the probability of each target class over all possible target classes. The values of the output vector lie in the range (0, 1) and sum to 1.
The softmax of an input x is computed as exp(x) / tf.reduce_sum(exp(x)).
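As a quick sanity check, the formula above can be applied directly with tf.exp and tf.reduce_sum. The minimal sketch below (the input values are just illustrative, not from the original snippet) should produce the same probabilities as tf.keras.activations.softmax.

import tensorflow as tf
# Illustrative input vector
x = tf.constant([[1.0, 2.0, 3.0]])
# Manual softmax following exp(x) / tf.reduce_sum(exp(x))
manual = tf.exp(x) / tf.reduce_sum(tf.exp(x), axis=-1, keepdims=True)
# Built-in softmax for comparison
builtin = tf.keras.activations.softmax(x)
# Both print the same probabilities, which sum to 1
print(manual)
print(builtin)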
Refer to the snippet below to apply the softmax activation with tf.keras.activations.
import tensorflow as tf
# Random input vector of shape (1, 5)
input_softmax = tf.random.normal([1, 5])
# Apply the softmax activation
output_softmax = tf.keras.activations.softmax(input_softmax)
print("Input")
print(input_softmax)
print("Output after applying softmax activation")
print(output_softmax)
print("Sum of all elements of output vector, this should be 1 for softmax activation")
print(tf.reduce_sum(output_softmax))
Example output:
Input
tf.Tensor([[ 0.35618225 -0.7339882 -0.5491383 1.6861256 -1.0074842 ]], shape=(1, 5), dtype=float32)
Output after applying softmax activation
tf.Tensor([[0.17309654 0.058188 0.07000237 0.6544485 0.04426451]], shape=(1, 5), dtype=float32)
Sum of all elements of output vector, this should be 1 for softmax activation
tf.Tensor(0.99999994, shape=(), dtype=float32)
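Softmax is most commonly used as the activation of the final layer of a classification model, so each prediction is a probability distribution over the classes. Below is a minimal sketch of this usage; the input dimension, layer sizes, and class count are illustrative assumptions, not taken from the example above.

import tensorflow as tf
# Small classifier with softmax on the output layer
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax'),  # 3-class probabilities
])
# Each row of the prediction sums to 1
preds = model(tf.random.normal([2, 4]))
print(preds)
print(tf.reduce_sum(preds, axis=-1))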