tensorflow/tensorflow#60314
The `tf.keras.activations.softmax` function, the `tf.keras.backend.softmax` function, and the `tf.keras.layers.Softmax` layer now behave consistently and save the logits in `_keras_logits`. Previously, only the activation function had this behavior. This prevents underflow when computing the gradient of the cross-entropy loss.
The same fix was applied to the `tf.keras.backend.sigmoid` function and the `tf.keras.layers.Sigmoid` layer.
One behavior change is that `tf.keras.backend.softmax` and `tf.keras.layers.Softmax` no longer accept inputs of rank 1.
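To see why caching the logits matters, here is a minimal NumPy sketch of the underflow being prevented: recomputing the cross-entropy from softmax probabilities can hit `log(0)` for extreme logits, while computing it directly from the logits (log-softmax) stays finite. The values and variable names are illustrative, not taken from the Keras internals.

```python
import numpy as np

logits = np.array([1000.0, 0.0, -1000.0])

# Naive path: softmax, then log of a probability. The smallest
# probability underflows to exactly 0.0 in float64, so the
# cross-entropy for that class blows up to inf and its gradient
# information is lost.
shifted = logits - logits.max()
probs = np.exp(shifted) / np.exp(shifted).sum()
with np.errstate(divide="ignore"):
    naive_loss = -np.log(probs[2])       # inf

# Stable path: log-softmax computed directly from the logits,
# which is what the loss can do when the logits are recoverable
# (here via the cached `_keras_logits` attribute).
log_probs = shifted - np.log(np.exp(shifted).sum())
stable_loss = -log_probs[2]              # finite (~2000)
```

With the fix, the loss functions can retrieve the cached logits and take the stable path automatically instead of re-deriving them from the already-saturated probabilities.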
PiperOrigin-RevId: 536456175