This repository was archived by the owner on Jan 24, 2024. It is now read-only.


Updated content of the book#427

Merged
kavyasrinet merged 1 commit into PaddlePaddle:develop from kavyasrinet:update-content on Sep 27, 2017

Conversation

@kavyasrinet

This is for the MNIST chapter

Contributor

@dzhwinter dzhwinter left a comment


LGTM!

where $ \text{softmax}(x_i) = \frac{e^{x_i}}{\sum_j e^{x_j}} $

Before:

For an $N$-class classification problem with $N$ output nodes, Softmax normalizes the resulting $N$-dimensional vector so that each of its entries falls in the range $[0,1]\in\math{R}$, representing the probability that the sample belongs to a certain class. Here $y_i$ denotes the predicted probability that an image is of digit $i$.

After:

For an $N$-class classification problem with $N$ output nodes, Softmax normalizes the resulting $N$-dimensional vector so that each of its entries falls in the range $[0,1]\in {R}$, representing the probability that the sample belongs to a certain class. Here $y_i$ denotes the predicted probability that an image is of digit $i$.
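The softmax normalization discussed above can be sketched in a few lines of Python with NumPy (an illustrative example, not part of this PR or the book's PaddlePaddle code):

```python
import numpy as np

def softmax(x):
    # Subtract the max before exponentiating for numerical stability;
    # softmax is invariant to adding a constant to every entry.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical raw scores (logits) for a 10-class digit classifier
logits = np.array([1.0, 2.0, 0.5, 0.1, 3.0, 0.2, 0.3, 0.4, 0.6, 0.8])
probs = softmax(logits)
# Each entry of probs lies in [0, 1] and the entries sum to 1,
# so probs[i] can be read as the predicted probability of digit i.
```

Here the largest logit (index 4) receives the largest probability, which is why the predicted digit is taken as the argmax of the softmax output.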
Contributor


Nice catch! In online LaTeX it renders correctly, but it really failed on the Paddle book website.

@kavyasrinet kavyasrinet merged commit 4bcfe04 into PaddlePaddle:develop Sep 27, 2017

3 participants