# MNIST classification by PaddlePaddle

## Usage

This MNIST classification demo consists of two parts: a PaddlePaddle
inference server and a JavaScript front end. We will start them
separately.

We will use Docker to run the demo. If you are not familiar with
Docker, please check out
this
[tutorial](https://github.com/PaddlePaddle/Paddle/wiki/TLDR-for-new-docker-user).

### Start the Inference Server

The inference server can be used to run inference on any model trained
by PaddlePaddle. Please see [here](TODO) for more details.

1. Download the MNIST inference model topology and parameters to the
   current working directory.

   ```bash
   wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/inference_topology.pkl
   wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/param.tar
   ```

1. Run the following command to start the inference server:

   ```bash
   docker run --name paddle_serve -v `pwd`:/data -d -p 8000:80 -e WITH_GPU=0 paddlepaddle/book:serve
   ```

   The above command will mount the current working directory to the
   `/data` directory inside the Docker container. The inference
   server will load the model topology and parameters that we just
   downloaded from that directory.

   After you are done with the demo, you can run `docker stop
   paddle_serve` to stop this Docker container.

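Once the server is running, a client sends it a flattened, normalized 28×28 image and reads back per-digit scores. The sketch below shows one way to prepare such a payload and pick the winning digit from a reply. Note the assumptions: the `{"img": [...]}` request shape and the `{"data": [[...]]}` response shape are guesses at the serve API, not something this README specifies.

```python
import json


def make_payload(pixels):
    """Flatten a 28x28 grayscale image (0-255 ints) into a JSON body.

    The {"img": [...]} request shape is an assumption about the
    inference server's API, not confirmed by this README.
    """
    flat = [p for row in pixels for p in row]
    assert len(flat) == 28 * 28
    # Normalize to [0, 1] floats, as MNIST models typically expect.
    return json.dumps({"img": [p / 255.0 for p in flat]})


def best_digit(response_body):
    """Pick the most likely digit from an assumed {"data": [[...10 scores...]]} reply."""
    scores = json.loads(response_body)["data"][0]
    return max(range(10), key=lambda d: scores[d])
```

With the server from the step above listening on port 8000, the payload could then be POSTed to `http://localhost:8000/` (endpoint path also an assumption).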
### Start the Front End

1. Run the following command:

   ```bash
   docker run -it -p 5000:5000 paddlepaddle/book:mnist
   ```

1. Visit http://localhost:5000 and you will see the PaddlePaddle MNIST demo.

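Before visiting the page, you can confirm that both containers are actually listening (port 8000 for the inference server, port 5000 for the front end). This is a generic TCP reachability check using only the standard library, not part of the demo itself:

```python
import socket


def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example: check both demo services on localhost.
for name, port in [("inference server", 8000), ("front end", 5000)]:
    status = "up" if port_open("127.0.0.1", port) else "not reachable"
    print(f"{name} on port {port}: {status}")
```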
## Build

We have already prepared the pre-built Docker image
`paddlepaddle/book:mnist`. Here is the command if you want to build
the Docker image again:

```bash
docker build -t paddlepaddle/book:mnist .
```

## Acknowledgement

Thanks to the great project
https://github.com/sugyan/tensorflow-mnist. Most of the code in this
project comes from there.