This repository was archived by the owner on Jan 24, 2024. It is now read-only.

Commit 1fe401e

Merge pull request #382 from helinwang/end-to-end
Update the tutorial of the end-to-end demo
2 parents db2abef + e23bc32 commit 1fe401e

File tree

1 file changed: +57 −10 lines changed


mnist-client/README.md

Lines changed: 57 additions & 10 deletions
@@ -1,20 +1,67 @@
 # MNIST classification by PaddlePaddle
 
-Forked from https://github.com/sugyan/tensorflow-mnist
-
 ![screencast](https://cloud.githubusercontent.com/assets/80381/11339453/f04f885e-923c-11e5-8845-33c16978c54d.gif)
 
-## Build
+## Usage
 
-$ docker build -t paddle-mnist .
+This MNIST classification demo consists of two parts: a PaddlePaddle
+inference server and a JavaScript front end. We will start them
+separately.
 
-## Usage
+We will use Docker to run the demo. If you are not familiar with
+Docker, please check out this
+[tutorial](https://github.com/PaddlePaddle/Paddle/wiki/TLDR-for-new-docker-user).
+
+### Start the Inference Server
+
+The inference server can be used to run inference on any model trained
+by PaddlePaddle. Please see [here](TODO) for more details.
+
+1. Download the MNIST inference model topology and parameters to the
+   current working directory.
+
+   ```bash
+   wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/inference_topology.pkl
+   wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/param.tar
+   ```
+
+1. Run the following command to start the inference server:
+
+   ```bash
+   docker run --name paddle_serve -v `pwd`:/data -d -p 8000:80 -e WITH_GPU=0 paddlepaddle/book:serve
+   ```
+
+   The above command will mount the current working directory to the
+   `/data` directory inside the docker container. The inference
+   server will load the model topology and parameters that we just
+   downloaded from there.
 
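Before starting the server, it may help to confirm that both downloads landed in the working directory. This is an editor's sketch, not part of the original tutorial; the file names come from the `wget` commands above:

```shell
# Report whether each model file fetched above is present in the
# current working directory.
for f in inference_topology.pkl param.tar; do
  if [ -f "$f" ]; then
    echo "present: $f"
  else
    echo "missing: $f"
  fi
done
```

If either file is reported missing, rerun the corresponding `wget` command before starting the server.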

+After you are done with the demo, you can run `docker stop
+paddle_serve` to stop this docker container.
+
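Once `docker run` returns, you can probe the published port to see whether the server is accepting connections. This is a sketch that assumes bash, since it relies on bash's `/dev/tcp` pseudo-device; port 8000 comes from the `-p 8000:80` flag above:

```shell
# Try to open a TCP connection to the host port published by the
# inference server; report whether it is accepting connections.
if (exec 3<>/dev/tcp/localhost/8000) 2>/dev/null; then
  echo "inference server is up"
else
  echo "inference server is down"
fi
```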
+### Start the Front End
+
+1. Run the following command:
+   ```bash
+   docker run -it -p 5000:5000 paddlepaddle/book:mnist
+   ```
+
+1. Visit http://localhost:5000 and you will see the PaddlePaddle MNIST demo.
+
+
+## Build
+
+We have already prepared the pre-built docker image
+`paddlepaddle/book:mnist`. Here is the command if you want to build
+the docker image again.
 
-1. Download `inference_topology.pkl` and `param.tar` to current directory
-1. Run following commands:
 ```bash
-docker run -v `pwd`:/data -d -p 8000:80 -e WITH_GPU=0 paddlepaddle/book:serve
-docker run -it -p 5000:5000 paddlepaddle/book:mnist
+docker build -t paddlepaddle/book:mnist .
 ```
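After the build finishes, you can check that the image is present locally. This is a sketch; the fallback messages are ours, and the command degrades gracefully when the `docker` CLI or daemon is unavailable:

```shell
# List the freshly built image, or explain why we cannot.
if command -v docker >/dev/null 2>&1; then
  docker images paddlepaddle/book:mnist 2>/dev/null \
    || echo "docker daemon not reachable"
else
  echo "docker CLI not found; install Docker first"
fi
```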
-1. Visit http://localhost:5000
+
+
+## Acknowledgement
+
+Thanks to the great project
+https://github.com/sugyan/tensorflow-mnist. Most of the code in this
+project comes from there.
