merge fluid dist tests #8573
Conversation
gongweibao
left a comment
The demo code is too long; can we reduce it?
    if is_local:
        train_loop(fluid.default_main_program())
    else:
        port = os.getenv("PADDLE_INIT_PORT", "6174")
The port is hard-coded here.
For reference:
https://github.com/gongweibao/cloud/blob/2df615851d0191677e00b1027120baeffbc2174d/docker/paddle_k8s#L70
https://github.com/gongweibao/Paddle/blob/ffe213950d3b08eaf5f8119422662e554a9c7725/benchmark/cluster/vgg16/vgg16_fluid.py#L240
Though both ways are OK, PADDLE_INIT_PORT is also used in v2 when calling paddle.init().
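As a minimal sketch of the fix under discussion, the port could be read strictly from the environment instead of falling back to the hard-coded "6174" (`get_init_port` is a hypothetical helper, not part of Paddle's API):

```python
import os


def get_init_port():
    # Require PADDLE_INIT_PORT to be set by the launcher (e.g. paddle_k8s),
    # instead of silently defaulting to a hard-coded port.
    port = os.getenv("PADDLE_INIT_PORT")
    if port is None:
        raise EnvironmentError("PADDLE_INIT_PORT must be set")
    return int(port)
```

This way a misconfigured cluster fails fast rather than training against the wrong port.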
        raise AssertionError("Fit a line cost is too large, {0:2.2}".format(
            avg_loss_value[0]))

    if is_local:
Is is_local necessary? We can run the local mode with the code under the book directory.
Got it, this PR merges the book and book_dist folders.
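As a hedged sketch of what merging the two folders buys (the names `dispatch`, `train_loop`, and `transpile_and_run` are hypothetical, not the PR's actual functions), one entry point can branch on is_local instead of keeping duplicate demo scripts:

```python
def dispatch(train_loop, transpile_and_run, is_local):
    # Single entry point covering both modes: the local path runs the
    # plain training loop, the distributed path runs the transpiled
    # trainer/pserver program.
    if is_local:
        return train_loop()
    return transpile_and_run()
```

The same script then serves as both the book (local) demo and the book_dist (distributed) demo, which is what keeps the dist demo code from rotting separately.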
Yancey0623
left a comment
LGTM. Please fix the conflict; it's better to merge this PR ASAP to make sure the dist demo code runs correctly.
0c4714a
Fix #8571
Fix #8598