This repository was archived by the owner on Jan 24, 2024. It is now read-only.

Comments

Chap 4 Word2Vec Chinese Version #545

Merged
daming-lu merged 10 commits into PaddlePaddle:high-level-api-branch from daming-lu:chap4_chinese
Jun 12, 2018
Conversation

@daming-lu (Contributor):

No description provided.

@jetfuel jetfuel (Contributor) left a comment:

Otherwise LGTM

```python
        size=dict_size,
        bias_attr=paddle.attr.Param(learning_rate=2),
        act=paddle.activation.Softmax())

def inference_program(is_sparse):
```

`is_sparse` is used everywhere but never explained. It would be nice to briefly explain it.
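For context on the comment above: in the fluid API, `is_sparse=True` on an embedding layer requests a sparse gradient update, i.e. only the embedding rows of the words that actually appeared in the batch get touched, which is much cheaper for a large vocabulary. A toy sketch of that idea in plain Python (not the Paddle API; the function and data here are made up for illustration):

```python
# Illustrative sketch of what a sparse embedding update does: only the
# rows that were looked up in the batch receive a gradient step, while
# a dense update would touch every row of the table.

def sparse_update(table, grads_by_row, lr=0.1):
    """Apply an SGD step only to the rows that appeared in the batch."""
    for row, grad in grads_by_row.items():
        table[row] = [w - lr * g for w, g in zip(table[row], grad)]
    return table

# A toy 4-word vocabulary with 2-dimensional embeddings.
table = [[1.0, 1.0], [1.0, 1.0], [1.0, 1.0], [1.0, 1.0]]
# Only words 0 and 2 occurred in the batch, so only their rows change.
sparse_update(table, {0: [0.5, 0.5], 2: [1.0, 1.0]})
print(table[0])  # row 0 updated
print(table[1])  # row 1 untouched
```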


> 然后,指定训练相关的参数:
> - 现在我可以开始训练啦。如今的版本较之以前就简单了许多。我们有现成的训练和测试集:`paddle.dataset.imikolov.train()`和`paddle.dataset.imikolov.test()`。两者都会返回一个读取器。在PaddlePaddle中,读取器是一个Python的函数,每次调用,会读取下一条数据。它是一个Python的generator。

(Rough translation: "Then, specify the training-related parameters: Now I can start training. The current version is much simpler than before. We have ready-made training and test sets, `paddle.dataset.imikolov.train()` and `paddle.dataset.imikolov.test()`, both of which return a reader. In PaddlePaddle, a reader is a Python function that returns the next data item each time it is called; it is a Python generator.")
我 -> 我们 ("I" -> "we")
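The quoted passage describes PaddlePaddle readers: a reader creator is a function that returns a Python generator yielding one sample per iteration. A minimal sketch of that pattern with made-up data (not the real imikolov set or API):

```python
# Sketch of the PaddlePaddle "reader" pattern: a reader creator is a
# function that returns a generator; iterating it yields one sample at
# a time. The samples below are invented for illustration.

def make_reader(samples):
    def reader():
        for s in samples:
            yield s
    return reader

train = make_reader([(0, 1, 2, 3, 4), (1, 2, 3, 4, 5)])
for sample in train():  # calling the creator gives a fresh generator
    print(sample)
```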


```python
def event_handler(event):
    if isinstance(event, fluid.EndStepEvent):
        outs = trainer.test(
```
We can put all the code under the `if event.step % 10 == 0:` block.
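The suggestion is to move all of the work, including the test call, inside the `if event.step % 10 == 0:` guard so it only runs every tenth step. A standalone sketch with a stand-in event class (not the real fluid `EndStepEvent` or trainer):

```python
class EndStepEvent:                 # stand-in for fluid.EndStepEvent
    def __init__(self, step):
        self.step = step

tested_steps = []

def event_handler(event):
    if isinstance(event, EndStepEvent):
        if event.step % 10 == 0:    # all the work lives under the guard
            # a real handler would call trainer.test(...) here
            tested_steps.append(event.step)

for step in range(25):
    event_handler(EndStepEvent(step))
print(tested_steps)  # [0, 10, 20]
```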

```python
# Note here we need to choose more sophisticated optimizer
# such as AdaGrad with a decay rate. The normal SGD converges
# very slowly.
# optimizer=fluid.optimizer.SGD(learning_rate=0.001),
```
We need to update this comment, or move the whole thing to `optimizer_func`.
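The code comment recommends AdaGrad over plain SGD for this model. A self-contained sketch of the AdaGrad update rule on a toy quadratic, showing the per-parameter step sizes shrinking as squared gradients accumulate (illustrative only; in fluid this would be an optimizer such as `fluid.optimizer.Adagrad` returned from the optimizer function):

```python
import math

# AdaGrad on f(x) = x^2: the effective learning rate shrinks as squared
# gradients accumulate, which helps on sparse problems like word2vec.
def adagrad_steps(x, lr=0.5, eps=1e-6, steps=100):
    accum = 0.0
    for _ in range(steps):
        g = 2 * x                          # gradient of x^2
        accum += g * g                     # running sum of squared gradients
        x -= lr * g / (math.sqrt(accum) + eps)
    return x

x = adagrad_steps(5.0)
print(abs(x) < 1.0)  # converged toward the minimum at 0
```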

@daming-lu daming-lu merged commit b326f9f into PaddlePaddle:high-level-api-branch Jun 12, 2018
