
Conversation


@guotong1988 guotong1988 commented May 2, 2018

Gets the result of https://arxiv.org/abs/1801.00076.

The reason and inspiration for using bi-directional attention come from the machine comprehension task. In the SQuAD [Rajpurkar et al. 2016] machine comprehension task, we feed the paragraph and the question to the model and find the answer string within the paragraph. In the SQL generation task, we input the question and the columns and find the answer column among the columns. Viewed this way, the two tasks are similar.
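The analogy above can be sketched as a toy bi-directional attention between question tokens and column representations. This is a minimal numpy illustration of the general idea, not the PR's actual implementation; all names, shapes, and the scoring scheme are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_attention(question, columns):
    """Attend question->columns and columns->question over one similarity matrix.

    question: (n_q, d) toy embeddings of question tokens (illustrative)
    columns:  (n_c, d) toy embeddings of column names (illustrative)
    """
    sim = question @ columns.T                 # (n_q, n_c) pairwise similarity
    q2c = softmax(sim, axis=1)                 # each question token attends over columns
    c2q = softmax(sim, axis=0)                 # each column attends over question tokens
    # question-aware column representations, fused from the q->c direction
    question_aware_cols = q2c.T @ question     # (n_c, d)
    # one aggregate relevance score per column, from the c->q direction
    col_scores = (c2q * sim).sum(axis=0)       # (n_c,)
    return question_aware_cols, col_scores

rng = np.random.default_rng(0)
question = rng.normal(size=(5, 8))   # 5 question tokens, dim 8 (toy)
columns = rng.normal(size=(3, 8))    # 3 candidate columns
reprs, scores = bidirectional_attention(question, columns)
print(reprs.shape, scores.shape)
```

In a real model the highest-scoring column would be picked as the answer column, mirroring how SQuAD-style models pick an answer span in the paragraph.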

