Added ALBERT Model #14
base: master
Conversation
Add ALBERT
Any other models you want to make note of? Also, is it possible to add a description/comparison between each model?

At the moment there are many ALBERT models in SQuAD, so there is no other model I would like to add.

What about Reformer, from https://www.youtube.com/watch?v=rNG_hpSyZcE or https://ai.googleblog.com/2020/01/reformer-efficient-transformer.html ?

Thanks for the suggestion. My understanding is that Reformer can train with less memory than a standard Transformer, so it may become a basic technology for future models. It is also important that it achieves high accuracy on languages without spaces, such as Chinese and Japanese.
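The memory advantage mentioned above can be sketched numerically. This is only a rough illustration of the scaling argument (full attention stores an L x L score matrix, while Reformer-style chunked/LSH attention attends within small chunks), with an assumed chunk size; it is not the actual Reformer implementation.

```python
import math

def full_attention_cells(seq_len: int) -> int:
    # Standard Transformer attention materializes an L x L score matrix
    # per head, so memory grows quadratically with sequence length.
    return seq_len * seq_len

def lsh_attention_cells(seq_len: int, chunk: int = 64) -> int:
    # Reformer-style LSH attention sorts tokens into buckets and attends
    # within a chunk plus its neighbor, so each of the L/chunk chunks
    # stores roughly chunk x (2 * chunk) scores: linear growth in L.
    # (chunk=64 is an assumed illustrative value.)
    n_chunks = math.ceil(seq_len / chunk)
    return n_chunks * chunk * (2 * chunk)

for L in (1024, 8192, 65536):
    print(L, full_attention_cells(L), lsh_attention_cells(L))
```

At L = 65536 the quadratic term dominates by several orders of magnitude, which is the intuition behind "Reformer can train with less memory".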
So what about the other entries in https://gluebenchmark.com/leaderboard and https://rajpurkar.github.io/SQuAD-explorer/ ?

No.

@artisanbaggio I was referring to other entries in the benchmark, not Reformer.

OK. Of the models on SQuAD and GLUE, the one I am interested in is "T5", as mentioned in my first answer.
Added the ALBERT model to the ppt and jpeg files.