v5 #3177
base: main
Conversation
merveenoyan left a comment
I know it's a draft, just left some suggestions for clarity 🤗
merveenoyan left a comment
more comments!
also smol reminder to add to _blog.yml 🙌🏻
Really nice blog post. My suggestions make the prose a bit punchier with simpler grammar.
Also, remember to add it to _blog.yml 🤗.
pcuenca left a comment
First quick pass, looking great!
As a general comment, in the middle sections there are many paragraphs that start with "We have", "We are", which may feel a bit monotonous. We (lol) could slightly reword a few of these.
transformers-v5.md (Outdated)
This growth is linked to the growth of the field and the now mainstream access of AI, no doubt; as a leading model-definition library in the ecosystem, we need to continuously evolve and adapt the library to continue being relevant. Reinvention is key for longevity in AI.
We’re lucky to be working with a great number of libraries and apps that work with transformers, in no specific order: llama.cpp, MLX, onnxruntime, Jan, LMStudio, vLLM, SGLang, Unsloth, LlamaFactory, dLLM, MaxText, TensorRT, Argmax, among many other friends.
I think the segue from transformers being successful to talking about the libraries we work with might not be immediately apparent to readers who may not be following our previous comms. We could be explicit that one of the ingredients for success is our (aspirational) role to support and act as a reference for others. We could perhaps link to the previous post as well: https://huggingface.co/blog/transformers-model-definition.
Transformers, at the core, remains a model architecture toolkit: we aim to have all recent architectures, and to act as the “source of truth” of such model definitions in the ecosystem. We’ve been adding between 1 and 3 new architectures to the toolkit every week over the past 5 years, as can be seen in the timeline below:
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers_v5/transformers_model_timeline.png" alt="Transformers standardizing model definitions">
Super nice! Will it render too small?
Co-authored-by: burtenshaw <[email protected]> Co-authored-by: Merve Noyan <[email protected]> Co-authored-by: Pedro Cuenca <[email protected]> Co-authored-by: Alvaro Bartolome <[email protected]>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers_v5/transformers_model_timeline.png" alt="Transformers standardizing model definitions">
[*https://huggingface.co/spaces/yonigozlan/Transformers-Timeline*](https://huggingface.co/spaces/yonigozlan/Transformers-Timeline)
why not embed the Space?
need turkey branding/logo for this?
I need to update the transformers thumbnail and an image in the blog post
Congratulations! You've made it this far! Once merged, the article will appear at https://huggingface.co/blog. Official articles require additional reviews. Alternatively, you can write a community article following the process here.
Preparing the Article
You're not quite done yet, though. Please make sure to follow this process (as documented here):
md file. You can also specify `guest` or `org` for the authors. Here is an example of a complete PR: #2382
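Since two reviewers remind the authors to add the post to `_blog.yml`, here is a hedged sketch of what such an entry typically looks like. All values below are placeholders, and the exact field names should be double-checked against existing entries in the repo's `_blog.yml`:

```yaml
# Hypothetical _blog.yml entry; values are illustrative placeholders,
# not the real metadata for this post.
- local: transformers-v5                       # must match the md filename, without the extension
  title: "Transformers v5"                     # placeholder title
  author: your-hf-username                     # guest or org authors can be flagged per the docs
  thumbnail: /blog/assets/transformers-v5/thumbnail.png
  date: "fill in the publication date"
  tags:
    - transformers
```

The `local` field is what ties the entry to the markdown file, which is why the bot's checklist asks for both the md file and the `_blog.yml` entry in the same PR.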
Getting a Review
Please make sure to get a review from someone on your team or a co-author.
Once this is done and once all the steps above are completed, you should be able to merge.
There is no need for additional reviews if you and your co-authors are happy and meet all of the above.
Feel free to add @pcuenca as a reviewer if you want a final check. Keep in mind he'll be biased toward light reviews
(e.g., check for proper metadata) rather than content reviews unless explicitly asked.