
Conversation

@pyup-bot
Collaborator

This PR updates transformers from 4.51.1 to 4.51.2.
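
For a quick sanity check after the bump, a minimal sketch (assuming a standard pip-installed environment and that dependencies have been reinstalled) that confirms the resolved version:

```python
import transformers

# After this PR is merged and the environment is reinstalled, the resolved
# version should match the new pin (assumption: installed via pip/requirements).
assert transformers.__version__ == "4.51.2", transformers.__version__
```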

Changelog

4.51.2

This is another round of bug fixes; they are much more minor, and outputs were not really affected.

- Fix Llama4 offset (#37414) by Cyrilvallez
- Attention Quantization with FBGemm & TP (#37384) by MekkCyber
- use rms_norm_eps for the L2Norm for Llama4 (#37418) by danielhanchen
- mark llama4 as not supported with fa2 (#37416) by winglian (see the sketch after this list)
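
Since one of the fixes marks Llama4 as unsupported with FlashAttention-2, a minimal sketch of loading a Llama4 checkpoint with an explicitly chosen attention backend; the checkpoint name and the choice of `"sdpa"` are assumptions for illustration, not taken from this changelog:

```python
from transformers import Llama4ForConditionalGeneration

# Checkpoint name is an assumption used for illustration.
model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"

# As of 4.51.2, Llama4 is marked as not supported with FlashAttention-2,
# so request a backend the architecture does support (assumption: "sdpa"
# is accepted for this model).
model = Llama4ForConditionalGeneration.from_pretrained(
    model_id,
    attn_implementation="sdpa",
)
```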

@stephenhky merged commit 1bcf687 into master on Apr 10, 2025.
5 checks passed.