Fix a version check of pytorch when fsdp plugin is used #1777
base: main
Conversation
The version check for PyTorch when the FSDP plugin is used has a format issue, which causes an error when FSDP plugins are enabled. The fix is in line with the original accelerate code: https://github.com/huggingface/accelerate/blob/526925b48c07d997cdd9bf5911f659ca45778915/src/accelerate/accelerator.py#L364
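For context on the format issue: `importlib.metadata` reports the raw distribution version (on Habana builds often something like `2.1.0a0+git...`), so comparing that raw or sliced string against a minimum such as `2.1.0` can misbehave. Below is a minimal sketch of a semantic-version check in the spirit of the linked accelerate code; the constant name, the 2.1.0 minimum, and the pre-release handling are assumptions taken from this discussion, not the actual code of this PR.

```python
# Sketch only: compare parsed versions instead of slicing the version string.
# FSDP_PYTORCH_VERSION = "2.1.0" is an assumption from this discussion.
import importlib.metadata

from packaging import version

FSDP_PYTORCH_VERSION = "2.1.0"

torch_version = importlib.metadata.version("torch")  # e.g. "2.1.0a0+git1234abc"
# Pre-release tags such as "2.1.0a0+git..." parse as *older* than "2.1.0",
# so compare base versions to avoid rejecting such builds.
installed = version.parse(version.parse(torch_version).base_version)
if installed < version.parse(FSDP_PYTORCH_VERSION):
    raise ValueError(
        f"FSDP requires PyTorch >= {FSDP_PYTORCH_VERSION}, got {torch_version}"
    )
```

The real fix may handle the details differently; the point is only that the comparison should be done on parsed versions rather than on raw or hand-sliced strings.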
Please run through the tests for FSDP: RUN_SLOW=true GAUDI2_CI=1 pytest tests/test_fsdp_examples.py -v -s
Changed lines under review:

    import importlib.metadata
    torch_version = importlib.metadata.version("torch")
    torch_version = torch_version[5:]
If we don't need torch_version anymore, could these lines setting torch_version be deleted?
@vivekgoe, could you do a quick review of this?
@emascarenhas Since the check only ensures that we are running a PyTorch version >= 2.1.0, which is quite old, in my opinion it should be OK to remove the check completely. If we go down the path of removing the FSDP version check, please also remove it from the other places where it appears in optimum-habana (e.g. optimum-habana-fork/optimum/habana/accelerate/utils/other.py).
@jojivk73, per @vivekgoe's suggestions, please review the PR for additional changes so that FSDP handling is consistent across OH. I think it's OK to retain the check, since we do not want an older PyTorch to be used, and in the future we may enforce a newer version of PyTorch.
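If the check is retained, one way to keep it consistent across OH rather than duplicating the string handling in several files would be to lean on accelerate's own version helper. A rough sketch, assuming upstream accelerate's `accelerate.utils.is_torch_version` and the 2.1.0 minimum discussed here; the wrapper function name and locally defined constant are hypothetical, not the actual optimum-habana API.

```python
# Sketch: reuse accelerate's version helper instead of re-implementing the
# check in each optimum-habana module. Minimum version is an assumption
# taken from this discussion.
from accelerate.utils import is_torch_version

FSDP_PYTORCH_VERSION = "2.1.0"


def check_fsdp_pytorch_version() -> None:
    """Raise early if the installed PyTorch is too old for the FSDP plugin."""
    if not is_torch_version(">=", FSDP_PYTORCH_VERSION):
        raise ValueError(
            f"FSDP requires PyTorch >= {FSDP_PYTORCH_VERSION} to be used."
        )
```

Both accelerator.py and utils/other.py could then call a single helper like this, so a future bump of the minimum version only touches one place.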
@jojivk73, any update here? If this is not needed anymore, we can close it.
@emascarenhas, this is still needed to use 1.19 with the FSDP plugin. I will get back on this soon.
@jojivk73, any update here? Please put the PR in "Draft" state if it is not ready.
What does this PR do?
Fixes # (issue)
Before submitting
FSDP plugins fail without this fix.