
Add warning about not supporting torch.nn.SyncBatchNorm #5046


Merged
merged 3 commits into aws:master on Feb 18, 2025

Conversation

pintaoz-aws
Contributor

Issue #, if available:
#2571
Description of changes:
Add warning about not supporting torch.nn.SyncBatchNorm
Testing done:

Merge Checklist

Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your pull request.

General

  • I have read the CONTRIBUTING doc
  • I certify that the changes I am introducing will be backward compatible, and I have discussed concerns about this, if any, with the Python SDK team
  • I used the commit message format described in CONTRIBUTING
  • I have passed the region in to all S3 and STS clients that I've initialized as part of this change.
  • I have updated any necessary documentation, including READMEs and API docs (if appropriate)

Tests

  • I have added tests that prove my fix is effective or that my feature works (if appropriate)
  • I have added unit and/or integration tests as appropriate to ensure backward compatibility of the changes
  • I have checked that my tests are not configured for a specific region or account (if appropriate)
  • I have used unique_name_from_base to create resource names in integ tests (if appropriate)
  • If adding any dependency in requirements.txt files, I have spell checked and ensured they exist in PyPI

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

Comment on lines 378 to 379
Warning: ``torch.nn.SyncBatchNorm`` is not supported and its existence in
``init_process_group`` will cause an exception during distributed training.
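
For context, the pattern the warning describes looks roughly like the sketch below. This is a minimal, hypothetical example assuming a SageMaker training container with the smdistributed data parallel library available; the model and layer sizes are illustrative and not taken from this PR.

```python
import torch.nn as nn
import torch.distributed as dist
import smdistributed.dataparallel.torch.torch_smddp  # registers the "smddp" backend

# Initialize distributed training with the SageMaker data parallel backend.
dist.init_process_group(backend="smddp")

# An illustrative model containing regular BatchNorm layers.
model = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3), nn.BatchNorm2d(16), nn.ReLU())

# Converting BatchNorm layers to torch.nn.SyncBatchNorm is the unsupported step
# the warning refers to: with smdistributed this combination raises an exception
# during distributed training.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
```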
Contributor

Can we make the wording more generic instead of just this use case?

smdistributed with init_process_group will not allow use of certain other torch features, such as (and likely not limited to) torch.nn.SyncBatchNorm.

Contributor Author


updated

@chad119 chad119 merged commit 4f19de5 into aws:master Feb 18, 2025
12 of 14 checks passed
evakravi pushed a commit to evakravi/sagemaker-python-sdk that referenced this pull request Mar 20, 2025
* Add warning about not supporting

* update wording

---------

Co-authored-by: pintaoz <[email protected]>
pravali96 pushed a commit to pravali96/sagemaker-python-sdk that referenced this pull request Apr 21, 2025
uyoldas pushed a commit to uyoldas/sagemaker-python-sdk that referenced this pull request May 23, 2025