Fast non-model CLI commands #1349


Merged
merged 1 commit into pytorch:main from FasterCli-1347 on Nov 7, 2024

Conversation

gabe-l-hart
Contributor

Description

Closes: #1347

Import torch lazily in all places used by the CLI that don't need a model

These changes add a little complexity with the lazy and local imports, but they also greatly improve the CLI's response time for --help, list, and where.

Changes:

  • Move import torch into the function bodies that need it
  • Use importlib.metadata.version to check the torch version rather than torch.__version__
  • Switch from using torch.inference_mode as a decorator to using it as a context manager.
    • I also removed it from convert_hf_checkpoint_to_tune since that does not use torch at all
  • In build_utils, wrap the dtype values in lambdas so they're lazily fetched.
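The four changes above follow one pattern: keep torch out of module-import time so that `--help` and the inventory commands never pay the cost of loading it. A minimal sketch of that pattern, assuming illustrative names (`installed_version`, `DTYPE_FACTORIES`, and `generate` are not torchchat's actual code):

```python
import importlib.metadata
from typing import Optional


def installed_version(dist: str) -> Optional[str]:
    # Reads the version string from installed package metadata on disk,
    # which is cheap; `import torch` would load the full C++ runtime.
    try:
        return importlib.metadata.version(dist)
    except importlib.metadata.PackageNotFoundError:
        return None


def _torch():
    import torch  # deferred until a model command actually runs

    return torch


# Wrapping dtype values in lambdas keeps this module importable
# (e.g. for --help) without ever triggering the torch import.
DTYPE_FACTORIES = {
    "fp32": lambda: _torch().float32,
    "fp16": lambda: _torch().float16,
}


def generate(prompt: str) -> None:
    torch = _torch()
    # Context manager rather than decorator: `@torch.inference_mode()`
    # as a decorator would need torch at function-definition (import) time.
    with torch.inference_mode():
        ...  # run the model on `prompt`
```

Note that the lambdas are only called when a dtype is actually needed, so merely building the table stays torch-free.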

…t don't need a model


pytorch#1347
Branch: FasterCli-1347

Signed-off-by: Gabe Goodhart <[email protected]>

pytorch-bot bot commented Nov 6, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchchat/1349

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 8ae1bc5 with merge base 4a7dab8:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label (managed by the Meta Open Source bot) on Nov 6, 2024
  raise RuntimeError(
-     f"You are using PyTorch {torch.__version__}. At this time, torchchat uses the latest PyTorch technology with high-performance kernels only available in PyTorch nightly until the PyTorch 2.4 release"
+     f"You are using PyTorch {torch_version}. At this time, torchchat uses the latest PyTorch technology with high-performance kernels only available in PyTorch nightly until the PyTorch 2.4 release"
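The `torch_version` string obtained via importlib.metadata can be compared without ever importing torch. A hedged sketch of such a check (`needs_nightly` is a hypothetical helper, not code from this PR):

```python
def needs_nightly(torch_version: str, min_release: tuple = (2, 4)) -> bool:
    # Strip local build tags ("+cu121") and dev suffixes
    # ("2.4.0.dev20240601") before comparing major.minor numerically.
    base = torch_version.split("+")[0].split(".dev")[0]
    major_minor = tuple(int(p) for p in base.split(".")[:2])
    return major_minor < min_release


def check_torch(torch_version: str) -> None:
    # Raise the RuntimeError quoted above only for too-old versions.
    if needs_nightly(torch_version):
        raise RuntimeError(f"You are using PyTorch {torch_version}. ...")
```

Treating a "2.4.0.dev…" nightly as satisfying (2, 4) matches the error message's intent, since the required kernels ship in those nightlies.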
Contributor

We might defer that to commands that actually use torch, and then we can just use torch.__version__? No point in raising a version issue when all we do is run help or similar non-model CLI commands?

Wdyt @Jack-Khuu

Contributor

I like having it here, since we can catch it at the very beginning as opposed to having version checking in every file.

Doing it this way will speed up the interface by a lot. Right now --help and the other CLI commands are super slow. It also bogs down CI.

Contributor Author

I could see it either way, but I think I agree with @byjlw. I don't think the non-running commands are valuable in-and-of-themselves, so I think the argument would be that this is really a property of the tool (there is no torchchat without torch) and doing the lazy imports is just a speed optimization.

If there were standalone commands that didn't need torch (i.e. if people start using download just to fetch models, but then run them with something else), then I could see making this a soft requirement and deferring it to the commands that need it.

Contributor

I actually added what @mikekgfb is suggesting a while back, so this check being in arg_init actually does what all 3 y'all suggest (check only if needed, and check in one place)

torchchat/torchchat.py

Lines 62 to 63 in 4a7dab8

if args.command not in INVENTORY_VERBS:
    args = arg_init(args)

arg_init is only called when it is pertinent
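The dispatch quoted above can be sketched as follows. Here `arg_init` is a stand-in that just records that the heavy path ran (in torchchat it performs the torch-dependent setup and version check), and the contents of `INVENTORY_VERBS` are assumed, not taken from the source:

```python
# Assumed set of commands that never touch a model.
INVENTORY_VERBS = {"list", "where", "download", "remove"}


def arg_init(args: dict) -> dict:
    # Stand-in for torchchat's arg_init: this is where torch would be
    # imported and its version checked; the stub only marks execution.
    args["heavy_init_ran"] = True
    return args


def dispatch(args: dict) -> dict:
    # Inventory verbs skip arg_init entirely, so they never pay the
    # torch import (or trip the version check).
    if args["command"] not in INVENTORY_VERBS:
        args = arg_init(args)
    return args
```

This gives the behavior all three reviewers asked for: the check lives in one place, and it only runs for commands that will actually load a model.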

Contributor

@Jack-Khuu left a comment

Thanks for adding this @gabe-l-hart!!

It's been on my list for a while


@Jack-Khuu merged commit 170581a into pytorch:main Nov 7, 2024
52 checks passed
Labels
CLA Signed This label is managed by the Meta Open Source bot.
Development

Successfully merging this pull request may close these issues.

Slow CLI --help (and other commands)
5 participants