Update conversational inference API snippets #976
Conversation
ready for review
Thanks @mishig25! The results look amazing and will definitely settle the debate on which snippets to display 😄 As mentioned on the moon-landing PR, I would tend to always return `InferenceSnippet[]` to avoid confusion, but if you think otherwise, I'm fine keeping it as it is now.
Once this is merged, I can take care of adding InferenceClient snippets for many tasks in Python (similar to #971 but with your new structure).
Note: this change will break the script that generates the Inference API docs (here). I'll open a PR once this is merged/deployed.
Are the selected tab + sub-tab sticky? (not sure if they should be, cc @gary149)
Currently no (I think it's fine)
Nice! I didn't try everything but I see an issue with some snippets like
http://localhost:5564/black-forest-labs/FLUX.1-dev?inference_api=true
Thanks to #976, we can now show `hf_hub` and `oai` snippets for VLMs ("conversational image-text-to-text" models).
Description
This PR updates the signatures of the inference snippet-generating functions. Previously, the functions returned a `string`; now they return `InferenceSnippet | InferenceSnippet[]`.

huggingface.js/packages/tasks/src/snippets/types.ts
Lines 14 to 17 in 5bc694b
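For context, here is a minimal sketch of what the `InferenceSnippet` shape referenced above could look like (the field names are an assumption based on this PR's description, not a verbatim copy of `types.ts`):

```ts
// Sketch of the InferenceSnippet shape (assumed field names, not verbatim from types.ts).
export interface InferenceSnippet {
	client?: string; // e.g. "huggingface_hub" or "openai" for Python snippets
	content: string; // the generated code snippet itself
}
```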
Why do we need to return `InferenceSnippet[]` rather than just `InferenceSnippet`? Because for a given language, we want to show multiple client options (`huggingface_hub`, `openai`); see the attached video below and the sketch after this paragraph. This PR also greatly improves the conversational snippet.
Screen recording
Screen.Recording.2024-10-22.at.11.32.06.mov