
Add snippets to KerasNLP models #628


Merged 2 commits into main on Apr 17, 2024

Conversation

@Wauplin (Contributor) commented Apr 15, 2024

In #616 we added basic support for the keras-nlp library. This PR adds a code snippet for KerasNLP models. I just got confirmation from @mattdangerw that models can be loaded with:

```python
import keras_nlp

tokenizer = keras_nlp.models.Tokenizer.from_preset("hf://${model.id}")
backbone = keras_nlp.models.Backbone.from_preset("hf://${model.id}")
```

We might want to do fancier server-side parsing of the underlying model architecture in the future, but in the meantime this is a valid solution.
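For illustration, rendering this snippet template server-side amounts to substituting a concrete repo id for the `${model.id}` placeholder. A minimal Python sketch (the model id `keras/bert_base_en` is hypothetical, and this is not the actual Hub implementation):

```python
# Hypothetical sketch of the snippet template; the Hub fills in the
# repo id where the `${model.id}` placeholder appears.
SNIPPET_TEMPLATE = """import keras_nlp

tokenizer = keras_nlp.models.Tokenizer.from_preset("hf://{model_id}")
backbone = keras_nlp.models.Backbone.from_preset("hf://{model_id}")"""


def render_snippet(model_id: str) -> str:
    # Substitute the repo id into both from_preset() calls.
    return SNIPPET_TEMPLATE.format(model_id=model_id)


print(render_snippet("keras/bert_base_en"))
```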

@julien-c (Member) left a comment

lgtm!

Note that parsing metadata.json or config.json to automagically set tags or snippets is valid too (instead of, or in addition to, storing this information in the model card).
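Such a server-side pass could be sketched as follows. This is only an illustration: the `keras_version` and `registered_name` fields, and the tag names derived from them, are assumptions about the config layout, not the Hub's actual schema:

```python
import json


def infer_tags(config_json: str) -> list[str]:
    # Hypothetical sketch: derive tags from a repo's config.json contents.
    config = json.loads(config_json)
    tags = []
    if "keras_version" in config:
        tags.append("keras-nlp")
    if "registered_name" in config:
        # e.g. "keras_nlp>BertBackbone" -> use the class name as a tag
        tags.append(config["registered_name"].split(">")[-1])
    return tags


print(infer_tags('{"keras_version": "3.0.0", "registered_name": "keras_nlp>BertBackbone"}'))
```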

@osanseviero (Contributor) left a comment

LGTM, thanks!

@Wauplin (Contributor, Author) commented Apr 17, 2024

Thanks for the reviews!

@Wauplin Wauplin merged commit 57f6085 into main Apr 17, 2024
@Wauplin Wauplin deleted the better-keras-nlp-support branch April 17, 2024 09:28