
[Inference snippets]: no need to showcase max_tokens #1401


Merged: 1 commit merged into main from remove-max-tokens-from-snippets on Apr 30, 2025

Conversation

@Wauplin (Contributor) commented Apr 30, 2025:

Originally by @julien-c / @gary149 on Slack:

> but in the snippet I would remove max_tokens completely, to be honest

=> let's remove `max_tokens` entirely from the inference snippets
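For context, a chat-completion inference snippet on the Hub looks roughly like the sketch below (the model name and prompt are illustrative placeholders, not taken from this PR); the change simply drops the `max_tokens` argument so generated snippets rely on the provider's default instead of pinning an arbitrary value.

```python
# Sketch of a chat-completion snippet before/after this change
# (model and prompt are placeholders, not from the PR itself).
from huggingface_hub import InferenceClient

client = InferenceClient()

# Before: snippets showcased an explicit token budget.
completion = client.chat_completion(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    max_tokens=512,
)

# After: max_tokens is omitted and the provider's default applies.
completion = client.chat_completion(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)

print(completion.choices[0].message.content)
```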

@hanouticelina (Contributor) left a comment:


Thank you!

@Wauplin Wauplin merged commit 75724b1 into main Apr 30, 2025
4 of 5 checks passed
@Wauplin Wauplin deleted the remove-max-tokens-from-snippets branch April 30, 2025 08:07
@julien-c (Member) commented:

nice!

Deep-unlearning pushed a commit that referenced this pull request May 13, 2025