llava-cli: tokenize special tokens and remove its own special escape process #5382


Merged
merged 2 commits into from
Feb 7, 2024

Conversation

jxy (Contributor) commented Feb 7, 2024

The 34B llava model uses special tokens <|im_start|> and <|im_end|>. There's no easy way to prompt it using the special tokens with llava-cli.
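"Tokenizing special tokens" here means the tokenizer must match strings like `<|im_start|>` as single vocabulary entries rather than splitting them into ordinary text pieces (in llama.cpp this is controlled by a boolean argument to `llama_tokenize`). A minimal standalone sketch of that matching step, with a hypothetical helper name and toy token list rather than llama.cpp's actual API:

```cpp
#include <string>
#include <vector>

// Illustrative splitter: greedily match special-token strings so each one
// survives as a single unit instead of being broken into ordinary text.
// Function name and token handling are a sketch, not llama.cpp's internals.
std::vector<std::string> split_with_special(const std::string & text,
                                            const std::vector<std::string> & specials) {
    std::vector<std::string> out;
    std::string plain;
    size_t i = 0;
    while (i < text.size()) {
        bool matched = false;
        for (const auto & s : specials) {
            if (text.compare(i, s.size(), s) == 0) {
                if (!plain.empty()) { out.push_back(plain); plain.clear(); }
                out.push_back(s);  // emit the special token as one piece
                i += s.size();
                matched = true;
                break;
            }
        }
        if (!matched) {
            plain += text[i++];
        }
    }
    if (!plain.empty()) out.push_back(plain);
    return out;
}
```

For example, splitting `<|im_start|>user\nhi<|im_end|>` against `{"<|im_start|>", "<|im_end|>"}` yields three pieces, with the ChatML markers kept intact so they can be looked up as single token ids.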

The second commit removes llava-cli's own separate escaping process, which was incomplete and redundant, in favor of the common escape CLI argument.
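The escape processing in question turns literal two-character sequences such as `\n`, as typed on the command line, into real control characters before the prompt is tokenized. A simplified sketch of that kind of routine (the real handling in llama.cpp covers more cases; this is an illustration, not the actual implementation):

```cpp
#include <string>

// Sketch of prompt escape processing: convert literal "\n"-style sequences
// from a command-line argument into real control characters. Simplified;
// unrecognized escapes are passed through unchanged.
std::string process_escapes(const std::string & in) {
    std::string out;
    for (size_t i = 0; i < in.size(); ++i) {
        if (in[i] == '\\' && i + 1 < in.size()) {
            switch (in[++i]) {
                case 'n':  out += '\n'; break;
                case 't':  out += '\t'; break;
                case '\\': out += '\\'; break;
                default:   out += '\\'; out += in[i]; break;
            }
        } else {
            out += in[i];
        }
    }
    return out;
}
```

Keeping one such routine behind the shared escape flag avoids two code paths escaping the same prompt differently.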

ggerganov merged commit 0ef46da into ggml-org:master on Feb 7, 2024
jordankanter pushed a commit to jordankanter/llama.cpp that referenced this pull request Mar 13, 2024
* llava-cli: tokenize special tokens in prompt

* llava-cli: use the escape CLI argument, remove incomplete separate escaping process
hodlen pushed a commit to hodlen/llama.cpp that referenced this pull request Apr 1, 2024
* llava-cli: tokenize special tokens in prompt

* llava-cli: use the escape CLI argument, remove incomplete separate escaping process
jxy deleted the llava-cli-fix branch on April 10, 2024