Commit 1f5ec2c

Updating two small main references missed earlier in the finetune docs.
1 parent: 82df7f9

File tree: 1 file changed (+2 -2 lines)

examples/finetune/README.md

Lines changed: 2 additions & 2 deletions
@@ -38,9 +38,9 @@ After 10 more iterations:
 Checkpoint files (`--checkpoint-in FN`, `--checkpoint-out FN`) store the training process. When the input checkpoint file does not exist, it will begin finetuning a new randomly initialized adapter.

 llama.cpp compatible LORA adapters will be saved with filename specified by `--lora-out FN`.
-These LORA adapters can then be used by `main` together with the base model, like in the 'predict' example command above.
+These LORA adapters can then be used by `llama-cli` together with the base model, like in the 'predict' example command above.

-In `main` you can also load multiple LORA adapters, which will then be mixed together.
+In `llama-cli` you can also load multiple LORA adapters, which will then be mixed together.

 For example if you have two LORA adapters `lora-open-llama-3b-v2-q8_0-shakespeare-LATEST.bin` and `lora-open-llama-3b-v2-q8_0-bible-LATEST.bin`, you can mix them together like this:
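For context, a resumable run with these checkpoint flags might look like the sketch below. Only `--checkpoint-in`, `--checkpoint-out`, and `--lora-out` appear in this hunk; the `llama-finetune` binary name and the `--model-base`/`--train-data` flags are assumptions drawn from the rest of the finetune example, not from this diff:

```bash
# Start or resume finetuning. If the --checkpoint-in file does not exist,
# a new randomly initialized LORA adapter is created instead.
./llama-finetune \
  --model-base open-llama-3b-v2-q8_0.gguf \
  --checkpoint-in  chk-lora-open-llama-3b-v2-q8_0-LATEST.gguf \
  --checkpoint-out chk-lora-open-llama-3b-v2-q8_0-ITERATION.gguf \
  --lora-out lora-open-llama-3b-v2-q8_0-ITERATION.bin \
  --train-data shakespeare.txt
```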

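The mixing command that the final context line leads into falls outside this hunk. A sketch of what it presumably looks like after the rename, assuming `llama-cli`'s repeatable `--lora` flag:

```bash
# Load the base model, then apply both LORA adapters on top of it.
./llama-cli -m open-llama-3b-v2-q8_0.gguf \
  --lora lora-open-llama-3b-v2-q8_0-shakespeare-LATEST.bin \
  --lora lora-open-llama-3b-v2-q8_0-bible-LATEST.bin
```

If the build supports it, `--lora-scaled FN SCALE` can stand in for `--lora` to weight each adapter's contribution in the mix.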