examples/finetune/README.md (2 additions, 2 deletions)

@@ -38,9 +38,9 @@ After 10 more iterations:
 Checkpoint files (`--checkpoint-in FN`, `--checkpoint-out FN`) store the training process. When the input checkpoint file does not exist, it will begin finetuning a new randomly initialized adapter.
 
 llama.cpp compatible LORA adapters will be saved with filename specified by `--lora-out FN`.
-These LORA adapters can then be used by `main` together with the base model, like in the 'predict' example command above.
+These LORA adapters can then be used by `llama-cli` together with the base model, like in the 'predict' example command above.
 
-In `main` you can also load multiple LORA adapters, which will then be mixed together.
+In `llama-cli` you can also load multiple LORA adapters, which will then be mixed together.
 
 For example if you have two LORA adapters `lora-open-llama-3b-v2-q8_0-shakespeare-LATEST.bin` and `lora-open-llama-3b-v2-q8_0-bible-LATEST.bin`, you can mix them together like this:
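For context on the checkpoint flags mentioned in the hunk, here is a minimal sketch of a finetuning invocation. The binary name `llama-finetune` (from the same binary renaming that produced `llama-cli`), the file names, and the auxiliary flags are assumptions based on the surrounding README, not part of this diff:

```bash
# Start or resume finetuning. If the --checkpoint-in file does not exist yet,
# training begins from a new, randomly initialized adapter (file names assumed).
./bin/llama-finetune \
        --model-base open-llama-3b-v2-q8_0.gguf \
        --checkpoint-in  chk-lora-open-llama-3b-v2-q8_0-shakespeare-LATEST.gguf \
        --checkpoint-out chk-lora-open-llama-3b-v2-q8_0-shakespeare-ITERATION.gguf \
        --lora-out lora-open-llama-3b-v2-q8_0-shakespeare-ITERATION.bin \
        --train-data shakespeare.txt \
        --save-every 10
```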
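The 'predict' command that the changed lines point back to would, after the rename, look roughly like this; the base model file name is inferred from the adapter names and is an assumption:

```bash
# Inference with the base model plus a single finetuned LORA adapter
# (base model file name assumed).
./bin/llama-cli -m open-llama-3b-v2-q8_0.gguf \
        --lora lora-open-llama-3b-v2-q8_0-shakespeare-LATEST.bin \
        -p "Shall I compare thee" -n 64
```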
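The hunk cuts off just before the mixing command itself, so that command is not reproduced here. A plausible shape, assuming `--lora` may be repeated and that `--lora-scaled FN SCALE` is available for weighting each adapter:

```bash
# Mix two LORA adapters on top of the same base model.
./bin/llama-cli -m open-llama-3b-v2-q8_0.gguf \
        --lora lora-open-llama-3b-v2-q8_0-shakespeare-LATEST.bin \
        --lora lora-open-llama-3b-v2-q8_0-bible-LATEST.bin

# With explicit per-adapter weights (the scale values here are illustrative):
./bin/llama-cli -m open-llama-3b-v2-q8_0.gguf \
        --lora-scaled lora-open-llama-3b-v2-q8_0-shakespeare-LATEST.bin 0.8 \
        --lora-scaled lora-open-llama-3b-v2-q8_0-bible-LATEST.bin 0.2
```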