1 parent 210eb9e commit b81ba1f
examples/finetune/README.md
@@ -87,4 +87,4 @@ The LORA rank can be configured for each model tensor type separately with these
 The LORA rank of 'norm' tensors should always be 1.

-To see all available options use `finetune --help`.
+To see all available options use `llama-finetune --help`.
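For context, a minimal sketch of the updated hint, assuming the `llama-finetune` binary has been built and is on `PATH` (the build step itself is not part of this commit):

```sh
# Print every available finetune option under the renamed binary
llama-finetune --help

# The unprefixed name from the old README no longer matches the built binary:
# finetune --help   # fails after the rename
```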