1 parent 039f1ce commit 77cfcb4
examples/finetune/README.md
@@ -80,9 +80,9 @@ The LORA rank can be configured for each model tensor type separately with these
   --rank-wk N                LORA rank for wk tensor (default 4)
   --rank-wv N                LORA rank for wv tensor (default 4)
   --rank-wo N                LORA rank for wo tensor (default 4)
-  --rank-w1 N                LORA rank for w1 tensor (default 4)
-  --rank-w2 N                LORA rank for w2 tensor (default 4)
-  --rank-w3 N                LORA rank for w3 tensor (default 4)
+  --rank-ffn_gate N          LORA rank for ffn_gate tensor (default 4)
+  --rank-ffn_down N          LORA rank for ffn_down tensor (default 4)
+  --rank-ffn_up N            LORA rank for ffn_up tensor (default 4)
 ```
 
 The LORA rank of 'norm' tensors should always be 1.
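For context, the renamed options would be passed on the finetune command line roughly as in the sketch below. Only the `--rank-ffn_gate`, `--rank-ffn_down`, and `--rank-ffn_up` names come from this change; the other flags (`--model-base`, `--train-data`, `--lora-out`) are assumed from the surrounding finetune README and may differ in your build.

```sh
# Hypothetical invocation sketch: rank flags per this change, other flags assumed.
./bin/finetune \
        --model-base open-llama-3b-v2-q8_0.gguf \
        --train-data shakespeare.txt \
        --lora-out lora-out.gguf \
        --rank-ffn_gate 8 \
        --rank-ffn_down 8 \
        --rank-ffn_up 8
```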