With `ngl > 0` the code breaks, probably because the Lora tensors try to interact with the base tensors (as in `lora_mul_mat`) but are not moved to the GPU buffer of the base tensors.
# Logic
# Current status
- Only one Lora adapter can be passed.
- The adapter is applied only to the Q, K, V matrices, to keep the code contained (the fine-tuning trained Lora tensors for all linear layers).
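
The per-matrix application described above can be sketched as follows. This is a hedged illustration, not the repository's implementation: the function name `lora_mul_mat` mirrors the one mentioned earlier, but the pure-Python matrices, the `(A, B, scale)` adapter representation, and all shapes are assumptions made for the example.

```python
# Hedged sketch: applying a single LoRA adapter to a projection.
# Pure-Python lists of rows stand in for tensors; names and shapes
# are illustrative assumptions, not the actual ggml data structures.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    inner, cols = len(B), len(B[0])
    return [[sum(row[k] * B[k][j] for k in range(inner)) for j in range(cols)]
            for row in A]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scaled(A, s):
    return [[s * a for a in row] for row in A]

def lora_mul_mat(W, x, lora=None):
    """y = W @ x, plus s * B @ (A @ x) when a LoRA adapter is attached.
    This mixed product is where the ngl > 0 failure described above
    would surface: W sits in a GPU buffer while A and B do not."""
    y = matmul(W, x)
    if lora is not None:
        A, B, s = lora  # low-rank factors and a scaling constant
        y = add(y, scaled(matmul(B, matmul(A, x)), s))
    return y

# Toy example: 2x2 base weight, rank-1 adapter, one input column.
# Only the Q, K, V projections would pass an adapter; every other
# projection would call lora_mul_mat(W, x) with lora=None.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 1.0]]            # (1 x 2), rank 1
B = [[0.5], [0.5]]          # (2 x 1)
x = [[2.0], [3.0]]
print(lora_mul_mat(W, x, lora=(A, B, 1.0)))
```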