Commit c471871

make n_gpu_layers=-1 offload all layers

1 parent d018c7b commit c471871

File tree

1 file changed: +1 −1 lines changed

llama_cpp/llama.py

Lines changed: 1 addition & 1 deletion
@@ -268,7 +268,7 @@ def __init__(
         self.params = llama_cpp.llama_context_default_params()
         self.params.n_ctx = n_ctx
-        self.params.n_gpu_layers = n_gpu_layers
+        self.params.n_gpu_layers = 0x7FFFFFFF if n_gpu_layers == -1 else n_gpu_layers  # 0x7FFFFFFF is INT32 max, will be auto set to all layers
         self.params.seed = seed
         self.params.f16_kv = f16_kv
         self.params.logits_all = logits_all
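The change above maps the Python-level convention `n_gpu_layers=-1` ("offload every layer") to `INT32_MAX` before handing the value to the C API, which has no `-1` sentinel and instead clamps an oversized count to the model's actual layer count. A minimal standalone sketch of that mapping (the helper name `resolve_n_gpu_layers` is illustrative, not part of the library):

```python
INT32_MAX = 0x7FFFFFFF  # llama.cpp clamps this down to the model's real layer count

def resolve_n_gpu_layers(n_gpu_layers: int) -> int:
    """Map the Python-side -1 sentinel ("offload all layers") to the
    value passed into llama_context_params.n_gpu_layers."""
    return INT32_MAX if n_gpu_layers == -1 else n_gpu_layers

print(resolve_n_gpu_layers(-1))  # 2147483647 -> effectively "all layers"
print(resolve_n_gpu_layers(20))  # 20 -> explicit counts pass through unchanged
```

Using `INT32_MAX` rather than a magic layer count keeps the Python API model-agnostic: callers do not need to know how many layers a given model has to request full GPU offload.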

0 commit comments