This example reads weights from project [llama2.c](https://github.com/karpathy/llama2.c) and saves them in ggml compatible format. The vocab that is available in `models/ggml-vocab.bin` is used by default.

To convert the model, first download the models from the [llama2.c](https://github.com/karpathy/llama2.c) repository:
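
For illustration, one of the small TinyStories checkpoints published for llama2.c can be fetched directly. The `karpathy/tinyllamas` Hugging Face repository and the `stories42M.bin` filename below are taken from the llama2.c README rather than from this example, so substitute whichever llama2.c checkpoint you actually want to convert:

```
# Fetch a small (~42M parameter) llama2.c checkpoint to use as the converter input.
# Repository and filename are assumptions based on the llama2.c project; any
# llama2.c checkpoint file works here.
wget https://huggingface.co/karpathy/tinyllamas/resolve/main/stories42M.bin
```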
Then build the converter:

`$ make -j`

After successful compilation, the following usage options are available:

```
usage: ./convert-llama2c-to-ggml [options]

options:
  -h, --help                       show this help message and exit
  --copy-vocab-from-model FNAME    model path from which to copy vocab (default 'models/ggml-vocab.bin')
  --llama2c-model FNAME            [REQUIRED] model path from which to load Karpathy's llama2.c model
  --llama2c-output-model FNAME     model path to save the converted llama2.c model (default 'ak_llama_model.bin')
```
An example command is as follows:

`$ ./convert-llama2c-to-ggml --copy-vocab-from-model <ggml-vocab.bin> --llama2c-model <llama2.c model path> --llama2c-output-model <ggml output model path>`
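
For instance, with the default vocab file and the checkpoint downloaded above, the invocation might look like this (all three paths are illustrative, so substitute your own):

```
# 'models/ggml-vocab.bin' is the default vocab path, 'stories42M.bin' is the
# downloaded llama2.c checkpoint, and the output filename is arbitrary.
./convert-llama2c-to-ggml --copy-vocab-from-model models/ggml-vocab.bin --llama2c-model stories42M.bin --llama2c-output-model stories42M.ggml.bin
```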
Now you can use the model with a command like:

`$ ./main -m <ggml output model path> -p "One day, Lily met a Shoggoth" -n 500 -c 256 -eps 1e-5`