
Commit 109123c

docs: Use pymdownx.snippets for easier docs management

1 parent 2787663 · commit 109123c

File tree

- docs/changelog.md
- docs/index.md
- mkdocs.yml

3 files changed: +9, -94 lines

docs/changelog.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -0,0 +1 @@
+--8<-- "CHANGELOG.md"
```

docs/index.md

Lines changed: 4 additions & 94 deletions

```diff
@@ -1,95 +1,5 @@
-# Getting Started
+---
+title: Getting Started
+---
 
-## 🦙 Python Bindings for `llama.cpp`
-
-[![Documentation](https://img.shields.io/badge/docs-passing-green.svg)](https://abetlen.github.io/llama-cpp-python)
-[![Tests](https://github.com/abetlen/llama-cpp-python/actions/workflows/test.yaml/badge.svg?branch=main)](https://github.com/abetlen/llama-cpp-python/actions/workflows/test.yaml)
-[![PyPI](https://img.shields.io/pypi/v/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
-[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
-[![PyPI - License](https://img.shields.io/pypi/l/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
-[![PyPI - Downloads](https://img.shields.io/pypi/dm/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
-
-Simple Python bindings for **@ggerganov's** [`llama.cpp`](https://github.com/ggerganov/llama.cpp) library.
-This package provides:
-
-- Low-level access to C API via `ctypes` interface.
-- High-level Python API for text completion
-  - OpenAI-like API
-  - LangChain compatibility
-
-## Installation
-
-Install from PyPI:
-
-```bash
-pip install llama-cpp-python
-```
-
-## High-level API
-
-```python
->>> from llama_cpp import Llama
->>> llm = Llama(model_path="./models/7B/ggml-model.bin")
->>> output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
->>> print(output)
-{
-  "id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
-  "object": "text_completion",
-  "created": 1679561337,
-  "model": "./models/7B/ggml-model.bin",
-  "choices": [
-    {
-      "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
-      "index": 0,
-      "logprobs": None,
-      "finish_reason": "stop"
-    }
-  ],
-  "usage": {
-    "prompt_tokens": 14,
-    "completion_tokens": 28,
-    "total_tokens": 42
-  }
-}
-```
-
-## Web Server
-
-`llama-cpp-python` offers a web server which aims to act as a drop-in replacement for the OpenAI API.
-This allows you to use llama.cpp compatible models with any OpenAI compatible client (language libraries, services, etc).
-
-To install the server package and get started:
-
-```bash
-pip install llama-cpp-python[server]
-export MODEL=./models/7B/ggml-model.bin
-python3 -m llama_cpp.server
-```
-
-Navigate to [http://localhost:8000/docs](http://localhost:8000/docs) to see the OpenAPI documentation.
-
-## Low-level API
-
-The low-level API is a direct `ctypes` binding to the C API provided by `llama.cpp`.
-The entire API can be found in [llama_cpp/llama_cpp.py](https://github.com/abetlen/llama-cpp-python/blob/master/llama_cpp/llama_cpp.py) and should mirror [llama.h](https://github.com/ggerganov/llama.cpp/blob/master/llama.h).
-
-
-## Development
-
-This package is under active development and I welcome any contributions.
-
-To get started, clone the repository and install the package in development mode:
-
-```bash
-git clone git@github.com:abetlen/llama-cpp-python.git
-cd llama-cpp-python
-git submodule update --init --recursive
-# Will need to be re-run any time vendor/llama.cpp is updated
-
-pip install --upgrade pip
-pip install -e .[all]
-```
-
-## License
-
-This project is licensed under the terms of the MIT license.
+--8<-- "README.md"
```

mkdocs.yml

Lines changed: 4 additions & 0 deletions

```diff
@@ -17,5 +17,9 @@ markdown_extensions:
       line_spans: __span
       pygments_lang_class: true
   - pymdownx.inlinehilite
+  - pymdownx.magiclink:
+      repo_url_shorthand: true
+      user: abetlen
+      repo: llama-cpp-python
   - pymdownx.snippets
   - pymdownx.superfences
```
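For context on what the new magiclink settings do: with `repo_url_shorthand` plus a default `user`/`repo`, short references in the docs are auto-linked to the GitHub project. A minimal sketch of that behaviour outside MkDocs, assuming `markdown` and `pymdown-extensions` are installed (the sample text and issue number are made up):

```python
# Minimal sketch (not part of this commit): show how pymdownx.magiclink expands
# shorthand references using the same options added to mkdocs.yml above.
import markdown

html = markdown.markdown(
    "Report problems at abetlen/llama-cpp-python#1 or ping @abetlen.",
    extensions=["pymdownx.magiclink"],
    extension_configs={
        "pymdownx.magiclink": {
            "repo_url_shorthand": True,
            "user": "abetlen",
            "repo": "llama-cpp-python",
        }
    },
)

# The issue reference and the @mention come back as <a> tags pointing at
# github.com, so doc authors never have to paste full URLs.
print(html)
```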
