README.md (23 additions, 23 deletions)
@@ -16,7 +16,7 @@ Torchchat is an easy-to-use library for running large language models (LLMs) on
## Quick Start
### Initialize the Environment
-The following steps requires you have [Python 3.10](https://www.python.org/downloads/release/python-3100/) installed
+The following steps require that you have [Python 3.10](https://www.python.org/downloads/release/python-3100/) installed.

```
# get the code
@@ -31,20 +31,20 @@ source .venv/bin/activate
./install_requirements.sh

# ensure everything installed correctly
-python torchchat.py --help
+python3 torchchat.py --help

```

### Generating Text

```
-python torchchat.py generate stories15M
+python3 torchchat.py generate stories15M
```
That’s all there is to it!
Read on to learn how to use the full power of torchchat.

## Customization
-For the full details on all commands and parameters run `python torchchat.py --help`
+For the full details on all commands and parameters run `python3 torchchat.py --help`

### Download
For supported models, torchchat can download model weights. Most models use HuggingFace as the distribution channel, so you will need to create a HuggingFace
@@ -54,46 +54,46 @@ To install `huggingface-cli`, run `pip install huggingface-cli`. After installin
HuggingFace.

```
-python torchchat.py download llama3
+python3 torchchat.py download llama3
```
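A sketch of the full download flow described above, assuming you have already created a HuggingFace account and generated a user access token; the login step is standard `huggingface-cli` usage and is not part of this diff:

```
# install the HuggingFace CLI as described above and log in with your access token
pip install huggingface-cli
huggingface-cli login

# fetch the llama3 weights through torchchat
python3 torchchat.py download llama3
```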
### Chat
Designed for interactive and conversational use.
In chat mode, the LLM engages in a back-and-forth dialogue with the user. It responds to queries, participates in discussions, provides explanations, and can adapt to the flow of conversation.

-For more information run `python torchchat.py chat --help`
+For more information run `python3 torchchat.py chat --help`

**Examples**
```
-python torchchat.py chat llama3 --tiktoken
+python3 torchchat.py chat llama3 --tiktoken
```

### Generate
Aimed at producing content based on specific prompts or instructions.
In generate mode, the LLM focuses on creating text based on a detailed prompt or instruction. This mode is often used for generating written content like articles, stories, reports, or even creative writing like poetry.

-For more information run `python torchchat.py generate --help`
+For more information run `python3 torchchat.py generate --help`
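A hedged example of generate mode with an explicit prompt; the `--prompt` and `--tiktoken` flags are assumptions based on the chat example above and are not confirmed by this diff:

```
# sketch only: flag names assumed, check `python3 torchchat.py generate --help` for the actual options
python3 torchchat.py generate llama3 --tiktoken --prompt "Write a short story about a robot learning to paint"
```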
*Running on http://127.0.0.1:5000* should be printed out on the terminal. Click the link or go to [http://127.0.0.1:5000](http://127.0.0.1:5000) on your browser to start interacting with it.
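The command that starts this local server is not shown in this diff; a hypothetical invocation, assuming torchchat exposes a `browser` subcommand alongside `chat` and `generate`:

```
# hypothetical: launch the browser-based UI, then open http://127.0.0.1:5000
python3 torchchat.py browser llama3
```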
@@ -112,19 +112,19 @@ Enter some text in the input box, then hit the enter key or click the “SEND”
### Eval
Uses lm_eval library to evaluate model accuracy on a variety of tasks. Defaults to wikitext and can be manually controlled using the tasks and limit args.

-For more information run `python torchchat.py eval --help`
+For more information run `python3 torchchat.py eval --help`
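A sketch of overriding the defaults with the tasks and limit args mentioned above; the exact flag spellings are assumed rather than taken from this diff:

```
# sketch only: evaluate on wikitext with a capped number of examples (flag names assumed)
python3 torchchat.py eval llama3 --tasks wikitext --limit 10
```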