Replies: 2 comments 4 replies
- Hi, to clarify, are you talking about using ./server?
- Currently, llama.cpp is designed to handle only one client at a time. At least, the server example is not designed to handle multiple requests simultaneously.
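A minimal client-side sketch of what single-client handling implies, assuming the server example is already running on 127.0.0.1:8080 and exposes the `/completion` endpoint: send requests strictly one after another instead of in parallel, so the server never sees two clients at once.

```shell
#!/bin/sh
# Sketch: serialize prompts to the llama.cpp server example.
# Assumes ./server is listening on 127.0.0.1:8080; the loop
# waits for each completion before sending the next prompt,
# since concurrent requests are not supported.
for prompt in "Hello" "Tell me a joke"; do
  curl -s http://127.0.0.1:8080/completion \
    -d "{\"prompt\": \"$prompt\", \"n_predict\": 16}"
done
```

If several users need access at once, a common workaround along these lines is a small front-end queue that forwards one request at a time to the server.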
- If I successfully created a website using this to power an AI chatbot, can other people on other computers use the AI on the website? They say this AI is installed "locally" and I am not sure what that really means.
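For context on "locally": the model weights and inference run on your own machine, but the server example is still an ordinary HTTP server, so other computers can reach it if it binds to a network-visible address. A hypothetical invocation sketch (the model path is a placeholder; flag names are the ones the server example documents):

```shell
# Sketch: run the llama.cpp server example so that other machines
# on the same network can reach it. Binding to 0.0.0.0 listens on
# all interfaces instead of loopback only; clients then connect to
# this machine's LAN IP on port 8080.
./server -m models/model.gguf --host 0.0.0.0 --port 8080
```

Making it reachable from the public internet would additionally require port forwarding or a reverse proxy, which is outside what the server example itself handles.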