3 Promising Paths to improve Local Llamas #2013
nikshepsvn started this conversation in Ideas
Full credit goes to @alain40 who posted this here: turboderp/exllama#92 (comment)

I wanted to raise the points he made there in this discussion, since this is a larger forum for getting more input. I've pasted a screenshot of his message for convenience:
Personally, the third point is the most interesting to me: if the base model can pick up and use LoRAs in a tool-like fashion (much like plugins in LangChain today), we could have a base model that adapts its responses to various datasets in almost real time. I'm going to dive deeper into this as well.
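
To make that idea a bit more concrete, here's a rough sketch of what "LoRAs as tools" could look like today. It assumes the Hugging Face PEFT library (not exllama's LoRA loading), and the adapter repo names and the keyword-based router are purely hypothetical placeholders; in practice the routing step could itself be a model call, the same way LangChain selects a plugin:

```python
# Rough sketch: route a prompt to one of several LoRA adapters at inference
# time, assuming the Hugging Face PEFT library. Adapter names below are
# hypothetical placeholders, not real repos.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE = "meta-llama/Llama-2-7b-hf"  # any LoRA-compatible base model

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(BASE, device_map="auto")

# Load several adapters up front; only one is active at a time.
model = PeftModel.from_pretrained(model, "your-org/sql-lora", adapter_name="sql")  # hypothetical
model.load_adapter("your-org/medical-lora", adapter_name="medical")                # hypothetical
model.load_adapter("your-org/legal-lora", adapter_name="legal")                    # hypothetical

def pick_adapter(prompt: str) -> str:
    """Toy keyword 'router' -- in practice this could be a model call,
    like how LangChain picks a tool/plugin."""
    p = prompt.lower()
    if "select" in p or "table" in p:
        return "sql"
    if "diagnosis" in p or "symptom" in p:
        return "medical"
    return "legal"

def generate(prompt: str) -> str:
    model.set_adapter(pick_adapter(prompt))  # swap LoRA without reloading the base model
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=200)
    return tokenizer.decode(out[0], skip_special_tokens=True)

print(generate("Write a SELECT query that joins the users and orders tables."))
```

The appeal is that the base model's weights stay resident and only the (small) adapter weights change per request, which is what would make "almost real time" adaptation plausible.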