This repository was archived by the owner on Apr 3, 2024. It is now read-only.
Fix various "bad" prompt-templates that potentially lead to odd results (qa-with-sources, qa) #34
I was loading qa-with-sources prompt templates from LangChainHub when I noticed I was getting odd answers. I tracked it down, and it looks like some prompt templates were added incorrectly and included an entire answer rather than just the template.
This PR replaces the corrupted prompt templates with what appears to be the proper template.
Support in testing, or advice on the easiest way to test this, would be appreciated. From my brief overview of how prompt-template loading works, I assume this should fix some issues, but I have not actually tested it.
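As a rough starting point for testing, here is a minimal, hedged sketch of a sanity check over a LangChainHub-style prompt-template JSON file. It assumes the common template layout (an `input_variables` list plus a `template` string) and uses a guessed length threshold as a heuristic for templates that embed a full answer; the function name and threshold are my own inventions, not part of langchain.

```python
import json

def check_template(raw: str) -> list[str]:
    """Return a list of problems found in a prompt-template JSON string."""
    data = json.loads(raw)
    template = data.get("template", "")
    problems = []
    # Every declared input variable should appear as a {placeholder}.
    for var in data.get("input_variables", []):
        if "{" + var + "}" not in template:
            problems.append(f"missing placeholder {{{var}}}")
    # Heuristic: a corrupted template that embeds an entire answer tends to
    # be far longer than the instruction text alone (threshold is a guess).
    if len(template) > 2000:
        problems.append("template unusually long; may embed a full answer")
    return problems

# Example with an assumed qa-with-sources-style template:
good = json.dumps({
    "input_variables": ["summaries", "question"],
    "template": (
        "Given the following extracted parts of a long document and a "
        "question, create a final answer with references (\"SOURCES\").\n\n"
        "QUESTION: {question}\n=========\n{summaries}\n=========\n"
        "FINAL ANSWER:"
    ),
})
print(check_template(good))  # → []
```

Running this over each changed template file in the PR would at least catch placeholders dropped or replaced by baked-in answer text, without needing to invoke a model.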
Issues reported in the langchain repository that could potentially be related:
- Hallucinating Question about Michael Jackson #2510
- RuntimeError: Failed to tokenize (LlamaCpp and QAWithSourcesChain) #2645
- Issue with VectorDBQAWithSourcesChain and chain_type="stuff" #1326