[Feature] Get actual formatted prompt #8259
Comments
@JBExcoffier Have you tried using ChatAdapter.format() for this?
To get only the supposed system prompt I am using ChatAdapter.format(), but it only gives the following string:

For now I cannot access the actual messages that are sent to the language model, only that string encapsulated in a list. Indeed, I don't want to pass few_shot examples, so I leave them out, but then it raises the following error. What's the correct way to pass the inputs, please? I already have the main part of the prompt. Thanks!
I'll let an author reply, but I think this is what you want? It includes the full messages, roles included, as an OpenAI-style list of dicts. demos is []. It doesn't pass in an actual input.
Looking at your code, you may be confused about the difference between a ChatAdapter instance (which I use) and the ChatAdapter class (which you use). In case that helps clarify why you got an error.
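A minimal sketch of that difference, assuming the format(signature, demos, inputs) method exposed by adapters in dspy 2.6.x (the signature fields and the example question below are placeholders mirroring this issue):

```python
import dspy

class AnswerToQuestion(dspy.Signature):
    """Answer the question."""
    question: str = dspy.InputField(desc="Question")
    answer: str = dspy.OutputField(desc="Answer")

inputs = {"question": "My actual question ?"}

# Calling format on the class itself leaves `self` unbound, which is the
# likely source of the error above:
# dspy.ChatAdapter.format(signature=AnswerToQuestion, demos=[], inputs=inputs)

# Calling it on an instance returns the OpenAI-style list of message dicts:
adapter = dspy.ChatAdapter()
messages = adapter.format(signature=AnswerToQuestion, demos=[], inputs=inputs)
for message in messages:
    print(message["role"], "->", message["content"])
```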
Indeed, it is better when I use the instance. With the following:
I get the following answer:
So it's fine! But it is still not an actual template, since I passed an input. Without the input (an empty dict) I get an error, which is the same error I get when running your code. I am using DSPy 2.6.15.
@JBExcoffier Thanks for reporting the issue! The formatted prompt is actually a multi-turn message; one way to get it is shown in the sketch below. If you want to see a more detailed breakdown of what's happening behind the scenes, please try out MLflow tracing, which visualizes every step: https://dspy.ai/tutorials/observability/
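The exact snippet referenced here did not come through, so as a sketch only: dspy.inspect_history and the LM's history attribute are what I would reach for in dspy 2.6.x (the model name is a placeholder):

```python
import dspy

lm = dspy.LM("openai/gpt-4o-mini")  # placeholder model
dspy.configure(lm=lm)

predictor = dspy.Predict("question -> answer")
predictor(question="My actual question ?")

# Pretty-print the last LM call, including the system and user messages
# that were actually sent.
dspy.inspect_history(n=1)

# The raw request is also kept on the LM object; each history entry should
# contain the multi-turn `messages` passed to the provider.
print(lm.history[-1]["messages"])
```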
What feature would you like to see?
How can we access the actual, formatted prompt used by DSPy?
I am using the dspy Python package, version 2.6.15. For example, with the following:
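(The original snippet did not survive formatting; the following is a reconstruction of the kind of setup meant here, inferred from the field names visible in the prompt below. The model name and field descriptions are assumptions.)

```python
import dspy

class AnswerToQuestion(dspy.Signature):
    """Answer the question."""
    question: str = dspy.InputField(desc="Question")
    answer: str = dspy.OutputField(desc="Answer")

dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))  # placeholder model

answer_to_question = dspy.Predict(AnswerToQuestion)
result = answer_to_question(question="My actual question ?")
```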
How could I get the actual, formatted prompt used by DSPy in the Predict (dspy.Predict(AnswerToQuestion)), which should be something like:

{'role': 'system', 'content': 'Your input fields are:\n1. question (str): Question\n\nYour output fields are:\n1. answer (str): Answer\n\nAll interactions will be structured in the following way, with the appropriate values filled in.\n\nInputs will have the following structure:\n\n[[ ## question ## ]]\n{question}\n\nOutputs will be a JSON object with the following fields.\n\n{\n "answer": "{answer}"\n}\n\nIn adhering to this structure, your objective is: \n Answer the question.'}
And this is incomplete, since there is no {'role': 'user', 'content': 'My actual question ?'} message with my actual question. I need this because I want to be able to use the generated prompt externally (not only through the DSPy Predict call), for example for benchmarking purposes. Is this already implemented, or does it need to be added?
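For context, a sketch of the external use I have in mind, assuming the formatted messages can be obtained through a ChatAdapter instance as discussed above (the openai client call is only an illustration):

```python
import dspy
from openai import OpenAI

adapter = dspy.ChatAdapter()
messages = adapter.format(
    signature=AnswerToQuestion,  # the signature defined earlier in this issue
    demos=[],
    inputs={"question": "My actual question ?"},
)

# `messages` should contain both the system prompt and the user turn, so it
# can be sent to any chat-completions-style API for external benchmarking.
client = OpenAI()
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(response.choices[0].message.content)
```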
Thanks!
Would you like to contribute?
Additional Context
No response