2 files changed, +7 -3 lines changed.

First file, the documentation section on passing a model instance for the chat model and the embedding model:

```diff
@@ -132,10 +132,12 @@ We can also pass a model instance for the chat model and the embedding model. Fo
     azure_deployment="AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME",
     openai_api_version="AZURE_OPENAI_API_VERSION",
 )
-
+# Supposing model_tokens are 100K
+model_tokens_count = 100000
 graph_config = {
     "llm": {
-        "model_instance": llm_model_instance
+        "model_instance": llm_model_instance,
+        "model_tokens": model_tokens_count,
     },
     "embeddings": {
         "model_instance": embedder_model_instance
@@ -191,4 +193,4 @@ We can also pass a model instance for the chat model and the embedding model. Fo
     "embeddings": {
         "model_instance": embedder_model_instance
     }
-}
+}
```
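For context, here is a minimal end-to-end sketch of how a configuration like the one above might be wired up. It assumes langchain_openai's AzureChatOpenAI and AzureOpenAIEmbeddings clients and ScrapeGraphAI's SmartScraperGraph; the environment variable names, deployment names, prompt, and source URL are illustrative placeholders rather than values taken from this PR.

```python
import os

from langchain_openai import AzureChatOpenAI, AzureOpenAIEmbeddings
from scrapegraphai.graphs import SmartScraperGraph

# Azure credentials (AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT) are picked up
# from the environment by the langchain_openai clients; deployment names are placeholders.
llm_model_instance = AzureChatOpenAI(
    azure_deployment=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"],
    openai_api_version=os.environ["AZURE_OPENAI_API_VERSION"],
)
embedder_model_instance = AzureOpenAIEmbeddings(
    azure_deployment=os.environ["AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME"],
    openai_api_version=os.environ["AZURE_OPENAI_API_VERSION"],
)

# Supposing the deployed chat model has a 100K-token context window.
model_tokens_count = 100000

graph_config = {
    "llm": {
        "model_instance": llm_model_instance,
        "model_tokens": model_tokens_count,
    },
    "embeddings": {
        "model_instance": embedder_model_instance,
    },
}

# Hypothetical prompt and source, shown only to illustrate where graph_config is consumed.
smart_scraper_graph = SmartScraperGraph(
    prompt="List all the article titles on this page",
    source="https://example.com/blog",
    config=graph_config,
)
print(smart_scraper_graph.run())
```

Passing `model_tokens` alongside `model_instance` matters here because, with an arbitrary model instance, the library presumably cannot infer the context window on its own and needs it stated explicitly.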
Second file, a prompt template that extracts a web-search query from the user prompt:

```diff
@@ -11,4 +11,6 @@
 For example, if the user prompt is "What is the capital of France?",
 you should return "capital of France". \n
 If you return something else, you will get a really bad grade. \n
+What you return should be sufficient to get the answer from the internet. \n
+Don't just return a small part of the prompt, unless that is sufficient. \n
 USER PROMPT: {user_prompt}"""
```
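To make the intent of the two added lines concrete, here is a small sketch of how a template like this is typically filled in before being sent to the model. The template text below is an abridged stand-in for the real file, and the use of LangChain's PromptTemplate is an assumption about how it is consumed.

```python
from langchain_core.prompts import PromptTemplate

# Abridged stand-in for the edited template; only the lines shown in the diff are kept.
template = (
    'For example, if the user prompt is "What is the capital of France?",\n'
    'you should return "capital of France". \\n\n'
    "If you return something else, you will get a really bad grade. \\n\n"
    "What you return should be sufficient to get the answer from the internet. \\n\n"
    "Don't just return a small part of the prompt, unless that is sufficient. \\n\n"
    "USER PROMPT: {user_prompt}"
)

prompt = PromptTemplate(template=template, input_variables=["user_prompt"])
print(prompt.format(user_prompt="Who wrote the novel One Hundred Years of Solitude?"))
# With the two added lines, the model is steered toward returning a self-sufficient
# query such as "author of One Hundred Years of Solitude" rather than a fragment
# like "the novel", which would not be enough to search the internet with.
```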