Added CustomOpenaiCallback to ensure exclusive access to nested data. #670

Conversation

LorenzoPaleari
Contributor

See #576 for a detailed explanation.

Collaborator

@VinciGit00 VinciGit00 left a comment

Thank you, it looks good. Do you have an idea of how to implement it for the other providers? I would really like to have it for Mistral and Bedrock.

@VinciGit00 VinciGit00 merged commit d7afdb1 into ScrapeGraphAI:pre/beta Sep 14, 2024

🎉 This PR is included in version 1.19.0-beta.12 🎉

The release is available on:

Your semantic-release bot 📦🚀


🎉 This PR is included in version 1.20.0-beta.1 🎉

The release is available on:

Your semantic-release bot 📦🚀

@LorenzoPaleari
Contributor Author

Thank you, it looks good. Do you have an idea of how to implement it for the other providers? I would really like to have it for Mistral and Bedrock.

I have taken a look at it.
From my understanding it already works for all LLMs just by using openai_callback; the only downside is that it cannot report the cost for models other than OpenAI's.

I tested it with Mistral and it output the number of tokens used correctly, but without the price.

To add pricing we can mimic the OpenAICallbackHandler class; adding model names and their costs should suffice to make the callback work in any situation.
https://api.python.langchain.com/en/latest/_modules/langchain_community/callbacks/openai_info.html#OpenAICallbackHandler
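A minimal sketch of what that could look like: a per-model cost table plus a lookup helper, in the style of the MODEL_COST_PER_1K_TOKENS table inside langchain's OpenAICallbackHandler. The model names and prices below are illustrative placeholders, not real pricing data, and this is a standalone sketch rather than the actual callback implementation.

```python
# Sketch of a provider-agnostic cost table, modeled on the one used by
# langchain_community's OpenAICallbackHandler. All model names and
# prices here are hypothetical examples.

# Cost in USD per 1k tokens, as (prompt_rate, completion_rate).
MODEL_COST_PER_1K_TOKENS = {
    "mistral-large": (0.004, 0.012),          # placeholder pricing
    "anthropic.claude-v2": (0.008, 0.024),    # Bedrock-style model id, placeholder pricing
}


def get_token_cost(model_name: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Return the total cost for a call, or 0.0 for unknown models.

    Unknown models still have their tokens counted by the callback;
    only the price falls back to zero, matching the behavior described above.
    """
    if model_name not in MODEL_COST_PER_1K_TOKENS:
        return 0.0
    prompt_rate, completion_rate = MODEL_COST_PER_1K_TOKENS[model_name]
    return (prompt_tokens / 1000.0) * prompt_rate + (completion_tokens / 1000.0) * completion_rate
```

A callback subclass would then call a helper like this from its on_llm_end hook to accumulate total_cost alongside the token counts it already tracks.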

@VinciGit00
Collaborator

Ok, thanks. Is it ok if you implement it?

@LorenzoPaleari
Contributor Author

@VinciGit00 Yes, I can do that; I will probably do it at the end of the week.
I will also try playing around with GraphIterator to give a more detailed view of where all these used tokens go. I'm not sure if I can end up with a good result without interfering with function calls; I will let you know.

@VinciGit00
Collaborator

Ok, thank you so much for the effort.


🎉 This PR is included in version 1.21.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
