How can I retrieve the full OpenAI response (including top_logprobs) when calling Langchain's ChatOpenAI? #17634

Answered by dosubot bot
EgorKraevTransferwise asked this question in Q&A

🤖

Hey @EgorKraevTransferwise, great to see you back here! Diving into another interesting challenge, I see. Let's see what we can do about that.

Based on the context provided, it seems you're looking to retrieve the full OpenAI response, including top_logprobs, when using the ChatOpenAI model within the LangChain framework.

In LangChain, you can use the logprobs=True parameter when calling the generate or agenerate method. This will include the logprobs in the generation_info of the response. Here's an example:

# Note: import paths vary across LangChain versions. In recent releases
# ChatOpenAI lives in langchain_openai and message classes in
# langchain_core.messages (neither is in langchain_community.llms.openai).
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

llm = ChatOpenAI()
# The original snippet was truncated here; the prompt below is illustrative.
result = llm.generate([[HumanMessage(content="I love programming.")]], logprobs=True)
print(result.generations[0][0].generation_info["logprobs"])
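To get top_logprobs specifically (not just per-token logprobs), you also have to request them. In recent langchain-openai releases the documented route is to bind the parameters, e.g. `llm = ChatOpenAI().bind(logprobs=True, top_logprobs=3)` and then read `msg.response_metadata["logprobs"]` after `invoke`. The shape of that payload, and how to pull the alternatives out of it, can be shown without a live API call. The sample dict below is an illustrative stand-in mirroring the OpenAI chat-completions logprobs schema, not a real response, and `top_alternatives` is a hypothetical helper name:

```python
# Sketch: parsing the logprobs payload that ChatOpenAI surfaces (in
# generation_info or response_metadata) when the request is made with
# logprobs=True and top_logprobs=N.
# `sample_logprobs` is an illustrative payload, not a real API response.
sample_logprobs = {
    "content": [
        {
            "token": "Hello",
            "logprob": -0.02,
            "top_logprobs": [
                {"token": "Hello", "logprob": -0.02},
                {"token": "Hi", "logprob": -4.1},
            ],
        },
    ]
}

def top_alternatives(logprobs_payload):
    # Map each generated token to its (token, logprob) alternatives.
    return {
        entry["token"]: [
            (alt["token"], alt["logprob"]) for alt in entry["top_logprobs"]
        ]
        for entry in logprobs_payload["content"]
    }

print(top_alternatives(sample_logprobs))
# e.g. {'Hello': [('Hello', -0.02), ('Hi', -4.1)]}
```

Note that top_logprobs is empty unless you pass top_logprobs=N alongside logprobs=True; logprobs=True alone returns only the chosen token's log probability.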

Answer selected by EgorKraevTransferwise