
Azure OpenAI content streaming with asynchronous filter still streams responses in large bursts #22246

Answered by chrislrobert
chrislrobert asked this question in Q&A
@dosu, that wasn't required.

To anybody else who struggles with this: changing the `openai_api_version` parameter from `2023-05-15` to `2024-02-01` resolved the issue for me. For whatever reason, the older API version didn't support the newer asynchronous filter (unbuffered streaming) functionality, so responses still arrived in large buffered bursts.
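
For reference, here is a minimal sketch of where that parameter is set, assuming LangChain's `AzureChatOpenAI` client (the endpoint and deployment names below are placeholders, and the API key is read from the `AZURE_OPENAI_API_KEY` environment variable):

```python
# Minimal sketch, assuming langchain-openai's AzureChatOpenAI.
# Endpoint and deployment names are placeholders; adjust for your resource.
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
    azure_deployment="<your-deployment-name>",                   # placeholder
    openai_api_version="2024-02-01",  # older "2023-05-15" buffered streamed output into bursts
    streaming=True,
)

# With the asynchronous content filter enabled on the Azure deployment and the
# newer API version, chunks should arrive incrementally rather than in bursts.
for chunk in llm.stream("Tell me a short story."):
    print(chunk.content, end="", flush=True)
```

Note that the API version is the only change relative to a setup that streamed in bursts; the asynchronous filter itself is configured on the Azure deployment, not in the client code.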

Answer selected by chrislrobert