
🌊 feat: add streaming support for o1 models #4760

Merged: 2 commits into danny-avila:main on Nov 20, 2024

Conversation

hongkai-neu
Contributor

Add streaming support for o1 models

Issue #4759: Enhancement: Add streaming support for o1-preview and o1-mini

Summary

This PR adds streaming support for o1 models by removing the conditional in OpenAIClient.js that deleted the modelOptions.stream and modelOptions.stop parameters whenever the model was an o1 model.
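For illustration, a rough sketch of the kind of change described above (not the exact diff; isO1Model and the surrounding structure are assumed names, not the exact identifiers in OpenAIClient.js):

```js
// Before (sketch): streaming was disabled for o1 models by stripping these options
// before the request was sent.
if (isO1Model) {
  delete modelOptions.stream;
  delete modelOptions.stop;
}

// After (sketch): the conditional is removed, so `stream` and `stop` are passed
// through to the API for o1 models the same way as for other models.
```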

Change Type

  • New feature (non-breaking change which adds functionality)

Testing

Backend unit tests pass:
npm run test:api


Checklist

  • My code adheres to this project's style guidelines
  • I have performed a self-review of my own code

@hongkai-neu
Contributor Author

Streaming with o1-mini and o1-preview works fine with OpenAI/OpenRouter, but it hasn't been tested with Azure OpenAI because I don't have API access.
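For reference, a minimal sketch of the kind of streaming request this change enables, written against the official openai Node SDK for brevity (an assumption for illustration; LibreChat itself routes these calls through its own OpenAIClient):

```js
import OpenAI from 'openai';

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// With `stream` no longer stripped for o1 models, a call like this yields
// incremental deltas instead of a single blocking response.
const stream = await client.chat.completions.create({
  model: 'o1-mini',
  messages: [{ role: 'user', content: 'Write a haiku about streams.' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```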

@danny-avila changed the title from "feat: add streaming support for o1 models" to "🌊 feat: add streaming support for o1 models" on Nov 20, 2024
@danny-avila merged commit 7d5be68 into danny-avila:main on Nov 20, 2024
1 check passed
@hongkai-neu deleted the streaming-support-o1 branch on November 21, 2024
owengo pushed a commit to openwengo/LibreChat that referenced this pull request Nov 26, 2024