
Fix issue #4604: '[Bug]: Disable function calling for DeepSeek' #4616

Draft
enyst wants to merge 1 commit into main
Conversation

@enyst (Collaborator) commented Oct 29, 2024

End-user friendly description of the problem this fixes or functionality that this introduces

  • Include this change in the Release Notes. If checked, you must provide an end-user friendly description for your change below

Give a summary of what the PR does, explaining any non-trivial design decisions

This PR attempts to fix DeepSeek not working with the expected function calling format.


Link to any specific issues this addresses

@enyst enyst marked this pull request as draft October 29, 2024 20:11
"""
# Set function calling support for DeepSeek models
if 'deepseek' in self.model.lower():
self.supports_function_calling = True
@enyst (Collaborator, Author):

The changes to this file are not necessary. You can revert this file to the main branch version. Make sure to revert only this file, then commit.
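
For reference, reverting a single file to its main branch version can be done like this (the actual file path isn't shown in this excerpt, so the one below is a placeholder):

```bash
# Restore only this one file to its state on main, then commit that single change.
git checkout main -- path/to/file.py   # placeholder path
git commit -m "Revert file to the main branch version"
```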

```
@@ -80,17 +80,32 @@ def serialize_model(self) -> dict:
        elif isinstance(item, ImageContent) and self.vision_enabled:
            content.extend(d)

        # For DeepSeek, we need to ensure content is a string when using tool calls
```
@enyst (Collaborator, Author):

For DeepSeek we need to ensure content is a string. But this change in message.py applies to all LLMs supported by liteLLM! That is a serious problem. We need to make the content into a single string ONLY for some LLMs:

  • those that don't support lists of dicts in content, but require a string instead.

This is important! The current function calling implementation uses the format of Anthropic Claude variants. Think again about what we need to do to keep it working for Anthropic, and implement it differently for DeepSeek, while NOT breaking everything else.

This can help:

  • verify how we define SUPPORT_PROMPT_CACHE_MODELS in llm.py
  • verify how we use such module globals in llm.py
  • DeepSeek might work with function calling if it gets the right format, with a single string for content, at least for the system message. Make a special case this time, so that you don't break everything else.
  • there are LLMs that don't work with prompt caching, vision, or function calling. For those, add a second serialization algorithm that turns their contents into one content string per message (not a list of contents); see the sketch after this list.
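
A minimal sketch of that direction (the names FUNCTION_CALLING_STRING_CONTENT_MODELS and serialize_content_as_string are hypothetical, chosen to mirror how SUPPORT_PROMPT_CACHE_MODELS is defined as a module global in llm.py):

```python
# llm.py -- hypothetical module global, in the style of SUPPORT_PROMPT_CACHE_MODELS:
# models that require plain-string message content for function calling.
FUNCTION_CALLING_STRING_CONTENT_MODELS = [
    'deepseek',
]


def needs_string_content(model: str) -> bool:
    """Return True if this model needs one string per message, not a list of dicts."""
    return any(name in model.lower() for name in FUNCTION_CALLING_STRING_CONTENT_MODELS)


# message.py -- hypothetical second serialization path: flatten the usual
# list-of-dicts content into a single string, keeping only the text parts.
def serialize_content_as_string(content: list[dict]) -> str:
    return '\n'.join(
        item.get('text', '') for item in content if item.get('type') == 'text'
    )
```

The Anthropic-style list serialization stays the default; the string path triggers only for models in the list, so Claude variants keep working unchanged.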

```python
            'arguments': tc.function.arguments
        }
    } for tc in self.tool_calls
]
```
@enyst (Collaborator, Author):

This change also breaks everything else when trying to make it work for DeepSeek. Use a special case or an attribute for DeepSeek rather than this.
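
One way to express that special case, as a rough sketch rather than the actual OpenHands classes (the Message fields and the force_string_content flag below are illustrative):

```python
from dataclasses import dataclass, field


@dataclass
class Message:
    role: str
    content: list[dict] = field(default_factory=list)
    # Hypothetical attribute: set True only for models like DeepSeek that
    # need plain-string content; all other models keep the list format.
    force_string_content: bool = False

    def serialize(self) -> dict:
        if self.force_string_content:
            # Flatten the text parts into one string for DeepSeek-like models.
            text = '\n'.join(
                item.get('text', '')
                for item in self.content
                if item.get('type') == 'text'
            )
            return {'role': self.role, 'content': text}
        # Default path: unchanged list-of-dicts serialization.
        return {'role': self.role, 'content': self.content}
```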

```diff
-    self.model_info is not None
-    and self.model_info.get('supports_function_calling', False)
+    (self.model_info is not None and self.model_info.get('supports_function_calling', False))
+    or 'deepseek' in self.config.model.lower()  # DeepSeek models support function calling
```
@enyst (Collaborator, Author):

You don't need to check if DeepSeek supports function calling. Let's assume model_info already returns True (it does).
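
In other words, the original condition is sufficient on its own (a sketch, assuming model_info comes from litellm's get_model_info as elsewhere in llm.py):

```python
import litellm

# litellm's model metadata already reports supports_function_calling=True
# for DeepSeek, so no extra model-name check is needed.
model_info = litellm.get_model_info('deepseek/deepseek-chat')
supports_fn_calling = (
    model_info is not None
    and model_info.get('supports_function_calling', False)
)
```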

@enyst added the fix-me label (Attempt to fix this issue with OpenHands) on Oct 29, 2024