Can the qwen2.5-coder-32b-instruct model be used for both chat Q&A and FIM code completion? Or is FIM completion only supported with the qwen2.5-coder-32b-base model?
Also, I see that only quantized versions of qwen2.5-coder-32b-instruct have been released officially. Is there an official quantized version of the 32b-base model?
Hi, you can use Qwen2.5-Coder-32B-Instruct in either chat or FIM format.
However, Qwen2.5-Coder-32B-Instruct may perform slightly worse than the base model.
We have not released a quantized version of the 32B base model.
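For reference, the two input formats look roughly like this (a minimal sketch; the FIM special tokens below are the ones documented for Qwen2.5-Coder, and the code snippet content is only illustrative):

```python
# Chat-style input: a message list, rendered via the tokenizer's chat template (ChatML)
messages = [
    {"role": "user", "content": "Write a quicksort function in Python."},
]

# FIM-style input: raw text with the FIM special tokens; the model fills in
# the code that belongs between the prefix and the suffix after <|fim_middle|>
fim_prompt = (
    "<|fim_prefix|>def quicksort(arr):\n"
    "    if len(arr) <= 1:\n"
    "        return arr\n"
    "<|fim_suffix|>\n"
    "    return quicksort(left) + middle + quicksort(right)\n"
    "<|fim_middle|>"
)
```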
Yes, the instruct model can do some completion, but its FIM completion quality is not as good as the base model's. If you don't include markers like <|repo_name|> and the prompt is just a comment, it tends to output natural-language answers instead of code.
Maybe you can try using the instruct model while keeping the same FIM structure as the base model, rather than converting the input to ChatML format.
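A minimal sketch of that approach, assuming the standard Hugging Face transformers API and the Qwen2.5-Coder FIM tokens (the example function and generation parameters are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-Coder-32B-Instruct"  # instruct model, driven in raw FIM mode
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Build the prompt with the base model's FIM tokens instead of apply_chat_template
prompt = (
    "<|fim_prefix|>def binary_search(arr, target):\n"
    "    lo, hi = 0, len(arr) - 1\n"
    "    while lo <= hi:\n"
    "<|fim_suffix|>\n"
    "    return -1\n"
    "<|fim_middle|>"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, i.e. the filled-in middle section
completion = tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)
print(completion)
```

Keeping the raw FIM token layout avoids wrapping the request in a chat turn, which is what pushes the instruct model toward natural-language replies.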