
Conversation

Robin7831

Support normalized output in thinking mode for hybrid thinking/non-thinking models deployed with frameworks such as vLLM


cla-assistant bot commented Sep 1, 2025

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.


dingyi does not appear to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.


@zijiren233
Member

Is this about splitting out the think content? Take a look at the thinksplit plugin that is already implemented: https://github.com/labring/aiproxy/tree/main/core/relay/plugin/thinksplit
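For context, the splitting such a plugin performs amounts to pulling a `<think>...</think>` block out of the model output into a separate reasoning field. A minimal sketch of that idea, assuming non-streamed output; the function and behavior below are illustrative, not the actual thinksplit API:

```go
package main

import (
	"fmt"
	"strings"
)

// splitThink returns the reasoning text inside a leading <think>...</think>
// block (if present) and the remaining answer content.
func splitThink(content string) (reasoning, answer string) {
	const openTag, closeTag = "<think>", "</think>"
	start := strings.Index(content, openTag)
	end := strings.Index(content, closeTag)
	if start == -1 || end == -1 || end < start {
		return "", content // no think block: treat everything as the answer
	}
	reasoning = strings.TrimSpace(content[start+len(openTag) : end])
	answer = strings.TrimSpace(content[end+len(closeTag):])
	return reasoning, answer
}

func main() {
	r, a := splitThink("<think>step 1: consider the question...</think>The answer is 42.")
	fmt.Println("reasoning:", r)
	fmt.Println("answer:", a)
}
```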

@Robin7831
Author

> Is this about splitting out the think content? Take a look at the thinksplit plugin that is already implemented: https://github.com/labring/aiproxy/tree/main/core/relay/plugin/thinksplit

Right, but that plugin only works for reasoning models (i.e., models whose output always carries think tags). For hybrid models such as Qwen3 or the newly released DS-V3.1, which need an extra field in the request to control whether thinking content is emitted, it doesn't seem to work.
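For illustration, the extra switch referred to here is a model- and framework-specific request field. For a vLLM-deployed Qwen3 it is typically `chat_template_kwargs.enable_thinking`; other models or deployments may use a different field, so the names below are an assumption rather than a fixed API:

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Hypothetical chat-completions request body for a vLLM-deployed Qwen3.
	req := map[string]any{
		"model": "qwen3-32b",
		"messages": []map[string]string{
			{"role": "user", "content": "Hello"},
		},
		// The switch the discussion is about: without it the hybrid model
		// decides on its own whether to emit <think> content.
		"chat_template_kwargs": map[string]any{"enable_thinking": true},
	}
	body, _ := json.MarshalIndent(req, "", "  ")
	fmt.Println(string(body))
}
```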

@zijiren233
Member

To control thinking, use UnmarshalGeneralThinkingFromNode. Some adaptors already implement converting a generic thinking parameter into the model-specific one, for example Ali's Qwen.
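A hypothetical sketch of that adaptor-side idea: read a framework-agnostic thinking switch from the incoming request and rewrite it into the field the target model understands. UnmarshalGeneralThinkingFromNode is the helper named above, but its real signature is not reproduced here; `mapGeneralThinking` and the field names below are purely illustrative:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// generalThinking is an assumed, framework-agnostic representation of the
// caller's intent (an "enable_thinking" flag in the proxy request).
type generalThinking struct {
	Enabled *bool `json:"enable_thinking,omitempty"`
}

// mapGeneralThinking rewrites the generic flag into Qwen-style request
// fields (chat_template_kwargs.enable_thinking) before forwarding upstream.
func mapGeneralThinking(req map[string]any) map[string]any {
	raw, _ := json.Marshal(req)
	var gt generalThinking
	_ = json.Unmarshal(raw, &gt)
	if gt.Enabled != nil {
		delete(req, "enable_thinking")
		req["chat_template_kwargs"] = map[string]any{"enable_thinking": *gt.Enabled}
	}
	return req
}

func main() {
	req := map[string]any{"model": "qwen3-32b", "enable_thinking": false}
	out, _ := json.MarshalIndent(mapGeneralThinking(req), "", "  ")
	fmt.Println(string(out))
}
```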
