At the moment, when chatting through the webui I can only call kb_chat/file_chat separately, or use the "multi-functional dialogue" (多功能对话) mode. My understanding is that the multi-functional dialogue just talks to the LLM directly; however, when I call the multi-functional dialogue directly it cannot use localdb at all. I tried the following form:
```python
import requests

# Query: "My stomach hurts; which doctors can I see on Monday?"
query_text = "我肚子疼,周一可以去看哪些医生"

# Fetch the registered tool list from the API server.
tools = list(requests.get(f"http://127.0.0.1:7861/tools").json()["data"])

data = {
    "messages": [
        {"role": "user", "content": query_text},
    ],
    "model": "glm4-chat",
    "stream": True,
    "tools": tools,
    "temperature": 0.7,
}
```
When `tools` is passed in, the conversation can call localdb; when `tools` is omitted, it just uses the bare LLM. How should I modify the code so that the multi-functional dialogue also has this capability? I would also like to single-step debug and see how the prompt preprocessing (preprocesspromt) affects this process; which part should I modify?
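For reference, here is a minimal sketch of the full request flow described above, sent as a streaming POST. The endpoint path `/chat/chat/completions` and the line-by-line streaming handling are assumptions based on a typical OpenAI-compatible deployment, not something confirmed from this project's code; substitute the route your server actually exposes.

```python
import requests

BASE_URL = "http://127.0.0.1:7861"

# Query: "My stomach hurts; which doctors can I see on Monday?"
query_text = "我肚子疼,周一可以去看哪些医生"

# Fetch the registered tools; passing them in the payload is what enables
# tool calls such as the localdb lookup described above.
tools = list(requests.get(f"{BASE_URL}/tools").json()["data"])

payload = {
    "messages": [{"role": "user", "content": query_text}],
    "model": "glm4-chat",
    "stream": True,
    "tools": tools,  # drop this key to observe the bare-LLM behaviour
    "temperature": 0.7,
}

# NOTE: this endpoint path is an assumption (a generic OpenAI-compatible
# chat-completions route), not taken from the project's documentation.
with requests.post(f"{BASE_URL}/chat/chat/completions", json=payload, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if line:
            print(line)
```

Comparing the streamed output with and without the `tools` key is a quick way to confirm whether the localdb tool is actually being invoked.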