
[BUG] How to use langchain-chatchat with LocalAI #5064

Open
TZJ12 opened this issue Nov 8, 2024 · 0 comments
Labels
bug Something isn't working

Comments


TZJ12 commented Nov 8, 2024

Problem Description
Is it possible to use chatchat together with LocalAI, i.e. so that chatchat can connect to any OpenAI-style model endpoint without needing a restart?

Steps to Reproduce

Expected Result
After starting another model with vLLM, it should be possible to connect to the new model without editing model_config and restarting.

Actual Result
Currently, a model can only be connected if it appears in the online model list in model_config.

Environment Information

  • Langchain-Chatchat version: 2.10

  • Deployment method: from source

  • Model inference framework (Xinference / Ollama / OpenAI API, etc.): OpenAI API

  • LLM model (GLM-4-9B / Qwen2-7B-Instruct, etc.): a self-finetuned model

  • Embedding model (bge-large-zh-v1.5 / m3e-base, etc.): bge-large-zh-v1.5

  • Vector store (faiss / milvus / pg_vector, etc.): faiss

  • Operating system and version: CentOS 7

  • Python version: 3.10.12

  • Inference hardware (GPU / CPU / MPS / NPU, etc.): GPU
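The behavior requested above — discovering available models at runtime instead of reading a static model_config — can be sketched against the OpenAI-compatible REST surface that both LocalAI and vLLM expose (`/v1/models` and `/v1/chat/completions`). This is a minimal illustration only; the helper names are hypothetical and this is not Langchain-Chatchat's actual API.

```python
# Sketch: query an OpenAI-compatible gateway (e.g. LocalAI or a vLLM server)
# for its model list at runtime, then build a standard chat request for one
# of the discovered models. Helper names here are assumptions for illustration.
import json
import urllib.request


def parse_model_ids(models_json: dict) -> list[str]:
    """Extract model IDs from an OpenAI-style /v1/models response body."""
    return [m["id"] for m in models_json.get("data", [])]


def list_models(base_url: str) -> list[str]:
    """Fetch the live model list from an OpenAI-compatible endpoint."""
    with urllib.request.urlopen(f"{base_url}/v1/models") as resp:
        return parse_model_ids(json.load(resp))


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build a standard OpenAI chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
```

With a setup like this, a newly started vLLM instance would show up in `list_models()` on the next call, so no config edit or restart is needed to address it.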

TZJ12 added the bug label on Nov 8, 2024