merge master to dev
This commit is contained in: commit dd7223b3a9

@@ -43,7 +43,7 @@
 🚩 This project does not itself involve any fine-tuning or training, but fine-tuning or training can be used to optimize its results.

-🌐 The code used by version `v8` of the [AutoDL image](https://www.codewithgpu.com/i/chatchat-space/Langchain-Chatchat/Langchain-Chatchat) has been updated to version `v0.2.4` of this project.
+🌐 The code used by version `v9` of the [AutoDL image](https://www.codewithgpu.com/i/chatchat-space/Langchain-Chatchat/Langchain-Chatchat) has been updated to version `v0.2.5` of this project.

 🐳 The [Docker image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.3) has been updated to version `0.2.3`; install from source if you want to try the latest changes.

@@ -87,7 +87,7 @@ $ pip install -r requirements_webui.txt
 To run this project locally or in an offline environment, first download the required models to your machine. Open-source LLM and Embedding models can usually be downloaded from [HuggingFace](https://huggingface.co/models).

-Take the project's default LLM model [THUDM/chatglm2-6b](https://huggingface.co/THUDM/chatglm2-6b) and Embedding model [moka-ai/m3e-base](https://huggingface.co/moka-ai/m3e-base) as an example:
+Take the project's default LLM model [THUDM/ChatGLM2-6B](https://huggingface.co/THUDM/chatglm2-6b) and Embedding model [moka-ai/m3e-base](https://huggingface.co/moka-ai/m3e-base) as an example:

 To download the models, first [install Git LFS](https://docs.github.com/zh/repositories/working-with-files/managing-large-files/installing-git-large-file-storage), then run
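
The download commands themselves fall outside this hunk. As a hedged illustration only: the README's own approach is `git lfs install` followed by `git clone` of the two repositories above, while the sketch below fetches the same models with the `huggingface_hub` client instead. The target directories are placeholders, and a `huggingface_hub` release that supports `local_dir` is assumed.

```python
# Hedged sketch: an alternative to the README's `git lfs install` + `git clone`
# workflow. Target directories are placeholders, not project defaults.
from huggingface_hub import snapshot_download

# Fetch the default LLM and Embedding models named in the README.
snapshot_download(repo_id="THUDM/chatglm2-6b", local_dir="models/chatglm2-6b")
snapshot_download(repo_id="moka-ai/m3e-base", local_dir="models/m3e-base")
```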

@@ -113,11 +113,11 @@ $ python startup.py -a
 If the project starts up correctly, you will see the following interfaces:

-1. FastAPI docs interface
+1. FastAPI Docs interface

-
+

-2. Example of the webui startup pages:
+2. Example of the Web UI startup pages:

 - Web UI chat page:

@@ -41,9 +41,9 @@ The main process analysis from the aspect of document process:
 🚩 Training and fine-tuning are not part of this project, but performance can still be improved by doing either.

-🌐 The [AutoDL image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.0) is supported; in `v7` the code is updated to `v0.2.3`.
+🌐 The [AutoDL image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.5) is supported; in `v9` the code is updated to `v0.2.5`.

-🐳 [Docker image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.0)
+🐳 [Docker image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.5)

 ## Pain Points Addressed

@@ -5,4 +5,4 @@ from .server_config import *
 from .prompt_config import *


-VERSION = "v0.2.6-preview"
+VERSION = "v0.2.6"
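
Since this hunk finalizes the package-level version string, a small usage sketch follows. It assumes, which the diff itself does not state, that the edited file is the project's `configs` package `__init__.py`, so the constant is importable elsewhere.

```python
# Minimal sketch, assuming the edited file is the configs package __init__.py
# (an assumption; the diff does not name the file). Other modules could then do:
from configs import VERSION

print(f"Langchain-Chatchat {VERSION}")  # expected to print "Langchain-Chatchat v0.2.6"
```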

[Binary image files changed; contents not shown. One image replaced (4.1 MiB before and after); three images added (108 KiB, 188 KiB, 240 KiB).]

@@ -345,8 +345,8 @@ def get_model_path(model_name: str, type: str = None) -> Optional[str]:
             return str(path)
     return path_str  # THUDM/chatglm06b

-# Get service information from server_config


+# Get service information from server_config
 def get_model_worker_config(model_name: str = None) -> dict:
     '''
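
The hunk above only moves a comment next to `get_model_worker_config`; for orientation, here is a hedged usage sketch. The import path and the inspected key are assumptions based on the signature shown, not details confirmed by this diff.

```python
# Hedged sketch of calling the helper shown above. The module path and the
# "model_path" key are assumptions; adjust them to the project's actual layout.
from server.utils import get_model_worker_config  # assumed import location

worker_config = get_model_worker_config("chatglm2-6b")  # signature above returns a dict
print(sorted(worker_config))             # list which settings were merged in
print(worker_config.get("model_path"))   # assumed key; may differ in the real config
```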

@@ -170,6 +170,8 @@ def dialogue_page(api: ApiRequest):
                 key="selected_kb",
             )
             kb_top_k = st.number_input("匹配知识条数:", 1, 20, VECTOR_SEARCH_TOP_K)
+
+            ## BGE models can score above 1
             score_threshold = st.slider("知识匹配分数阈值:", 0.0, 1.0, float(SCORE_THRESHOLD), 0.01)

         elif dialogue_mode == "搜索引擎问答":