parent 3da68b5ce3
commit 4f07384c66
@@ -42,7 +42,7 @@
 
 🚩 This project does not involve fine-tuning or training, but fine-tuning or training can be used to optimize its performance.
 
-🌐 The code used by version `v11` of the [AutoDL image](https://www.codewithgpu.com/i/chatchat-space/Langchain-Chatchat/Langchain-Chatchat) has been updated to version `v0.2.7` of this project.
+🌐 The code used by version `v13` of the [AutoDL image](https://www.codewithgpu.com/i/chatchat-space/Langchain-Chatchat/Langchain-Chatchat) has been updated to version `v0.2.9` of this project.
 
 🐳 The [Docker image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.6) has been updated to version ```0.2.7```.
 
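For readers following the Docker route, a minimal sketch of pulling and running the updated image; the `0.2.7` tag mirrors the text above and may lag behind the latest release, and the port/GPU flags follow the `docker run` line quoted in the next hunk header:

```shell
# Pull the image at the tag referenced above (assumed tag; check the registry for newer ones)
docker pull registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.7

# Run detached with GPU access, mapping the WebUI port as in the README's docker run line
docker run -d --gpus all -p 80:8501 registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.7
```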
@@ -67,10 +67,10 @@ docker run -d --gpus all -p 80:8501 registry.cn-beijing.aliyuncs.com/chatchat/ch
 ### 1. Environment Setup
 
-+ First, make sure your machine has Python 3.8 - 3.10 installed
++ First, make sure your machine has Python 3.8 - 3.11 installed
 ```
 $ python --version
-Python 3.10.12
+Python 3.11.7
 ```
 Next, create a virtual environment and install the project's dependencies inside it
 
 ```shell
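A hedged end-to-end sketch of the setup this hunk documents (conda and the environment name `chatchat` are assumptions; `python -m venv` works just as well):

```shell
# Verify the interpreter version (3.8 - 3.11 per the updated README)
$ python --version
Python 3.11.7

# Create and activate an isolated environment (tool and name are illustrative)
$ conda create -n chatchat python=3.11 -y
$ conda activate chatchat

# Install the project dependencies from the requirements files changed later in this diff
$ pip install -r requirements.txt
$ pip install -r requirements_webui.txt
```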
@@ -88,6 +88,7 @@ $ pip install -r requirements_webui.txt
 # The default dependencies cover the basic runtime environment (FAISS vector store). To use milvus/pg_vector or other vector stores, uncomment the corresponding dependencies in requirements.txt before installing.
 ```
+Please note that the LangChain-Chatchat `0.2.x` series targets the Langchain `0.0.x` series; if you are using a Langchain `0.1.x` release, you need to downgrade.
 
 ### 2. Model Download
 
 To run this project locally or in an offline environment, you first need to download the required models. Open-source LLM and Embedding models can usually be downloaded from [HuggingFace](https://huggingface.co/models).
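The downgrade note added above can be satisfied with the exact pins this commit writes into the requirements files, for example:

```shell
# Pin Langchain back to the 0.0.x series required by Chatchat 0.2.x (versions taken from the updated requirements)
$ pip install "langchain==0.0.354" "langchain-experimental==0.0.47"
```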
@@ -55,10 +55,10 @@ The main process analysis from the aspect of document process:
 
 🚩 The training or fine-tuning are not involved in the project, but still, one always can improve performance by do
 these.
 
-🌐 [AutoDL image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.5) is supported, and in v9 the codes are update
-to v0.2.5.
+🌐 [AutoDL image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.5) is supported, and in v13 the codes are update
+to v0.2.9.
 
-🐳 [Docker image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.5)
+🐳 [Docker image](registry.cn-beijing.aliyuncs.com/chatchat/chatchat:0.2.7)
 
 ## Pain Points Addressed
@@ -98,6 +98,7 @@ $ pip install -r requirements_webui.txt
 # The default dependencies cover the basic runtime environment (FAISS vector store). To use milvus/pg_vector or other vector stores, uncomment the corresponding dependencies in requirements.txt before installing.
 ```
+Please note that the LangChain-Chatchat `0.2.x` series is built for the Langchain `0.0.x` series; if you are using a Langchain `0.1.x` release, you need to downgrade.
 
 ### Model Download
 
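A hedged sketch of the model download step referenced above, using `huggingface_hub`; the repository ids and target directories are illustrative placeholders, not values taken from this commit:

```shell
# Download one LLM and one Embedding model for offline use (repo ids are examples only)
$ pip install -U huggingface_hub
$ huggingface-cli download THUDM/chatglm3-6b --local-dir ./models/chatglm3-6b
$ huggingface-cli download BAAI/bge-large-zh --local-dir ./models/bge-large-zh
```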
Binary file not shown. (Before: 266 KiB)
Binary file not shown. (Before: 218 KiB)
Binary file not shown. (Before: 208 KiB)
@@ -1,6 +1,5 @@
 # API requirements
 
-# On Windows system, install the cuda version manually from https://pytorch.org/
 torch~=2.1.2
 torchvision~=0.16.2
 torchaudio~=2.1.2
@@ -8,30 +7,30 @@ xformers==0.0.23.post1
 transformers==4.36.2
 sentence_transformers==2.2.2
 
-langchain==0.0.352
+langchain==0.0.354
 langchain-experimental==0.0.47
 pydantic==1.10.13
 fschat==0.2.34
-openai~=1.6.0
-fastapi>=0.105
-sse_starlette
+openai~=1.7.1
+fastapi~=0.108.0
+sse_starlette==1.8.2
 nltk>=3.8.1
 uvicorn>=0.24.0.post1
-starlette~=0.27.0
+starlette~=0.32.0
 unstructured[all-docs]==0.11.0
 python-magic-bin; sys_platform == 'win32'
 SQLAlchemy==2.0.19
 faiss-cpu~=1.7.4 # `conda install faiss-gpu -c conda-forge` if you want to accelerate with gpus
-accelerate==0.24.1
+accelerate~=0.24.1
 spacy~=3.7.2
 PyMuPDF~=1.23.8
 rapidocr_onnxruntime==1.3.8
-requests>=2.31.0
-pathlib>=1.0.1
-pytest>=7.4.3
-numexpr>=2.8.6 # max version for py38
-strsimpy>=0.2.1
-markdownify>=0.11.6
+requests~=2.31.0
+pathlib~=1.0.1
+pytest~=7.4.3
+numexpr~=2.8.6 # max version for py38
+strsimpy~=0.2.1
+markdownify~=0.11.6
 tiktoken~=0.5.2
 tqdm>=4.66.1
 websockets>=12.0
@@ -46,15 +45,14 @@ llama-index
 # optional document loaders
 
 # rapidocr_paddle[gpu]>=1.3.0.post5 # gpu accelleration for ocr of pdf and image files
-jq>=1.6.0 # for .json and .jsonl files. suggest `conda install jq` on windows
-# html2text # for .enex files
+jq==1.6.0 # for .json and .jsonl files. suggest `conda install jq` on windows
 beautifulsoup4~=4.12.2 # for .mhtml files
 pysrt~=1.1.2
 
 # Online api libs dependencies
 
-zhipuai>=1.0.7, <=2.0.0 # zhipu
-dashscope>=1.13.6 # qwen
+zhipuai==1.0.7 # zhipu
+dashscope==1.13.6 # qwen
 # volcengine>=1.0.119 # fangzhou
 
 # uncomment libs if you want to use corresponding vector store
@@ -64,14 +62,14 @@ dashscope>=1.13.6 # qwen
 
 # Agent and Search Tools
 
-arxiv>=2.0.0
-youtube-search>=2.1.2
-duckduckgo-search>=3.9.9
-metaphor-python>=0.1.23
+arxiv~=2.1.0
+youtube-search~=2.1.2
+duckduckgo-search~=3.9.9
+metaphor-python~=0.1.23
 
 # WebUI requirements
 
-streamlit~=1.29.0 # do remember to add streamlit to environment variables if you use windows
+streamlit~=1.29.0
 streamlit-option-menu>=0.3.6
 streamlit-chatbox==1.1.11
 streamlit-modal>=0.1.0
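Because this file tightens most specifiers from `>=` to `~=`/`==`, an environment built against the old file may need to be re-resolved; a minimal sketch, run inside the project's virtual environment:

```shell
# Re-install against the tightened pins, then check for leftover version conflicts
$ pip install --upgrade -r requirements.txt
$ pip check
```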
@@ -1,6 +1,3 @@
-# API requirements
-
-# On Windows system, install the cuda version manually from https://pytorch.org/
 torch~=2.1.2
 torchvision~=0.16.2
 torchaudio~=2.1.2
@@ -8,52 +5,52 @@ xformers==0.0.23.post1
 transformers==4.36.2
 sentence_transformers==2.2.2
 
-langchain==0.0.352
+langchain==0.0.354
 langchain-experimental==0.0.47
 pydantic==1.10.13
 fschat==0.2.34
-openai~=1.6.0
-fastapi>=0.105
-sse_starlette
+openai~=1.7.1
+fastapi~=0.108.0
+sse_starlette==1.8.2
 nltk>=3.8.1
 uvicorn>=0.24.0.post1
-starlette~=0.27.0
+starlette~=0.32.0
 unstructured[all-docs]==0.11.0
 python-magic-bin; sys_platform == 'win32'
 SQLAlchemy==2.0.19
 faiss-cpu~=1.7.4 # `conda install faiss-gpu -c conda-forge` if you want to accelerate with gpus
-accelerate==0.24.1
+accelerate~=0.24.1
 spacy~=3.7.2
 PyMuPDF~=1.23.8
-rapidocr_onnxruntime~=1.3.8
-requests>=2.31.0
-pathlib>=1.0.1
-pytest>=7.4.3
-numexpr>=2.8.6 # max version for py38
-strsimpy>=0.2.1
-markdownify>=0.11.6
+rapidocr_onnxruntime==1.3.8
+requests~=2.31.0
+pathlib~=1.0.1
+pytest~=7.4.3
+numexpr~=2.8.6 # max version for py38
+strsimpy~=0.2.1
+markdownify~=0.11.6
 tiktoken~=0.5.2
 tqdm>=4.66.1
 websockets>=12.0
-numpy~=1.26.2
-pandas~=2.1.4
+numpy~=1.24.4
+pandas~=2.0.3
 einops>=0.7.0
 transformers_stream_generator==0.0.4
 vllm==0.2.6; sys_platform == "linux"
-httpx[brotli,http2,socks]~=0.25.2
+httpx[brotli,http2,socks]==0.25.2
+llama-index
 
 # optional document loaders
 
-rapidocr_paddle[gpu]>=1.3.0.post5 # gpu accelleration for ocr of pdf and image files
-jq>=1.6.0 # for .json and .jsonl files. suggest `conda install jq` on windows
-# html2text # for .enex files
+# rapidocr_paddle[gpu]>=1.3.0.post5 # gpu accelleration for ocr of pdf and image files
+jq==1.6.0 # for .json and .jsonl files. suggest `conda install jq` on windows
 beautifulsoup4~=4.12.2 # for .mhtml files
 pysrt~=1.1.2
 
 # Online api libs dependencies
 
-zhipuai>=1.0.7, <=2.0.0 # zhipu
-dashscope>=1.13.6 # qwen
+zhipuai==1.0.7 # zhipu
+dashscope==1.13.6 # qwen
 # volcengine>=1.0.119 # fangzhou
 
 # uncomment libs if you want to use corresponding vector store
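Several entries above rely on environment markers (`python-magic-bin` only on Windows, `vllm` only on Linux), so the same file installs cleanly across platforms; a small sketch of checking which branch applies on a given machine:

```shell
# pip evaluates the `; sys_platform == "..."` markers per host:
# vllm 0.2.6 is installed only on Linux, python-magic-bin only on Windows.
$ python -c "import sys; print(sys.platform)"   # e.g. prints "linux" on a Linux host
```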
@@ -63,7 +60,7 @@ dashscope>=1.13.6 # qwen
 
 # Agent and Search Tools
 
-arxiv>=2.0.0
-youtube-search>=2.1.2
-duckduckgo-search>=3.9.9
-metaphor-python>=0.1.23
+arxiv~=2.1.0
+youtube-search~=2.1.2
+duckduckgo-search~=3.9.9
+metaphor-python~=0.1.23
@@ -1,60 +1,44 @@
 # API requirements
 
-# On Windows system, install the cuda version manually from https://pytorch.org/
-# torch~=2.1.2
-# torchvision~=0.16.2
-# torchaudio~=2.1.2
-# xformers==0.0.23.post1
-# transformers==4.36.2
-# sentence_transformers==2.2.2
+langchain==0.0.354
 
-langchain==0.0.352
 langchain-experimental==0.0.47
 pydantic==1.10.13
 fschat==0.2.34
-openai~=1.6.0
-fastapi>=0.105
-sse_starlette
+openai~=1.7.1
+fastapi~=0.108.0
+sse_starlette==1.8.2
 nltk>=3.8.1
 uvicorn>=0.24.0.post1
-starlette~=0.27.0
-unstructured[docx,csv]==0.11.0 # add pdf if need
+starlette~=0.32.0
+unstructured[all-docs]==0.11.0
 python-magic-bin; sys_platform == 'win32'
 SQLAlchemy==2.0.19
-faiss-cpu~=1.7.4 # `conda install faiss-gpu -c conda-forge` if you want to accelerate with gpus
-# accelerate==0.24.1
-# spacy~=3.7.2
-# PyMuPDF~=1.23.8
-# rapidocr_onnxruntime~=1.3.8
-requests>=2.31.0
-pathlib>=1.0.1
-pytest>=7.4.3
-numexpr>=2.8.6 # max version for py38
-strsimpy>=0.2.1
-markdownify>=0.11.6
-# tiktoken~=0.5.2
+faiss-cpu~=1.7.4
+requests~=2.31.0
+pathlib~=1.0.1
+pytest~=7.4.3
+numexpr~=2.8.6 # max version for py38
+strsimpy~=0.2.1
+markdownify~=0.11.6
+tiktoken~=0.5.2
 tqdm>=4.66.1
 websockets>=12.0
-numpy~=1.26.2
-pandas~=2.1.4
-# einops>=0.7.0
-# transformers_stream_generator==0.0.4
-# vllm==0.2.6; sys_platform == "linux"
-httpx[brotli,http2,socks]~=0.25.2
+numpy~=1.24.4
+pandas~=2.0.3
+einops>=0.7.0
+transformers_stream_generator==0.0.4
+vllm==0.2.6; sys_platform == "linux"
+httpx[brotli,http2,socks]==0.25.2
+requests
+pathlib
+pytest
 
-# optional document loaders
 
-rapidocr_paddle[gpu]>=1.3.0.post5 # gpu accelleration for ocr of pdf and image files
-jq>=1.6.0 # for .json and .jsonl files. suggest `conda install jq` on windows
-# html2text # for .enex files
-beautifulsoup4~=4.12.2 # for .mhtml files
-pysrt~=1.1.2
 
 # Online api libs dependencies
 
-zhipuai>=1.0.7, <=2.0.0 # zhipu
-dashscope>=1.13.6 # qwen
-# volcengine>=1.0.119 # fangzhou
+zhipuai==1.0.7
+dashscope==1.13.6
+# volcengine>=1.0.119
 
 # uncomment libs if you want to use corresponding vector store
 # pymilvus>=2.3.4
@@ -63,16 +47,17 @@ dashscope>=1.13.6 # qwen
 
 # Agent and Search Tools
 
-arxiv>=2.0.0
-youtube-search>=2.1.2
-duckduckgo-search>=3.9.9
-metaphor-python>=0.1.23
+arxiv~=2.1.0
+youtube-search~=2.1.2
+duckduckgo-search~=3.9.9
+metaphor-python~=0.1.23
 
 # WebUI requirements
 
-streamlit~=1.29.0 # do remember to add streamlit to environment variables if you use windows
+streamlit>=1.29.0
 streamlit-option-menu>=0.3.6
-streamlit-chatbox==1.1.11
+streamlit-antd-components>=0.3.0
+streamlit-chatbox>=1.1.11
 streamlit-modal>=0.1.0
 streamlit-aggrid>=0.3.4.post3
 httpx[brotli,http2,socks]>=0.25.2
@@ -1,8 +1,9 @@
 # WebUI requirements
 
-streamlit~=1.29.0 # do remember to add streamlit to environment variables if you use windows
+streamlit>=1.29.0
 streamlit-option-menu>=0.3.6
-streamlit-chatbox==1.1.11
+streamlit-antd-components>=0.3.0
+streamlit-chatbox>=1.1.11
 streamlit-modal>=0.1.0
 streamlit-aggrid>=0.3.4.post3
 httpx[brotli,http2,socks]>=0.25.2
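As a closing sketch for the WebUI requirement changes above (assuming the front-end dependencies are installed on their own), the newly added `streamlit-antd-components` and the relaxed `streamlit-chatbox` pin can be verified after installation:

```shell
$ pip install -r requirements_webui.txt
$ pip show streamlit streamlit-antd-components streamlit-chatbox   # confirm the WebUI packages and their versions
```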