
README_en.md

🌍 Chinese documentation

📃 LangChain-Chatchat (formerly Langchain-ChatGLM):

An LLM application that implements knowledge-base and search-engine based Q&A, built on Langchain and open-source or remote LLM APIs.



Introduction

🤖 A Q&A application based on a local knowledge base, implemented using the ideas of langchain. The goal is to build a KBQA (Knowledge-Based Q&A) solution that is friendly to Chinese scenarios and open-source models and can run both offline and online.

💡 Inspired by document.ai and the ChatGLM-6B Pull Request, we built a local knowledge-base question-answering application whose whole pipeline can be implemented with open-source models or remote LLM APIs. In the latest version of this project, FastChat is used to access Vicuna, Alpaca, LLaMA, Koala, RWKV and many other models. Relying on langchain, this project supports calling services through the API provided via FastAPI, or using the WebUI based on Streamlit.

Relying on open-source LLM and Embedding models, this project can realize full-process offline private deployment. At the same time, the project also supports calling the OpenAI GPT API and the Zhipu API, and will continue to expand access to various models and remote APIs in the future.

⛓️ The implementation principle of this project is shown in the graph below. The main process includes: loading files -> reading text -> text segmentation -> text vectorization -> question vectorization -> matching the top-k text vectors most similar to the question vector -> adding the matched text to the prompt as context together with the question -> submitting the prompt to the LLM to generate an answer.
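
The snippet below is a minimal, illustrative sketch of this flow using langchain primitives; the file path, chunk sizes, and the final LLM call are placeholders, and the actual project wires these steps up through its configured models, vector stores and services.

```python
# Minimal sketch of the described pipeline, assuming langchain 0.0.x,
# a local text file at ./docs/sample.txt (placeholder path), and the
# moka-ai/m3e-base embedding model.
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS

# 1. Load files and read text
docs = TextLoader("./docs/sample.txt", encoding="utf-8").load()

# 2. Split text into chunks
splitter = RecursiveCharacterTextSplitter(chunk_size=250, chunk_overlap=50)
chunks = splitter.split_documents(docs)

# 3. Vectorize chunks and build the FAISS index
embeddings = HuggingFaceEmbeddings(model_name="moka-ai/m3e-base")
vector_store = FAISS.from_documents(chunks, embeddings)

# 4. Vectorize the question and match the top-k most similar chunks
question = "What is this document about?"
matched = vector_store.similarity_search(question, k=3)

# 5. Add the matched text to the prompt as context, then submit to the LLM
context = "\n".join(doc.page_content for doc in matched)
prompt = f"Answer the question based on the context.\n\nContext:\n{context}\n\nQuestion: {question}"
# answer = llm(prompt)  # e.g. chatglm2-6b served through FastChat
```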

📺 Video introduction

Implementation principle diagram

Analysis of the main process from the perspective of document processing:

Implementation principle diagram 2

🚩 Training and fine-tuning are not involved in this project, but one can still improve performance by doing them.

🌐 An AutoDL image is supported; in v9 the code has been updated to v0.2.5.

🐳 Docker image

Pain Points Addressed

This project is a solution for enhancing knowledge bases with fully localized inference, specifically addressing the pain points of data security and private deployment for businesses. This open-source solution is under the Apache License and can be used commercially free of charge. We support mainstream local large language models and Embedding models available on the market, as well as open-source local vector databases. For a detailed list of supported models and databases, please refer to our Wiki.

Quick Start

Environment Setup

First, make sure your machine has Python 3.10 installed.

$ python --version
Python 3.10.12

Then, create a virtual environment and install the project's dependencies within it.


# Clone the repository
$ git clone https://github.com/chatchat-space/Langchain-Chatchat.git

# Enter the directory
$ cd Langchain-Chatchat

# Install all dependencies
$ pip install -r requirements.txt
$ pip install -r requirements_api.txt
$ pip install -r requirements_webui.txt

# The default dependencies include the basic runtime environment (FAISS vector store). To use other vector stores such as milvus/pg_vector, uncomment the corresponding dependencies in requirements.txt before installing.

Model Download

If you need to run this project locally or in an offline environment, you must first download the required models for the project. Typically, open-source LLM and Embedding models can be downloaded from HuggingFace.

Taking the default LLM model used in this project, THUDM/chatglm2-6b, and the Embedding model moka-ai/m3e-base as examples:

To download the models, you need to first install Git LFS and then run:

$ git lfs install
$ git clone https://huggingface.co/THUDM/chatglm2-6b
$ git clone https://huggingface.co/moka-ai/m3e-base
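
After downloading, point the project configuration to the local copies. The snippet below is only an illustrative sketch; the actual variable names and structure live in configs/model_config.py (generated in the next step) and may differ in your version.

```python
# configs/model_config.py (illustrative only -- check the generated file
# for the actual option names in your version of the project)
MODEL_PATH = {
    "embed_model": {
        "m3e-base": "/path/to/m3e-base",        # local clone of moka-ai/m3e-base
    },
    "llm_model": {
        "chatglm2-6b": "/path/to/chatglm2-6b",  # local clone of THUDM/chatglm2-6b
    },
}
```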

Initializing the Knowledge Base and Config File

Follow the steps below to initialize your own knowledge base and config file:

$ python copy_config_example.py
$ python init_database.py --recreate-vs

One-Click Launch

To start the project, run the following command:

$ python startup.py -a

Example of Launch Interface

  1. FastAPI docs interface

  2. Web UI page
  • Web UI dialog page:

  • Web UI knowledge base management page:
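
Besides the Web UI, the launched FastAPI service can be called directly. The sketch below is a minimal example using requests; it assumes the default API port 7861 and a /chat/chat endpoint, so check the FastAPI docs page of your deployment for the exact routes and request schema.

```python
import requests

# Assumes the API server started by startup.py listens on port 7861 (default)
# and exposes a /chat/chat endpoint; verify both in the FastAPI docs page.
resp = requests.post(
    "http://127.0.0.1:7861/chat/chat",
    json={"query": "Hello, what can you do?", "stream": False},
)
resp.raise_for_status()
print(resp.text)
```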

Note

The above instructions are provided for a quick start. If you need more features or want to customize the launch method, please refer to the Wiki.


Contact Us

Telegram

WeChat Group

QR code

WeChat Official Account

Image