Commit Graph

17 Commits

Author SHA1 Message Date
Zhi-guo Huang 71b528a2d1
1. update readme; 2. fix the multi-GPU startup issue; 3. update the LoRA loading instructions (#1079)
* fix chat and knowledge_base_chat

* update multi-GPU deployment

* update readme

* update api and webui:
1. add download_doc to api
2. return local path or http url in knowledge_base_chat depending on
   no_remote_api
3. change assistant avatar in webui

* fix the multi-GPU startup issue

* fix chat and knowledge_base_chat

* update the LoRA loading instructions in the readme

* update readme

* update readme

---------

Co-authored-by: liunux4odoo <liunu@qq.com>
2023-08-14 17:35:51 +08:00
imClumsyPanda 8a4d9168fa update import pkgs and format 2023-08-10 21:26:05 +08:00
liunux4odoo ba3335efb8 update llm_api: move fastchat logs to LOG_PATH 2023-08-09 22:43:45 +08:00
liunux4odoo f7d465b7d4 bug fix 2023-08-03 12:11:18 +08:00
hzg0601 18a94fcf45 Merge branch 'dev_fastchat' of github.com:chatchat-space/langchain-ChatGLM into dev_fastchat 2023-08-01 18:02:52 +08:00
hzg0601 15e67a4d3e 1. add all fastchat command-line arguments to config; 2. add shell scripts to start and stop fastchat; 3. add llm_api_sh.py, a python script that starts all fastchat services from the command line; 4. change the default config log format 2023-08-01 17:59:20 +08:00
imClumsyPanda c8a75ab11f update llm_api.py and webui.py 2023-08-01 14:33:18 +08:00
liunux4odoo 9e2b411b01 cuda error with multiprocessing, change model_worker to main process 2023-07-31 11:18:57 +08:00
hzg0601 47dfb6cd8b update llm_api 2023-07-31 11:00:33 +08:00
liunux4odoo 463659f0ba upgrade llm_api to fschat==0.2.20 with support for Baichuan models 2023-07-30 23:16:47 +08:00
imClumsyPanda 05ccc0346e update test code 2023-07-30 00:48:07 +08:00
imClumsyPanda 41444fd4b5 update requirements.txt and llm_api.py 2023-07-30 00:24:34 +08:00
liunux4odoo 1a7271e966 fix: model_worker needs global variable: args 2023-07-29 23:22:25 +08:00
liunux4odoo c880412300 use multiprocessing to run fastchat server 2023-07-29 23:01:24 +08:00
hzg0601 97ee4686a1 update meeting notes 2023-07-28 16:16:59 +08:00
hzg0601 154cad1b45 meeting notes 2023-07-28 16:12:57 +08:00
imClumsyPanda dcf49a59ef v0.2.0 first commit 2023-07-27 23:22:07 +08:00
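
Commit 71b528a2d1 at the top of this list mentions a download_doc API endpoint and a knowledge_base_chat response that returns either a local path or an http url depending on no_remote_api. The following is a minimal sketch of that idea only; the endpoint path, query parameters, port, and directory layout are illustrative assumptions, not the project's actual code.

```python
# Illustrative sketch: cite a knowledge-base source document either by local
# path or by a download URL, keyed off no_remote_api. All paths, ports and
# parameter names below are assumptions, not taken from the repository.
from urllib.parse import quote


def doc_reference(kb_name: str, doc_name: str, no_remote_api: bool,
                  api_base: str = "http://127.0.0.1:7861") -> str:
    """Return a citation string for a knowledge-base document."""
    if no_remote_api:
        # Local mode: webui and api run together, so a filesystem path suffices.
        return f"knowledge_base/{kb_name}/content/{doc_name}"
    # Remote mode: hand back a URL served by the (assumed) download_doc endpoint.
    return (f"{api_base}/knowledge_base/download_doc"
            f"?kb_name={quote(kb_name)}&doc_name={quote(doc_name)}")


if __name__ == "__main__":
    print(doc_reference("samples", "test.txt", no_remote_api=True))
    print(doc_reference("samples", "test.txt", no_remote_api=False))
```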
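
Commits c880412300 ("use multiprocessing to run fastchat server") and 9e2b411b01 ("cuda error with multiprocessing, change model_worker to main process") revolve around launching the three FastChat services from a single script. Below is a minimal sketch of that idea, not the project's llm_api.py: it starts each service through its CLI module in a child process, with illustrative ports and model path. Keeping CUDA confined to the model_worker child also avoids the forked-process CUDA error the second commit works around.

```python
# Minimal sketch (not the project's llm_api.py): start FastChat's controller,
# model_worker and openai_api_server from one script using multiprocessing.
# Ports, addresses and the model path are illustrative assumptions.
import subprocess
import sys
import time
from multiprocessing import Process


def run_module(module: str, *args: str) -> None:
    # Each child runs one FastChat service via its command-line entry point,
    # so CUDA is only ever initialised inside the model_worker child.
    subprocess.run([sys.executable, "-m", module, *args], check=True)


def start(module: str, *args: str) -> Process:
    p = Process(target=run_module, args=(module, *args), daemon=True)
    p.start()
    return p


if __name__ == "__main__":
    controller = start("fastchat.serve.controller",
                       "--host", "127.0.0.1", "--port", "20001")
    time.sleep(3)  # let the controller come up before the worker registers
    worker = start("fastchat.serve.model_worker",
                   "--model-path", "THUDM/chatglm2-6b",   # illustrative model
                   "--controller-address", "http://127.0.0.1:20001",
                   "--worker-address", "http://127.0.0.1:20002",
                   "--host", "127.0.0.1", "--port", "20002")
    time.sleep(3)
    api = start("fastchat.serve.openai_api_server",
                "--controller-address", "http://127.0.0.1:20001",
                "--host", "127.0.0.1", "--port", "8888")
    for p in (controller, worker, api):
        p.join()
```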