Compiled the questions from past issues to help newcomers get started (#669)

* Create ceshi

* Delete ceshi

* Create ceshi

* Compiled langchain issues

Next step: categorize them

* Delete ceshi
fengyunzaidushi 2023-06-19 19:33:11 +08:00 committed by GitHub
parent 017b34647e
commit d7d235463e
6 changed files with 992 additions and 0 deletions


@@ -0,0 +1,173 @@
,title,file,url,detail,id
0,加油~以及一些建议,2023-03-31.0002,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/2,加油,我认为你的方向是对的。,0
1,当前的运行环境是什么windows还是Linux,2023-04-01.0003,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/3,当前的运行环境是什么windows还是Linuxpython是什么版本,1
2,请问这是在CLM基础上运行吗,2023-04-01.0004,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/4,请问是不是需要本地安装好clm并正常运行的情况下再按文中的步骤执行才能运行起来,2
3,[复现问题] 构造 prompt 时从知识库中提取的文字乱码,2023-04-01.0005,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/5,hi我在尝试复现 README 中的效果,也使用了 ChatGLM-6B 的 README 作为输入文本,但发现从知识库中提取的文字是乱码,导致构造的 prompt 不可用。想了解如何解决这个问题。,3
4,后面能否加入上下文对话功能?,2023-04-02.0006,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/6,目前的get_wiki_agent_answer函数中已经实现了历史消息传递的功能后面我再确认一下是否有langchain中model调用过程中是否传递了chat_history。,4
5,请问纯cpu可以吗,2023-04-03.0007,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/7,很酷的实现极大地开拓了我的眼界很顺利的在gpu机器上运行了,5
6,运行报错AttributeError: 'NoneType' object has no attribute 'message_types_by_name',2023-04-03.0008,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/8,报错:,6
7,运行环境GPU需要多大的,2023-04-03.0009,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/9,如果按照THUDM/ChatGLM-6B的说法使用的GPU大小应该在13GB左右但运行脚本后占用了24GB还不够。,7
8,请问本地知识的格式是什么?,2023-04-03.0010,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/10,已测试格式包括docx、md文件中的文本信息具体格式可以参考 [langchain文档](https://python.langchain.com/en/latest/modules/indexes/document_loaders/examples/unstructured_file.html?highlight=pdf#),8
9,24G的显存还是爆掉了是否支持双卡运行,2023-04-03.0011,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/11,RuntimeError: CUDA out of memory. Tried to allocate 96.00 MiB (GPU 0; 23.70 GiB total capacity; 22.18 GiB already allocated; 12.75 MiB free; 22.18 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF,9
10,你怎么知道embeddings方式和模型训练时候的方式是一样的?,2023-04-03.0012,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/12,embedding和LLM的方式不用一致embedding能够解决语义检索的需求就行。这个项目里用到embedding是在对本地知识建立索引和对问句转换成向量的过程。,10
11,是否能提供本地知识文件的格式?,2023-04-04.0013,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/13,是否能提供本地知识文件的格式?,11
12,是否可以像清华原版跑在8G一以下的卡,2023-04-04.0016,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/16,是否可以像清华原版跑在8G一以下的卡我的8G卡爆显存了🤣🤣🤣,12
13,请教一下langchain协调使用向量库和chatGLM工作的,2023-04-05.0018,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/18,代码里面这段是创建问答模型的会接入ChatGLM和本地语料的向量库langchain回答的时候是怎么个优先顺序先搜向量库没有再找chatglm么 还是什么机制?,13
14,在mac m2max上抛出了ValueError: 150001 is not in list这个异常,2023-04-05.0019,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/19,我把chatglm_llm.py加载模型的代码改成如下,14
15,程序运行后一直卡住,2023-04-05.0020,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/20,感谢作者的付出,不过本人在运行时出现了问题,请大家帮助。,15
16,问一下chat_history的逻辑,2023-04-06.0022,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/22,感谢开源。,16
17,为什么每次运行都会loading checkpoint,2023-04-06.0023,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/23,我把这个embeding模型下载到本地后无法正常启动。,17
18,本地知识文件能否上传一些示例?,2023-04-06.0025,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/25,如题,怎么构造知识文件,效果更好?能否提供一个样例,18
19,What version of you are using?,2023-04-06.0026,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/26,"Hi Panda, I saw the `pip install -r requirements` command in README, and want to confirm you are using python2 or python3? because my pip and pip3 version are all is 22.3.",19
20,有兴趣交流本项目应用的朋友可以加一下微信群,2023-04-07.0027,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/27,![IMG_1630](https://user-images.githubusercontent.com/5668498/230533162-8b9bfcdd-249c-4efe-b066-4f9ba2ce9f23.jpeg),20
21,本地知识越多,回答时检索的时间是否会越长,2023-04-07.0029,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/29,是的 因为需要进行向量匹配检索,21
22,爲啥最後還是報錯 哭。。,2023-04-07.0030,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/30,Failed to import transformers.models.t5.configuration_t5 because of the following error (look up to see,22
23,对话到第二次的时候就报错UnicodeDecodeError: 'utf-8' codec can't decode,2023-04-07.0031,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/31,对话第一次是没问题的,模型返回输出后又给到请输入你的问题,我再输入问题就报错,23
24,用的in4的量化版本推理的时候显示需要申请10Gb的显存,2023-04-07.0033,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/33,"File ""/root/.cache/huggingface/modules/transformers_modules/chatglm-6b-int4-qe/modeling_chatglm.py"", line 581, in forward",24
25,使用colab运行python3.9,提示包导入有问题,2023-04-07.0034,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/34,"from ._util import is_directory, is_path",25
26,运行失败Loading checkpoint未达到100%被kill了请问下是什么原因,2023-04-07.0035,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/35,日志如下:,26
27,弄了个交流群,自己弄好多细节不会,大家技术讨论 加connection-image 我来拉你,2023-04-08.0036,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/36,自己搞好多不清楚的,一起来弄吧。。准备搞个部署问题的解决文档出来,27
28,Error using the new version with langchain,2023-04-09.0043,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/43,Error with the new changes:,28
29,程序报错torch.cuda.OutOfMemoryError如何解决,2023-04-10.0044,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/44,报错详细信息如下:,29
30,qa的训练数据格式是如何设置的,2023-04-10.0045,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/45,本项目不是使用微调的方式,所以并不涉及到训练过程。,30
31,The FileType.UNK file type is not supported in partition. 解决办法,2023-04-10.0046,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/46,ValueError: Invalid file /home/yawu/Documents/langchain-ChatGLM-master/data. The FileType.UNK file type is not supported in partition.,31
32,如何读取多个txt文档,2023-04-10.0047,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/47,如题请教一下如何读取多个txt文档示例代码中只给了读一个文档的案例这个input我换成string之后也只能指定一个文档无法用通配符指定多个文档也无法传入多个文件路径的列表。,32
33,nltk package unable to either download or load local nltk_data folder,2023-04-10.0049,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/49,I'm running this project on an offline Windows Server environment so I download the Punkt and averaged_perceptron_tagger tokenizer in this directory:,33
34,requirements.txt中需要指定langchain版本,2023-04-11.0055,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/55,langchain版本0.116下无法引入RetrievalQA需要指定更高版本0.136版本下无问题),34
35,Demo演示无法给出输出内容,2023-04-12.0059,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/59,你好测试了项目自带新闻稿示例和自行上传的一个文本可以加载进去但是无法给出答案请问属于什么情况如何解决谢谢。PS: 1、今天早上刚下载全部代码2、硬件服务器满足要求3、按操作说明正常操作。,35
36,群人数过多无法进群,求帮忙拉进群,2023-04-12.0061,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/61,您好您的群人数超过了200人目前无法通过二维码加群请问您方便加我微信拉我进群吗万分感谢,36
37,群人数已满,求大佬拉入群,2023-04-12.0062,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/62,已在README中更新拉群二维码,37
38,requirements中langchain版本错误,2023-04-12.0065,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/65,langchain版本应该是0.0.12而不是0.0.120,38
39,Linux : Searchd in,2023-04-13.0068,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/68,import nltk,39
40,No sentence-transformers model found,2023-04-13.0069,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/69,加载不了这个模型,错误原因是找不到这个模型,但是路径是配置好了的,40
41,Error loading punkt: <urlopen error [Errno 111] Connection,2023-04-13.0070,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/70,运行knowledge_based_chatglm.py出现nltk报错具体情况如下,41
42,[不懂就问] ptuning数据集格式,2023-04-13.0072,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/72,大家好请教 微调数据集的格式有什么玄机吗?我看 ChatGLM-6B/ptuning/readme.md的demo数据集ADGEN里content为啥都写成 类型#裙*风格#简约 这种格式的?这里面有啥玄机的? 特此请教,42
43,Embedding model请教,2023-04-13.0074,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/74,您好我看到项目里的embedding模型用的是GanymedeNil/text2vec-large-chinese请问这个项目里的embedding模型可以直接用ChatGLM嘛,43
44,Macbook M1 运行 webui.py 时报错请问是否可支持M系列芯片,2023-04-13.0080,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/80,```,44
45,new feature: 添加对P-tunningv2微调后的模型支持,2023-04-14.0099,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/99,能否添加新的功能,对使用[P-tunningv2](https://github.com/THUDM/ChatGLM-6B/tree/main/ptuning)微调chatglm后的模型提供加载支持,45
46,txt文件加载成功但读取报错,2023-04-15.0106,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/106,最新版的代码。比较诡异的是我的电脑是没有D盘的报错信息里怎么有个D盘出来了...,46
47,模型加载成功?文件无法导入。,2023-04-15.0107,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/107,所有模型均在本地。,47
48,请问用的什么操作系统呢?,2023-04-16.0110,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/110,ubuntu、centos还是windows,48
49,报错ModuleNotFoundError: No module named 'configs.model_config',2023-04-17.0112,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/112,更新代码后运行webui.py报错ModuleNotFoundError: No module named 'configs.model_config'。未查得解决方法。,49
50,问特定问题会出现爆显存,2023-04-17.0116,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/116,正常提问没问题。,50
51,loading进不去,2023-04-18.0127,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/127,在linux系统上python webui.py之后打开网页一直在loading是不是跟我没装detectron2有关呢,51
52,本地知识内容数量限制?,2023-04-18.0129,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/129,本地知识文件类型是txt超过5条以上的数据提问的时候就爆显存了。,52
53,我本来也计划做一个类似的产品,看来不用从头开始做了,2023-04-18.0130,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/130,文本切割,还有优化空间吗?微信群已经加不进去了。,53
54,load model failed. 加载模型失败,2023-04-18.0132,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/132,```,54
55,如何在webui里回答时同时返回引用的本地数据内容,2023-04-18.0133,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/133,如题,55
56,交流群满200人加不了了能不能给个负责人的联系方式拉我进群,2023-04-20.0143,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/143,同求,56
57,No sentence-transformers model found with name /text2vec/‘,但是再路径下面确实有模型文件,2023-04-20.0145,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/145,另外The dtype of attention mask (torch.int64) is not bool,57
58,请问加载模型的路径在哪里修改默认好像前面会带上transformers_modules.,2023-04-20.0148,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/148,"<img width=""1181"" alt=""1681977897052"" src=""https://user-images.githubusercontent.com/30926001/233301106-3846680a-d842-41d2-874e-5b6514d732c4.png"">",58
59,为啥放到方法调用会出错,这个怎么处理?,2023-04-20.0150,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/150,```python,59
60,No sentence-transformers model found with name C:\Users\Administrator/.cache\torch\sentence_transformers\GanymedeNil_text2vec-large-chinese. Creating a new one with MEAN pooling.,2023-04-21.0154,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/154,卡在这块很久是正常现象吗,60
61,微信群需要邀请才能加入,2023-04-21.0155,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/155,RT给个个人联系方式白,61
62,No sentence-transformers model found with name GanymedeNil/text2vec-large-chinese. Creating a new one with MEAN pooling,2023-04-21.0156,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/156,ls GanymedeNil/text2vec-large-chinese,62
63,embedding会加载两次,2023-04-23.0159,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/159,你好,为什么要这样设置呢,这样会加载两次呀。,63
64,扫二维码加的那个群,群成员满了进不去了,2023-04-23.0160,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/160,如题,64
65,执行python3 cli_demo.py 报错AttributeError: 'NoneType' object has no attribute 'chat',2023-04-24.0163,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/163,"刚开始怀疑是内存不足问题换成int4,int4-qe也不行有人知道是什么原因吗",65
66,匹配得分,2023-04-24.0167,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/167,在示例cli_demo.py中返回的匹配文本没有对应的score可以加上这个feature吗,66
67,大佬有计划往web_ui.py加入打字机功能吗,2023-04-25.0170,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/170,目前在载入了知识库后单张V100 32G在回答垂直领域的问题时也需要20S以上没有打字机逐字输出的使用体验还是比较煎熬的....,67
68,Is it possible to use a verctorDB for the embedings?,2023-04-25.0171,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/171,"when I play, I have to load the local data again and again when to start. I wonder if it is possible to use",68
69,请问通过lora训练官方模型得到的微调模型文件该如何加载,2023-04-25.0173,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/173,通过lora训练的方式得到以下文件:,69
70,from langchain.chains import RetrievalQA的代码在哪里,2023-04-25.0174,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/174,local_doc_qa.py,70
71,哪里有knowledge_based_chatglm.py文件怎么找不到了是被替换成cli_demo.py文件了吗,2023-04-26.0175,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/175,哪里有knowledge_based_chatglm.py文件怎么找不到了是被替换成cli_demo.py文件了吗,71
72,AttributeError: 'Chatbot' object has no attribute 'value',2023-04-26.0177,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/177,Traceback (most recent call last):,72
73,控制台调api.py报警告,2023-04-26.0178,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/178,"you must pass the application as an import string to enable ""reload"" or ""workers""",73
74,如何加入群聊,2023-04-27.0183,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/183,微信群超过200人了需要邀请如何加入呢,74
75,如何将Chatglm和本地知识相结合,2023-04-27.0185,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/185,您好我想请教一下怎么才能让知识库匹配到的文本和chatglm生成的相结合而不是说如果没搜索到就说根据已知信息无法回答该问题谢谢,75
76,一点建议,2023-04-27.0189,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/189,1.weiui的get_vector_store方法里面添加一个判断以兼容gradio版本导致的上传异常,76
77,windows环境下按照教程配置好conda环境git完项目修改完模型路径相关内容后运行demo报错缺少,2023-04-28.0194,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/194,报错代码如下:,77
78,ValueError: too many values to unpack (expected 2),2023-04-28.0198,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/198,"When i tried to use the non-streaming, `ValueError: too many values to unpack (expected 2)` error came out.",78
79,加载doc后覆盖原本知识,2023-04-28.0201,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/201,加载较大量级的私有知识库后,原本的知识会被覆盖,79
80,自定义知识库回答效果很差,2023-04-28.0203,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/203,"请问加了自定义知识库知识库,回答效果很差,是因为数据量太小的原因么",80
81,python310下安装pycocotools失败提示低版本cython实际已安装高版本,2023-04-29.0208,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/208,RT纯离线环境安装依赖安装的十分艰难最后碰到pycocotools始终无法安装上求教方法,81
82,[FEATURE] 支持 RWKV 模型(目前已有 pip package & rwkv.cpp 等等),2023-05-01.0216,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/216,您好,我是 RWKV 的作者介绍见https://zhuanlan.zhihu.com/p/626083366,82
83,[BUG] 为啥主机/服务器不联网不能正常启动服务?,2023-05-02.0220,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/220,**问题描述 / Problem Description**,83
84,[BUG] 简洁阐述问题 / Concise description of the issue,2023-05-03.0222,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/222,**local variable 'torch' referenced before assignment**,84
85,不支持txt文件的中文输入,2023-05-04.0235,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/235,"vs_path, _ = local_doc_qa.init_knowledge_vector_store(filepath)",85
86,文件均未成功加载,请检查依赖包或替换为其他文件再次上传。 文件未成功加载,请重新上传文件,2023-05-05.0237,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/237,请大佬帮忙解决,谢谢!,86
87,[BUG] 使用多卡时chatglm模型加载两次,2023-05-05.0241,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/241,chatglm_llm.py文件下第129行先加载了一次chatglm模型第143行又加载了一次,87
88,[BUG] similarity_search_with_score_by_vector函数返回多个doc时的score结果错误,2023-05-06.0252,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/252,**问题描述 / Problem Description**,88
89,可以再建一个交流群吗,这个群满了进不去。,2023-05-06.0255,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/255,上午应该已经在readme里更新过了如果不能添加可能是网页缓存问题可以试试看直接扫描img/qr_code_12.jpg,89
90,请问这是什么错误哇KeyError: 'serialized_input',2023-05-06.0257,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/257,运行“python webui.py” 后这是什么错误?怎么解决啊?,90
91,修改哪里的代码可以再cpu上跑,2023-05-06.0258,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/258,**问题描述 / Problem Description**,91
92,ModuleNotFoundError: No module named 'modelscope',2023-05-07.0266,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/266,安装这个,92
93,加载lora微调模型时lora参数加载成功但显示模型未成功加载,2023-05-08.0270,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/270,什么原因呀?,93
94,[BUG] 运行webui.py报错name 'EMBEDDING_DEVICE' is not defined,2023-05-08.0274,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/274,解决了我修改model_config时候把这个变量改错了,94
95,基于ptuning训练完成新老模型都进行了加载但是只有新的,2023-05-08.0280,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/280,licitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.,95
96,[BUG] 使用chatyuan模型时对话Errorhas no attribute 'stream_chat',2023-05-08.0282,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/282,**问题描述 / Problem Description**,96
97,chaglm调用过程中 _call提示有一个 stop,2023-05-09.0286,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/286,**功能描述 / Feature Description**,97
98,Logger._log() got an unexpected keyword argument 'end',2023-05-10.0295,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/295,使用cli_demo的时候加载一个普通txt文件输入问题后报错“TypeError: Logger._log() got an unexpected keyword argument 'end'”,98
99,[BUG] 请问可以解释下这个FAISS.similarity_search_with_score_by_vector = similarity_search_with_score_by_vector的目的吗,2023-05-10.0296,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/296,我不太明白这个库自己写的similarity_search_with_score_by_vector方法做的事情因为langchain原版的similarity_search_with_score_by_vector只是search faiss之后把返回的topk句子组合起来。我觉得原版理解起来没什么问题但是这个库里自己写的我就没太看明白多做了什么其他的事情因为没有注释。,99
100,[BUG] Windows下上传中文文件名文件faiss无法生成向量数据库文件,2023-05-11.0318,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/318,**问题描述 / Problem Description**,100
101,cli_demo中的流式输出能否接着前一答案输出?,2023-05-11.0320,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/320,现有流式输出结果样式为:,101
102,内网部署时网页无法加载,能否增加离线静态资源,2023-05-12.0326,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/326,内网部署时网页无法加载,能否增加离线静态资源,102
103,我想把文件字符的编码格式改为encoding='utf-8'在哪修改呢因为会有ascii codec can't decode byte报错,2023-05-14.0360,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/360,上传中文的txt文件时报错编码格式为utf-8,103
104,Batches的进度条是在哪里设置的?能否关闭显示?,2023-05-15.0366,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/366,"使用cli_demo.py进行命令行测试时,每句回答前都有个Batches的进度条",104
105,ImportError: dlopen: cannot load any more object with static TLS or Segmentation fault,2023-05-15.0368,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/368,**问题描述 / Problem Description**,105
106,读取PDF时报错,2023-05-16.0373,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/373,在Colab上执行cli_demo.py时在路径文件夹里放了pdf文件在加载的过程中会显示错误然后无法加载PDF文件,106
107,[BUG] webui报错 InvalidURL,2023-05-16.0375,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/375,python 版本3.8.16,107
108,[FEATURE] 如果让回答不包含出处,应该怎么处理,2023-05-16.0380,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/380,**功能描述 / Feature Description**,108
109,加载PDF文件时出现 unsupported colorspace for 'png',2023-05-16.0381,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/381,**问题描述 / Problem Description**,109
110,'ascii' codec can't encode characters in position 14-44: ordinal not in range(128) 经典bug,2023-05-16.0382,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/382,添加了知识库之后进行对话,之后再新增知识库就会出现这个问题。,110
111,微信群人数超过200了扫码进不去了群主可以再创建一个新群吗,2023-05-17.0391,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/391,**功能描述 / Feature Description**,111
112,TypeError: 'ListDocsResponse' object is not subscriptable,2023-05-17.0393,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/393,应该是用remain_docs.code和remain_docs.data吧,112
113,[BUG] 加载chatglm模型报错'NoneType' object has no attribute 'message_types_by_name',2023-05-17.0398,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/398,**问题描述 / Problem Description**,113
114,[BUG] 执行 python webui.py 没有报错但是ui界面提示 Something went wrong Expecting value: line 1 column 1 (char 0,2023-05-18.0399,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/399,**环境配置**,114
115,启动后调用api接口正常过一会就不断的爆出 Since the angle classifier is not initialized,2023-05-18.0404,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/404,**问题描述 / Problem Description**,115
116,[BUG] write_check_file方法中open函数未指定编码,2023-05-18.0408,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/408,"def write_check_file(filepath, docs):",116
117,导入的PDF中存在图片有大概率出现 “unsupported colorspace for 'png'”异常,2023-05-18.0409,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/409,"pix = fitz.Pixmap(doc, img[0])",117
118,请问流程图是用什么软件画的,2023-05-18.0410,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/410,draw.io,118
119,mac 加载模型失败,2023-05-19.0417,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/417,Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.,119
120,使用GPU本地运行知识库问答提问第一个问题出现异常。,2023-05-20.0419,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/419,配置文件model_config.py为,120
121,想加入讨论群,2023-05-20.0420,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/420,OK,121
122,有没有直接调用LLM的API目前只有知识库的API,2023-05-22.0426,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/426,-------------------------------------------------------------------------------,122
123,上传文件后出现 ERROR __init__() got an unexpected keyword argument 'autodetect_encoding',2023-05-22.0428,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/428,"上传文件后出现这个问题ERROR 2023-05-22 11:46:19,568-1d: __init__() got an unexpected keyword argument 'autodetect_encoding'",123
124,想问下README中用到的流程图用什么软件画的,2023-05-22.0431,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/431,**功能描述 / Feature Description**,124
125,No matching distribution found for langchain==0.0.174,2023-05-23.0436,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/436,ERROR: Could not find a version that satisfies the requirement langchain==0.0.174 ,125
126,[FEATURE] bing是必须的么,2023-05-23.0437,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/437,从这个[脚步](https://github.com/imClumsyPanda/langchain-ChatGLM/blob/master/configs/model_config.py#L129)里面发现需要申请bing api如果不申请纯用模型推理不可吗,126
127,同一台环境下部署了5.22号更新的langchain-chatglm v0.1.13和之前的版本,回复速度明显变慢,2023-05-23.0442,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/442,新langchain-chatglm v0.1.13版本速度很慢,127
128,Error reported during startup,2023-05-23.0443,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/443,Traceback (most recent call last):,128
129,"ValueError: not enough values to unpack (expected 2, got 1)on of the issue",2023-05-24.0449,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/449,"File "".cache\huggingface\modules\transformers_modules\chatglm-6b-int4\modeling_chatglm.py"", line 1280, in chat",129
130,[BUG] API部署流式输出的函数少了个question,2023-05-24.0451,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/451,**问题描述 / Problem Description**,130
131,项目结构的简洁性保持,2023-05-24.0454,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/454,**功能描述 / Feature Description**,131
132,项目群扫码进不去了,2023-05-24.0455,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/455,项目群扫码进不去了是否可以加一下微信拉我进群谢谢微信号daniel-0527,132
133,请求拉我入群讨论海硕一枚专注于LLM等相关技术,2023-05-24.0461,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/461,**功能描述 / Feature Description**,133
134,[BUG] chatglm-6b模型报错OSError: Error no file named pytorch_model.bin found in directory /chatGLM/model/model-6b,2023-05-26.0474,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/474,**1、简述**,134
135,现在本项目交流群二维码扫描不进去了,需要群主通过,2023-05-27.0478,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/478,现在本项目交流群二维码扫描不进去了,需要群主通过,135
136,RuntimeError: Only Tensors of floating point and complex dtype can require gradients,2023-05-28.0483,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/483,刚更新了最新版本:,136
137,"RuntimeError: ""LayerNormKernelImpl"" not implemented for 'Half'",2023-05-28.0484,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/484,"已经解决了 params 只用两个参数 {'trust_remote_code': True, 'torch_dtype': torch.float16}",137
138,[BUG] 文件未成功加载,请重新上传文件,2023-05-31.0504,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/504,webui.py,138
139,[BUG] bug 17 pdf和pdf为啥还不一样呢为啥有的pdf能识别有的pdf识别不了呢,2023-05-31.0506,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/506,bug 17 pdf和pdf为啥还不一样呢为啥有的pdf能识别有的pdf识别不了呢,139
140,[FEATURE] 简洁阐述功能 / Concise description of the feature,2023-05-31.0513,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/513,**功能描述 / Feature Description**,140
141,[BUG] webui.py 加载chatglm-6b-int4 失败,2023-06-02.0524,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/524,**问题描述 / Problem Description**,141
142,[BUG] webui.py 加载chatglm-6b模型异常,2023-06-02.0525,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/525,**问题描述 / Problem Description**,142
143,增加对chatgpt的embedding和api调用的支持,2023-06-02.0531,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/531,能否支持openai的embedding api和对话的api,143
144,[FEATURE] 调整模型下载的位置,2023-06-02.0537,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/537,模型默认下载到 $HOME/.cache/huggingface/,当 C 盘空间不足时无法完成模型的下载。configs/model_config.py 中也没有调整模型位置的参数。,144
145,[BUG] langchain=0.0.174 出错,2023-06-04.0543,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/543,**问题描述 / Problem Description**,145
146,[BUG] 更新后加载本地模型路径不正确,2023-06-05.0545,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/545,**问题描述 / Problem Description**,146
147,SystemError: 8bit 模型需要 CUDA 支持,或者改用量化后模型!,2023-06-06.0550,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/550,"docker 部署后,启动docker,过会儿容器会自动退出,logs报错 SystemError: 8bit 模型需要 CUDA 支持,或者改用量化后模型! [NVIDIA Container Toolkit](https://github.com/NVIDIA/nvidia-container-toolkit) 也已经安装了",147
148,[BUG] 上传知识库超过1M报错,2023-06-06.0556,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/556,**问题描述 / Problem Description**,148
149,打开跨域访问后仍然报错,不能请求,2023-06-06.0560,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/560,报错信息:,149
150,dialogue_answering 里面的代码是不是没有用到?,没有看到调用,2023-06-07.0571,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/571,dialogue_answering 是干啥的,150
151,[BUG] 响应速度极慢应从哪里入手优化48C/128G/8卡,2023-06-07.0573,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/573,运行环境ubuntu20.04,151
152,纯CPU环境下运行cli_demo时报错提示找不到nvcuda.dll,2023-06-08.0576,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/576,本地部署环境是纯CPU之前的版本在纯CPU环境下能正常运行但上传本地知识库经常出现encode问题。今天重新git项目后运行时出现如下问题请问该如何解决。,152
153,如何加载本地的embedding模型text2vec-large-chinese模型文件,2023-06-08.0582,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/582,"因为需要离线部署所以要把模型放到本地我修改了chains/local_doc_qa.py中的HuggingFaceEmbeddings()在其中加了一个cache_folder的参数保证下载的文件在cache_folder中model_name是text2vec-large-chinese。如cache_folder='/home/xx/model/text2vec-large-chinese', model_name='text2vec-large-chinese',这样仍然需要联网下载报错,请问大佬如何解决该问题?",153
154,ChatGLM-6B 在另外服务器安装好了请问如何修改model.cofnig.py 来使用它的接口呢??,2023-06-09.0588,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/588,我本来想在这加一个api base url 但是运行web.py 发现 还是会去连huggingface 下载模型,154
155,[BUG] raise partially initialized module 'charset_normalizer' has no attribute 'md__mypyc' when call interface `upload_file`,2023-06-10.0591,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/591,**问题描述 / Problem Description**,155
156,[BUG] raise OSError: [Errno 101] Network is unreachable when call interface upload_file and upload .pdf files,2023-06-10.0592,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/592,**问题描述 / Problem Description**,156
157,如果直接用vicuna作为基座大模型需要修改的地方有哪些,2023-06-12.0596,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/596,vicuna模型有直接转换好的没有也就是llama转换之后的vicuna。,157
158,[BUG] 通过cli.py调用api时抛出AttributeError: 'NoneType' object has no attribute 'get'错误,2023-06-12.0598,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/598,通过`python cli.py start api --ip localhost --port 8001` 命令调用api时抛出,158
159,[BUG] 通过cli.py调用api时直接报错`langchain-ChatGLM: error: unrecognized arguments: start cli`,2023-06-12.0601,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/601,通过python cli.py start cli启动cli_demo时报错,159
160,[BUG] error: unrecognized arguments: --model-dir conf/models/,2023-06-12.0602,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/602,关键字参数修改了吗?有没有文档啊?大佬,160
161,[BUG] 上传文件全部失败,2023-06-12.0603,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/603,ERROR: Exception in ASGI application,161
162,[BUG] config 使用 chatyuan 无法启动,2023-06-12.0604,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/604,"""chatyuan"": {",162
163,使用fashchat api之后后台报错APIError 如图所示,2023-06-12.0606,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/606,我按照https://github.com/imClumsyPanda/langchain-ChatGLM/blob/master/docs/fastchat.md,163
164,[BUG] 启用上下文关联每次embedding搜索到的内容都会比前一次多一段,2023-06-13.0613,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/613,**问题描述 / Problem Description**,164
165,local_doc_qa.py中MyFAISS.from_documents() 这个语句看不太懂。MyFAISS类中没有这个方法其父类FAISS和VectorStore中也只有from_texts方法[BUG] 简洁阐述问题 / Concise description of the issue,2023-06-14.0619,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/619,local_doc_qa.py中MyFAISS.from_documents() 这个语句看不太懂。MyFAISS类中没有这个方法其父类FAISS和VectorStore中也只有from_texts方法,165
166,[BUG] TypeError: similarity_search_with_score_by_vector() got an unexpected keyword argument 'filter',2023-06-14.0624,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/624,**问题描述 / Problem Description**,166
167,please delete this issue,2023-06-15.0633,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/633,"sorry, incorrect submission. Please remove this issue!",167
168,[BUG] vue前端镜像构建失败,2023-06-15.0635,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/635,**问题描述 / Problem Description**,168
169,ChatGLM-6B模型能否回答英文问题,2023-06-15.0640,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/640,大佬请问一下如果本地知识文档是英文ChatGLM-6B模型能否回答英文问题不能的话有没有替代的模型推荐期待你的回复谢谢,169
170,[BUG] 简洁阐述问题 / Concise description of the issue,2023-06-16.0644,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/644,**问题描述 / Problem Description**,170
171,KeyError: 3224,2023-06-16.0645,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/645,```,171
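The CSV above indexes the project's early GitHub issues: `title` is the issue title, `file` encodes the issue date and number (e.g. `2023-03-31.0002`), `url` links to the issue, `detail` is an excerpt from the issue thread, and `id` is a running index. As a minimal sketch of how a newcomer might search this list, assuming the file is saved locally as `langchain_issues.csv` (the actual filename is not shown in this diff) and using only Python's standard `csv` module:

```python
import csv

def search_issues(path: str, keyword: str):
    """Yield (title, url) pairs whose title or detail mentions `keyword`."""
    # newline="" is the csv-module convention; the file is UTF-8 encoded
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if keyword in row["title"] or keyword in row["detail"]:
                yield row["title"], row["url"]

if __name__ == "__main__":
    # e.g. list every row tagged as a bug report
    for title, url in search_issues("langchain_issues.csv", "[BUG]"):
        print(f"{title}\n  {url}")
```

The same lookup could of course be done with pandas or plain grep; the point is only that the columns shown in the header row are enough to filter issues by keyword and jump to the linked discussion.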
172 170 [BUG] 简洁阐述问题 / Concise description of the issue 2023-06-16.0644 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/644 **问题描述 / Problem Description** 170
173 171 KeyError: 3224 2023-06-16.0645 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/645 ``` 171

View File

@ -0,0 +1,172 @@
{"title": "加油~以及一些建议", "file": "2023-03-31.0002", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/2", "detail": "加油,我认为你的方向是对的。", "id": 0}
{"title": "当前的运行环境是什么windows还是Linux", "file": "2023-04-01.0003", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/3", "detail": "当前的运行环境是什么windows还是Linuxpython是什么版本", "id": 1}
{"title": "请问这是在CLM基础上运行吗", "file": "2023-04-01.0004", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/4", "detail": "请问是不是需要本地安装好clm并正常运行的情况下再按文中的步骤执行才能运行起来", "id": 2}
{"title": "[复现问题] 构造 prompt 时从知识库中提取的文字乱码", "file": "2023-04-01.0005", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/5", "detail": "hi我在尝试复现 README 中的效果,也使用了 ChatGLM-6B 的 README 作为输入文本,但发现从知识库中提取的文字是乱码,导致构造的 prompt 不可用。想了解如何解决这个问题。", "id": 3}
{"title": "后面能否加入上下文对话功能?", "file": "2023-04-02.0006", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/6", "detail": "目前的get_wiki_agent_answer函数中已经实现了历史消息传递的功能后面我再确认一下是否有langchain中model调用过程中是否传递了chat_history。", "id": 4}
{"title": "请问纯cpu可以吗", "file": "2023-04-03.0007", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/7", "detail": "很酷的实现极大地开拓了我的眼界很顺利的在gpu机器上运行了", "id": 5}
{"title": "运行报错AttributeError: 'NoneType' object has no attribute 'message_types_by_name'", "file": "2023-04-03.0008", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/8", "detail": "报错:", "id": 6}
{"title": "运行环境GPU需要多大的", "file": "2023-04-03.0009", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/9", "detail": "如果按照THUDM/ChatGLM-6B的说法使用的GPU大小应该在13GB左右但运行脚本后占用了24GB还不够。", "id": 7}
{"title": "请问本地知识的格式是什么?", "file": "2023-04-03.0010", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/10", "detail": "已测试格式包括docx、md文件中的文本信息具体格式可以参考 [langchain文档](https://python.langchain.com/en/latest/modules/indexes/document_loaders/examples/unstructured_file.html?highlight=pdf#)", "id": 8}
{"title": "24G的显存还是爆掉了是否支持双卡运行", "file": "2023-04-03.0011", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/11", "detail": "RuntimeError: CUDA out of memory. Tried to allocate 96.00 MiB (GPU 0; 23.70 GiB total capacity; 22.18 GiB already allocated; 12.75 MiB free; 22.18 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF", "id": 9}
{"title": "你怎么知道embeddings方式和模型训练时候的方式是一样的?", "file": "2023-04-03.0012", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/12", "detail": "embedding和LLM的方式不用一致embedding能够解决语义检索的需求就行。这个项目里用到embedding是在对本地知识建立索引和对问句转换成向量的过程。", "id": 10}
{"title": "是否能提供本地知识文件的格式?", "file": "2023-04-04.0013", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/13", "detail": "是否能提供本地知识文件的格式?", "id": 11}
{"title": "是否可以像清华原版跑在8G一以下的卡", "file": "2023-04-04.0016", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/16", "detail": "是否可以像清华原版跑在8G一以下的卡我的8G卡爆显存了🤣🤣🤣", "id": 12}
{"title": "请教一下langchain协调使用向量库和chatGLM工作的", "file": "2023-04-05.0018", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/18", "detail": "代码里面这段是创建问答模型的会接入ChatGLM和本地语料的向量库langchain回答的时候是怎么个优先顺序先搜向量库没有再找chatglm么 还是什么机制?", "id": 13}
{"title": "在mac m2max上抛出了ValueError: 150001 is not in list这个异常", "file": "2023-04-05.0019", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/19", "detail": "我把chatglm_llm.py加载模型的代码改成如下", "id": 14}
{"title": "程序运行后一直卡住", "file": "2023-04-05.0020", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/20", "detail": "感谢作者的付出,不过本人在运行时出现了问题,请大家帮助。", "id": 15}
{"title": "问一下chat_history的逻辑", "file": "2023-04-06.0022", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/22", "detail": "感谢开源。", "id": 16}
{"title": "为什么每次运行都会loading checkpoint", "file": "2023-04-06.0023", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/23", "detail": "我把这个embeding模型下载到本地后无法正常启动。", "id": 17}
{"title": "本地知识文件能否上传一些示例?", "file": "2023-04-06.0025", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/25", "detail": "如题,怎么构造知识文件,效果更好?能否提供一个样例", "id": 18}
{"title": "What version of you are using?", "file": "2023-04-06.0026", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/26", "detail": "Hi Panda, I saw the `pip install -r requirements` command in README, and want to confirm you are using python2 or python3? because my pip and pip3 version are all is 22.3.", "id": 19}
{"title": "有兴趣交流本项目应用的朋友可以加一下微信群", "file": "2023-04-07.0027", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/27", "detail": "![IMG_1630](https://user-images.githubusercontent.com/5668498/230533162-8b9bfcdd-249c-4efe-b066-4f9ba2ce9f23.jpeg)", "id": 20}
{"title": "本地知识越多,回答时检索的时间是否会越长", "file": "2023-04-07.0029", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/29", "detail": "是的 因为需要进行向量匹配检索", "id": 21}
{"title": "爲啥最後還是報錯 哭。。", "file": "2023-04-07.0030", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/30", "detail": "Failed to import transformers.models.t5.configuration_t5 because of the following error (look up to see", "id": 22}
{"title": "对话到第二次的时候就报错UnicodeDecodeError: 'utf-8' codec can't decode", "file": "2023-04-07.0031", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/31", "detail": "对话第一次是没问题的,模型返回输出后又给到请输入你的问题,我再输入问题就报错", "id": 23}
{"title": "用的in4的量化版本推理的时候显示需要申请10Gb的显存", "file": "2023-04-07.0033", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/33", "detail": "File \"/root/.cache/huggingface/modules/transformers_modules/chatglm-6b-int4-qe/modeling_chatglm.py\", line 581, in forward", "id": 24}
{"title": "使用colab运行python3.9,提示包导入有问题", "file": "2023-04-07.0034", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/34", "detail": "from ._util import is_directory, is_path", "id": 25}
{"title": "运行失败Loading checkpoint未达到100%被kill了请问下是什么原因", "file": "2023-04-07.0035", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/35", "detail": "日志如下:", "id": 26}
{"title": "弄了个交流群,自己弄好多细节不会,大家技术讨论 加connection-image 我来拉你", "file": "2023-04-08.0036", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/36", "detail": "自己搞好多不清楚的,一起来弄吧。。准备搞个部署问题的解决文档出来", "id": 27}
{"title": "Error using the new version with langchain", "file": "2023-04-09.0043", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/43", "detail": "Error with the new changes:", "id": 28}
{"title": "程序报错torch.cuda.OutOfMemoryError如何解决", "file": "2023-04-10.0044", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/44", "detail": "报错详细信息如下:", "id": 29}
{"title": "qa的训练数据格式是如何设置的", "file": "2023-04-10.0045", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/45", "detail": "本项目不是使用微调的方式,所以并不涉及到训练过程。", "id": 30}
{"title": "The FileType.UNK file type is not supported in partition. 解决办法", "file": "2023-04-10.0046", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/46", "detail": "ValueError: Invalid file /home/yawu/Documents/langchain-ChatGLM-master/data. The FileType.UNK file type is not supported in partition.", "id": 31}
{"title": "如何读取多个txt文档", "file": "2023-04-10.0047", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/47", "detail": "如题请教一下如何读取多个txt文档示例代码中只给了读一个文档的案例这个input我换成string之后也只能指定一个文档无法用通配符指定多个文档也无法传入多个文件路径的列表。", "id": 32}
{"title": "nltk package unable to either download or load local nltk_data folder", "file": "2023-04-10.0049", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/49", "detail": "I'm running this project on an offline Windows Server environment so I download the Punkt and averaged_perceptron_tagger tokenizer in this directory:", "id": 33}
{"title": "requirements.txt中需要指定langchain版本", "file": "2023-04-11.0055", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/55", "detail": "langchain版本0.116下无法引入RetrievalQA需要指定更高版本0.136版本下无问题)", "id": 34}
{"title": "Demo演示无法给出输出内容", "file": "2023-04-12.0059", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/59", "detail": "你好测试了项目自带新闻稿示例和自行上传的一个文本可以加载进去但是无法给出答案请问属于什么情况如何解决谢谢。PS: 1、今天早上刚下载全部代码2、硬件服务器满足要求3、按操作说明正常操作。", "id": 35}
{"title": "群人数过多无法进群,求帮忙拉进群", "file": "2023-04-12.0061", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/61", "detail": "您好您的群人数超过了200人目前无法通过二维码加群请问您方便加我微信拉我进群吗万分感谢", "id": 36}
{"title": "群人数已满,求大佬拉入群", "file": "2023-04-12.0062", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/62", "detail": "已在README中更新拉群二维码", "id": 37}
{"title": "requirements中langchain版本错误", "file": "2023-04-12.0065", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/65", "detail": "langchain版本应该是0.0.12而不是0.0.120", "id": 38}
{"title": "Linux : Searchd in", "file": "2023-04-13.0068", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/68", "detail": "import nltk", "id": 39}
{"title": "No sentence-transformers model found", "file": "2023-04-13.0069", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/69", "detail": "加载不了这个模型,错误原因是找不到这个模型,但是路径是配置好了的", "id": 40}
{"title": "Error loading punkt: <urlopen error [Errno 111] Connection", "file": "2023-04-13.0070", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/70", "detail": "运行knowledge_based_chatglm.py出现nltk报错具体情况如下", "id": 41}
{"title": "[不懂就问] ptuning数据集格式", "file": "2023-04-13.0072", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/72", "detail": "大家好请教 微调数据集的格式有什么玄机吗?我看 ChatGLM-6B/ptuning/readme.md的demo数据集ADGEN里content为啥都写成 类型#裙*风格#简约 这种格式的?这里面有啥玄机的? 特此请教", "id": 42}
{"title": "Embedding model请教", "file": "2023-04-13.0074", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/74", "detail": "您好我看到项目里的embedding模型用的是GanymedeNil/text2vec-large-chinese请问这个项目里的embedding模型可以直接用ChatGLM嘛", "id": 43}
{"title": "Macbook M1 运行 webui.py 时报错请问是否可支持M系列芯片", "file": "2023-04-13.0080", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/80", "detail": "```", "id": 44}
{"title": "new feature: 添加对P-tunningv2微调后的模型支持", "file": "2023-04-14.0099", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/99", "detail": "能否添加新的功能,对使用[P-tunningv2](https://github.com/THUDM/ChatGLM-6B/tree/main/ptuning)微调chatglm后的模型提供加载支持", "id": 45}
{"title": "txt文件加载成功但读取报错", "file": "2023-04-15.0106", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/106", "detail": "最新版的代码。比较诡异的是我的电脑是没有D盘的报错信息里怎么有个D盘出来了...", "id": 46}
{"title": "模型加载成功?文件无法导入。", "file": "2023-04-15.0107", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/107", "detail": "所有模型均在本地。", "id": 47}
{"title": "请问用的什么操作系统呢?", "file": "2023-04-16.0110", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/110", "detail": "ubuntu、centos还是windows", "id": 48}
{"title": "报错ModuleNotFoundError: No module named 'configs.model_config'", "file": "2023-04-17.0112", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/112", "detail": "更新代码后运行webui.py报错ModuleNotFoundError: No module named 'configs.model_config'。未查得解决方法。", "id": 49}
{"title": "问特定问题会出现爆显存", "file": "2023-04-17.0116", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/116", "detail": "正常提问没问题。", "id": 50}
{"title": "loading进不去", "file": "2023-04-18.0127", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/127", "detail": "在linux系统上python webui.py之后打开网页一直在loading是不是跟我没装detectron2有关呢", "id": 51}
{"title": "本地知识内容数量限制?", "file": "2023-04-18.0129", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/129", "detail": "本地知识文件类型是txt超过5条以上的数据提问的时候就爆显存了。", "id": 52}
{"title": "我本来也计划做一个类似的产品,看来不用从头开始做了", "file": "2023-04-18.0130", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/130", "detail": "文本切割,还有优化空间吗?微信群已经加不进去了。", "id": 53}
{"title": "load model failed. 加载模型失败", "file": "2023-04-18.0132", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/132", "detail": "```", "id": 54}
{"title": "如何在webui里回答时同时返回引用的本地数据内容", "file": "2023-04-18.0133", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/133", "detail": "如题", "id": 55}
{"title": "交流群满200人加不了了能不能给个负责人的联系方式拉我进群", "file": "2023-04-20.0143", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/143", "detail": "同求", "id": 56}
{"title": "No sentence-transformers model found with name /text2vec/‘,但是再路径下面确实有模型文件", "file": "2023-04-20.0145", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/145", "detail": "另外The dtype of attention mask (torch.int64) is not bool", "id": 57}
{"title": "请问加载模型的路径在哪里修改默认好像前面会带上transformers_modules.", "file": "2023-04-20.0148", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/148", "detail": "<img width=\"1181\" alt=\"1681977897052\" src=\"https://user-images.githubusercontent.com/30926001/233301106-3846680a-d842-41d2-874e-5b6514d732c4.png\">", "id": 58}
{"title": "为啥放到方法调用会出错,这个怎么处理?", "file": "2023-04-20.0150", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/150", "detail": "```python", "id": 59}
{"title": "No sentence-transformers model found with name C:\\Users\\Administrator/.cache\\torch\\sentence_transformers\\GanymedeNil_text2vec-large-chinese. Creating a new one with MEAN pooling.", "file": "2023-04-21.0154", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/154", "detail": "卡在这块很久是正常现象吗", "id": 60}
{"title": "微信群需要邀请才能加入", "file": "2023-04-21.0155", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/155", "detail": "RT给个个人联系方式白", "id": 61}
{"title": "No sentence-transformers model found with name GanymedeNil/text2vec-large-chinese. Creating a new one with MEAN pooling", "file": "2023-04-21.0156", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/156", "detail": "ls GanymedeNil/text2vec-large-chinese", "id": 62}
{"title": "embedding会加载两次", "file": "2023-04-23.0159", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/159", "detail": "你好,为什么要这样设置呢,这样会加载两次呀。", "id": 63}
{"title": "扫二维码加的那个群,群成员满了进不去了", "file": "2023-04-23.0160", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/160", "detail": "如题", "id": 64}
{"title": "执行python3 cli_demo.py 报错AttributeError: 'NoneType' object has no attribute 'chat'", "file": "2023-04-24.0163", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/163", "detail": "刚开始怀疑是内存不足问题换成int4,int4-qe也不行有人知道是什么原因吗", "id": 65}
{"title": "匹配得分", "file": "2023-04-24.0167", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/167", "detail": "在示例cli_demo.py中返回的匹配文本没有对应的score可以加上这个feature吗", "id": 66}
{"title": "大佬有计划往web_ui.py加入打字机功能吗", "file": "2023-04-25.0170", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/170", "detail": "目前在载入了知识库后单张V100 32G在回答垂直领域的问题时也需要20S以上没有打字机逐字输出的使用体验还是比较煎熬的....", "id": 67}
{"title": "Is it possible to use a verctorDB for the embedings?", "file": "2023-04-25.0171", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/171", "detail": "when I play, I have to load the local data again and again when to start. I wonder if it is possible to use", "id": 68}
{"title": "请问通过lora训练官方模型得到的微调模型文件该如何加载", "file": "2023-04-25.0173", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/173", "detail": "通过lora训练的方式得到以下文件:", "id": 69}
{"title": "from langchain.chains import RetrievalQA的代码在哪里", "file": "2023-04-25.0174", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/174", "detail": "local_doc_qa.py", "id": 70}
{"title": "哪里有knowledge_based_chatglm.py文件怎么找不到了是被替换成cli_demo.py文件了吗", "file": "2023-04-26.0175", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/175", "detail": "哪里有knowledge_based_chatglm.py文件怎么找不到了是被替换成cli_demo.py文件了吗", "id": 71}
{"title": "AttributeError: 'Chatbot' object has no attribute 'value'", "file": "2023-04-26.0177", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/177", "detail": "Traceback (most recent call last):", "id": 72}
{"title": "控制台调api.py报警告", "file": "2023-04-26.0178", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/178", "detail": "you must pass the application as an import string to enable \"reload\" or \"workers\"", "id": 73}
{"title": "如何加入群聊", "file": "2023-04-27.0183", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/183", "detail": "微信群超过200人了需要邀请如何加入呢", "id": 74}
{"title": "如何将Chatglm和本地知识相结合", "file": "2023-04-27.0185", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/185", "detail": "您好我想请教一下怎么才能让知识库匹配到的文本和chatglm生成的相结合而不是说如果没搜索到就说根据已知信息无法回答该问题谢谢", "id": 75}
{"title": "一点建议", "file": "2023-04-27.0189", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/189", "detail": "1.weiui的get_vector_store方法里面添加一个判断以兼容gradio版本导致的上传异常", "id": 76}
{"title": "windows环境下按照教程配置好conda环境git完项目修改完模型路径相关内容后运行demo报错缺少", "file": "2023-04-28.0194", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/194", "detail": "报错代码如下:", "id": 77}
{"title": "ValueError: too many values to unpack (expected 2)", "file": "2023-04-28.0198", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/198", "detail": "When i tried to use the non-streaming, `ValueError: too many values to unpack (expected 2)` error came out.", "id": 78}
{"title": "加载doc后覆盖原本知识", "file": "2023-04-28.0201", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/201", "detail": "加载较大量级的私有知识库后,原本的知识会被覆盖", "id": 79}
{"title": "自定义知识库回答效果很差", "file": "2023-04-28.0203", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/203", "detail": "请问加了自定义知识库知识库,回答效果很差,是因为数据量太小的原因么", "id": 80}
{"title": "python310下安装pycocotools失败提示低版本cython实际已安装高版本", "file": "2023-04-29.0208", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/208", "detail": "RT纯离线环境安装依赖安装的十分艰难最后碰到pycocotools始终无法安装上求教方法", "id": 81}
{"title": "[FEATURE] 支持 RWKV 模型(目前已有 pip package & rwkv.cpp 等等)", "file": "2023-05-01.0216", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/216", "detail": "您好,我是 RWKV 的作者介绍见https://zhuanlan.zhihu.com/p/626083366", "id": 82}
{"title": "[BUG] 为啥主机/服务器不联网不能正常启动服务?", "file": "2023-05-02.0220", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/220", "detail": "**问题描述 / Problem Description**", "id": 83}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-05-03.0222", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/222", "detail": "**local variable 'torch' referenced before assignment**", "id": 84}
{"title": "不支持txt文件的中文输入", "file": "2023-05-04.0235", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/235", "detail": "vs_path, _ = local_doc_qa.init_knowledge_vector_store(filepath)", "id": 85}
{"title": "文件均未成功加载,请检查依赖包或替换为其他文件再次上传。 文件未成功加载,请重新上传文件", "file": "2023-05-05.0237", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/237", "detail": "请大佬帮忙解决,谢谢!", "id": 86}
{"title": "[BUG] 使用多卡时chatglm模型加载两次", "file": "2023-05-05.0241", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/241", "detail": "chatglm_llm.py文件下第129行先加载了一次chatglm模型第143行又加载了一次", "id": 87}
{"title": "[BUG] similarity_search_with_score_by_vector函数返回多个doc时的score结果错误", "file": "2023-05-06.0252", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/252", "detail": "**问题描述 / Problem Description**", "id": 88}
{"title": "可以再建一个交流群吗,这个群满了进不去。", "file": "2023-05-06.0255", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/255", "detail": "上午应该已经在readme里更新过了如果不能添加可能是网页缓存问题可以试试看直接扫描img/qr_code_12.jpg", "id": 89}
{"title": "请问这是什么错误哇KeyError: 'serialized_input'", "file": "2023-05-06.0257", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/257", "detail": "运行“python webui.py” 后这是什么错误?怎么解决啊?", "id": 90}
{"title": "修改哪里的代码可以再cpu上跑", "file": "2023-05-06.0258", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/258", "detail": "**问题描述 / Problem Description**", "id": 91}
{"title": "ModuleNotFoundError: No module named 'modelscope'", "file": "2023-05-07.0266", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/266", "detail": "安装这个", "id": 92}
{"title": "加载lora微调模型时lora参数加载成功但显示模型未成功加载", "file": "2023-05-08.0270", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/270", "detail": "什么原因呀?", "id": 93}
{"title": "[BUG] 运行webui.py报错name 'EMBEDDING_DEVICE' is not defined", "file": "2023-05-08.0274", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/274", "detail": "解决了我修改model_config时候把这个变量改错了", "id": 94}
{"title": "基于ptuning训练完成新老模型都进行了加载但是只有新的", "file": "2023-05-08.0280", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/280", "detail": "licitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.", "id": 95}
{"title": "[BUG] 使用chatyuan模型时对话Errorhas no attribute 'stream_chat'", "file": "2023-05-08.0282", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/282", "detail": "**问题描述 / Problem Description**", "id": 96}
{"title": "chaglm调用过程中 _call提示有一个 stop", "file": "2023-05-09.0286", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/286", "detail": "**功能描述 / Feature Description**", "id": 97}
{"title": "Logger._log() got an unexpected keyword argument 'end'", "file": "2023-05-10.0295", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/295", "detail": "使用cli_demo的时候加载一个普通txt文件输入问题后报错“TypeError: Logger._log() got an unexpected keyword argument 'end'”", "id": 98}
{"title": "[BUG] 请问可以解释下这个FAISS.similarity_search_with_score_by_vector = similarity_search_with_score_by_vector的目的吗", "file": "2023-05-10.0296", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/296", "detail": "我不太明白这个库自己写的similarity_search_with_score_by_vector方法做的事情因为langchain原版的similarity_search_with_score_by_vector只是search faiss之后把返回的topk句子组合起来。我觉得原版理解起来没什么问题但是这个库里自己写的我就没太看明白多做了什么其他的事情因为没有注释。", "id": 99}
{"title": "[BUG] Windows下上传中文文件名文件faiss无法生成向量数据库文件", "file": "2023-05-11.0318", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/318", "detail": "**问题描述 / Problem Description**", "id": 100}
{"title": "cli_demo中的流式输出能否接着前一答案输出?", "file": "2023-05-11.0320", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/320", "detail": "现有流式输出结果样式为:", "id": 101}
{"title": "内网部署时网页无法加载,能否增加离线静态资源", "file": "2023-05-12.0326", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/326", "detail": "内网部署时网页无法加载,能否增加离线静态资源", "id": 102}
{"title": "我想把文件字符的编码格式改为encoding='utf-8'在哪修改呢因为会有ascii codec can't decode byte报错", "file": "2023-05-14.0360", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/360", "detail": "上传中文的txt文件时报错编码格式为utf-8", "id": 103}
{"title": "Batches的进度条是在哪里设置的?能否关闭显示?", "file": "2023-05-15.0366", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/366", "detail": "使用cli_demo.py进行命令行测试时,每句回答前都有个Batches的进度条", "id": 104}
{"title": "ImportError: dlopen: cannot load any more object with static TLS or Segmentation fault", "file": "2023-05-15.0368", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/368", "detail": "**问题描述 / Problem Description**", "id": 105}
{"title": "读取PDF时报错", "file": "2023-05-16.0373", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/373", "detail": "在Colab上执行cli_demo.py时在路径文件夹里放了pdf文件在加载的过程中会显示错误然后无法加载PDF文件", "id": 106}
{"title": "[BUG] webui报错 InvalidURL", "file": "2023-05-16.0375", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/375", "detail": "python 版本3.8.16", "id": 107}
{"title": "[FEATURE] 如果让回答不包含出处,应该怎么处理", "file": "2023-05-16.0380", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/380", "detail": "**功能描述 / Feature Description**", "id": 108}
{"title": "加载PDF文件时出现 unsupported colorspace for 'png'", "file": "2023-05-16.0381", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/381", "detail": "**问题描述 / Problem Description**", "id": 109}
{"title": "'ascii' codec can't encode characters in position 14-44: ordinal not in range(128) 经典bug", "file": "2023-05-16.0382", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/382", "detail": "添加了知识库之后进行对话,之后再新增知识库就会出现这个问题。", "id": 110}
{"title": "微信群人数超过200了扫码进不去了群主可以再创建一个新群吗", "file": "2023-05-17.0391", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/391", "detail": "**功能描述 / Feature Description**", "id": 111}
{"title": "TypeError: 'ListDocsResponse' object is not subscriptable", "file": "2023-05-17.0393", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/393", "detail": "应该是用remain_docs.code和remain_docs.data吧", "id": 112}
{"title": "[BUG] 加载chatglm模型报错'NoneType' object has no attribute 'message_types_by_name'", "file": "2023-05-17.0398", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/398", "detail": "**问题描述 / Problem Description**", "id": 113}
{"title": "[BUG] 执行 python webui.py 没有报错但是ui界面提示 Something went wrong Expecting value: line 1 column 1 (char 0", "file": "2023-05-18.0399", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/399", "detail": "**环境配置**", "id": 114}
{"title": "启动后调用api接口正常过一会就不断的爆出 Since the angle classifier is not initialized", "file": "2023-05-18.0404", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/404", "detail": "**问题描述 / Problem Description**", "id": 115}
{"title": "[BUG] write_check_file方法中open函数未指定编码", "file": "2023-05-18.0408", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/408", "detail": "def write_check_file(filepath, docs):", "id": 116}
{"title": "导入的PDF中存在图片有大概率出现 “unsupported colorspace for 'png'”异常", "file": "2023-05-18.0409", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/409", "detail": "pix = fitz.Pixmap(doc, img[0])", "id": 117}
{"title": "请问流程图是用什么软件画的", "file": "2023-05-18.0410", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/410", "detail": "draw.io", "id": 118}
{"title": "mac 加载模型失败", "file": "2023-05-19.0417", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/417", "detail": "Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.", "id": 119}
{"title": "使用GPU本地运行知识库问答提问第一个问题出现异常。", "file": "2023-05-20.0419", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/419", "detail": "配置文件model_config.py为", "id": 120}
{"title": "想加入讨论群", "file": "2023-05-20.0420", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/420", "detail": "OK", "id": 121}
{"title": "有没有直接调用LLM的API目前只有知识库的API", "file": "2023-05-22.0426", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/426", "detail": "-------------------------------------------------------------------------------", "id": 122}
{"title": "上传文件后出现 ERROR __init__() got an unexpected keyword argument 'autodetect_encoding'", "file": "2023-05-22.0428", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/428", "detail": "上传文件后出现这个问题ERROR 2023-05-22 11:46:19,568-1d: __init__() got an unexpected keyword argument 'autodetect_encoding'", "id": 123}
{"title": "想问下README中用到的流程图用什么软件画的", "file": "2023-05-22.0431", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/431", "detail": "**功能描述 / Feature Description**", "id": 124}
{"title": "No matching distribution found for langchain==0.0.174", "file": "2023-05-23.0436", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/436", "detail": "ERROR: Could not find a version that satisfies the requirement langchain==0.0.174 ", "id": 125}
{"title": "[FEATURE] bing是必须的么", "file": "2023-05-23.0437", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/437", "detail": "从这个[脚步](https://github.com/imClumsyPanda/langchain-ChatGLM/blob/master/configs/model_config.py#L129)里面发现需要申请bing api如果不申请纯用模型推理不可吗", "id": 126}
{"title": "同一台环境下部署了5.22号更新的langchain-chatglm v0.1.13和之前的版本,回复速度明显变慢", "file": "2023-05-23.0442", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/442", "detail": "新langchain-chatglm v0.1.13版本速度很慢", "id": 127}
{"title": "Error reported during startup", "file": "2023-05-23.0443", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/443", "detail": "Traceback (most recent call last):", "id": 128}
{"title": "ValueError: not enough values to unpack (expected 2, got 1)on of the issue", "file": "2023-05-24.0449", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/449", "detail": "File \".cache\\huggingface\\modules\\transformers_modules\\chatglm-6b-int4\\modeling_chatglm.py\", line 1280, in chat", "id": 129}
{"title": "[BUG] API部署流式输出的函数少了个question", "file": "2023-05-24.0451", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/451", "detail": "**问题描述 / Problem Description**", "id": 130}
{"title": "项目结构的简洁性保持", "file": "2023-05-24.0454", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/454", "detail": "**功能描述 / Feature Description**", "id": 131}
{"title": "项目群扫码进不去了", "file": "2023-05-24.0455", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/455", "detail": "项目群扫码进不去了是否可以加一下微信拉我进群谢谢微信号daniel-0527", "id": 132}
{"title": "请求拉我入群讨论海硕一枚专注于LLM等相关技术", "file": "2023-05-24.0461", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/461", "detail": "**功能描述 / Feature Description**", "id": 133}
{"title": "[BUG] chatglm-6b模型报错OSError: Error no file named pytorch_model.bin found in directory /chatGLM/model/model-6b", "file": "2023-05-26.0474", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/474", "detail": "**1、简述**", "id": 134}
{"title": "现在本项目交流群二维码扫描不进去了,需要群主通过", "file": "2023-05-27.0478", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/478", "detail": "现在本项目交流群二维码扫描不进去了,需要群主通过", "id": 135}
{"title": "RuntimeError: Only Tensors of floating point and complex dtype can require gradients", "file": "2023-05-28.0483", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/483", "detail": "刚更新了最新版本:", "id": 136}
{"title": "RuntimeError: \"LayerNormKernelImpl\" not implemented for 'Half'", "file": "2023-05-28.0484", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/484", "detail": "已经解决了 params 只用两个参数 {'trust_remote_code': True, 'torch_dtype': torch.float16}", "id": 137}
{"title": "[BUG] 文件未成功加载,请重新上传文件", "file": "2023-05-31.0504", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/504", "detail": "webui.py", "id": 138}
{"title": "[BUG] bug 17 pdf和pdf为啥还不一样呢为啥有的pdf能识别有的pdf识别不了呢", "file": "2023-05-31.0506", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/506", "detail": "bug 17 pdf和pdf为啥还不一样呢为啥有的pdf能识别有的pdf识别不了呢", "id": 139}
{"title": "[FEATURE] 简洁阐述功能 / Concise description of the feature", "file": "2023-05-31.0513", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/513", "detail": "**功能描述 / Feature Description**", "id": 140}
{"title": "[BUG] webui.py 加载chatglm-6b-int4 失败", "file": "2023-06-02.0524", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/524", "detail": "**问题描述 / Problem Description**", "id": 141}
{"title": "[BUG] webui.py 加载chatglm-6b模型异常", "file": "2023-06-02.0525", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/525", "detail": "**问题描述 / Problem Description**", "id": 142}
{"title": "增加对chatgpt的embedding和api调用的支持", "file": "2023-06-02.0531", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/531", "detail": "能否支持openai的embedding api和对话的api", "id": 143}
{"title": "[FEATURE] 调整模型下载的位置", "file": "2023-06-02.0537", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/537", "detail": "模型默认下载到 $HOME/.cache/huggingface/,当 C 盘空间不足时无法完成模型的下载。configs/model_config.py 中也没有调整模型位置的参数。", "id": 144}
{"title": "[BUG] langchain=0.0.174 出错", "file": "2023-06-04.0543", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/543", "detail": "**问题描述 / Problem Description**", "id": 145}
{"title": "[BUG] 更新后加载本地模型路径不正确", "file": "2023-06-05.0545", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/545", "detail": "**问题描述 / Problem Description**", "id": 146}
{"title": "SystemError: 8bit 模型需要 CUDA 支持,或者改用量化后模型!", "file": "2023-06-06.0550", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/550", "detail": "docker 部署后,启动docker,过会儿容器会自动退出,logs报错 SystemError: 8bit 模型需要 CUDA 支持,或者改用量化后模型! [NVIDIA Container Toolkit](https://github.com/NVIDIA/nvidia-container-toolkit) 也已经安装了", "id": 147}
{"title": "[BUG] 上传知识库超过1M报错", "file": "2023-06-06.0556", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/556", "detail": "**问题描述 / Problem Description**", "id": 148}
{"title": "打开跨域访问后仍然报错,不能请求", "file": "2023-06-06.0560", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/560", "detail": "报错信息:", "id": 149}
{"title": "dialogue_answering 里面的代码是不是没有用到?,没有看到调用", "file": "2023-06-07.0571", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/571", "detail": "dialogue_answering 是干啥的", "id": 150}
{"title": "[BUG] 响应速度极慢应从哪里入手优化48C/128G/8卡", "file": "2023-06-07.0573", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/573", "detail": "运行环境ubuntu20.04", "id": 151}
{"title": "纯CPU环境下运行cli_demo时报错提示找不到nvcuda.dll", "file": "2023-06-08.0576", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/576", "detail": "本地部署环境是纯CPU之前的版本在纯CPU环境下能正常运行但上传本地知识库经常出现encode问题。今天重新git项目后运行时出现如下问题请问该如何解决。", "id": 152}
{"title": "如何加载本地的embedding模型text2vec-large-chinese模型文件", "file": "2023-06-08.0582", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/582", "detail": "因为需要离线部署所以要把模型放到本地我修改了chains/local_doc_qa.py中的HuggingFaceEmbeddings()在其中加了一个cache_folder的参数保证下载的文件在cache_folder中model_name是text2vec-large-chinese。如cache_folder='/home/xx/model/text2vec-large-chinese', model_name='text2vec-large-chinese',这样仍然需要联网下载报错,请问大佬如何解决该问题?", "id": 153}
{"title": "ChatGLM-6B 在另外服务器安装好了请问如何修改model.cofnig.py 来使用它的接口呢??", "file": "2023-06-09.0588", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/588", "detail": "我本来想在这加一个api base url 但是运行web.py 发现 还是会去连huggingface 下载模型", "id": 154}
{"title": "[BUG] raise partially initialized module 'charset_normalizer' has no attribute 'md__mypyc' when call interface `upload_file`", "file": "2023-06-10.0591", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/591", "detail": "**问题描述 / Problem Description**", "id": 155}
{"title": "[BUG] raise OSError: [Errno 101] Network is unreachable when call interface upload_file and upload .pdf files", "file": "2023-06-10.0592", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/592", "detail": "**问题描述 / Problem Description**", "id": 156}
{"title": "如果直接用vicuna作为基座大模型需要修改的地方有哪些", "file": "2023-06-12.0596", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/596", "detail": "vicuna模型有直接转换好的没有也就是llama转换之后的vicuna。", "id": 157}
{"title": "[BUG] 通过cli.py调用api时抛出AttributeError: 'NoneType' object has no attribute 'get'错误", "file": "2023-06-12.0598", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/598", "detail": "通过`python cli.py start api --ip localhost --port 8001` 命令调用api时抛出", "id": 158}
{"title": "[BUG] 通过cli.py调用api时直接报错`langchain-ChatGLM: error: unrecognized arguments: start cli`", "file": "2023-06-12.0601", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/601", "detail": "通过python cli.py start cli启动cli_demo时报错", "id": 159}
{"title": "[BUG] error: unrecognized arguments: --model-dir conf/models/", "file": "2023-06-12.0602", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/602", "detail": "关键字参数修改了吗?有没有文档啊?大佬", "id": 160}
{"title": "[BUG] 上传文件全部失败", "file": "2023-06-12.0603", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/603", "detail": "ERROR: Exception in ASGI application", "id": 161}
{"title": "[BUG] config 使用 chatyuan 无法启动", "file": "2023-06-12.0604", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/604", "detail": "\"chatyuan\": {", "id": 162}
{"title": "使用fashchat api之后后台报错APIError 如图所示", "file": "2023-06-12.0606", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/606", "detail": "我按照https://github.com/imClumsyPanda/langchain-ChatGLM/blob/master/docs/fastchat.md", "id": 163}
{"title": "[BUG] 启用上下文关联每次embedding搜索到的内容都会比前一次多一段", "file": "2023-06-13.0613", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/613", "detail": "**问题描述 / Problem Description**", "id": 164}
{"title": "local_doc_qa.py中MyFAISS.from_documents() 这个语句看不太懂。MyFAISS类中没有这个方法其父类FAISS和VectorStore中也只有from_texts方法[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-06-14.0619", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/619", "detail": "local_doc_qa.py中MyFAISS.from_documents() 这个语句看不太懂。MyFAISS类中没有这个方法其父类FAISS和VectorStore中也只有from_texts方法", "id": 165}
{"title": "[BUG] TypeError: similarity_search_with_score_by_vector() got an unexpected keyword argument 'filter'", "file": "2023-06-14.0624", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/624", "detail": "**问题描述 / Problem Description**", "id": 166}
{"title": "please delete this issue", "file": "2023-06-15.0633", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/633", "detail": "sorry, incorrect submission. Please remove this issue!", "id": 167}
{"title": "[BUG] vue前端镜像构建失败", "file": "2023-06-15.0635", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/635", "detail": "**问题描述 / Problem Description**", "id": 168}
{"title": "ChatGLM-6B模型能否回答英文问题", "file": "2023-06-15.0640", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/640", "detail": "大佬请问一下如果本地知识文档是英文ChatGLM-6B模型能否回答英文问题不能的话有没有替代的模型推荐期待你的回复谢谢", "id": 169}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-06-16.0644", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/644", "detail": "**问题描述 / Problem Description**", "id": 170}
{"title": "KeyError: 3224", "file": "2023-06-16.0645", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/645", "detail": "```", "id": 171}

View File

@ -0,0 +1,324 @@
,title,file,url,detail,id
0,效果如何优化,2023-04-04.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/14,如图所示将该项目的README.md和该项目结合后回答效果并不理想请问可以从哪些方面进行优化,0
1,怎么让模型严格根据检索的数据进行回答,减少胡说八道的回答呢,2023-04-04.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/15,举个例子:,1
2,"When I try to run the `python knowledge_based_chatglm.py`, I got this error in macOS(M1 Max, OS 13.2)",2023-04-07.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/32,```python,2
3,萌新求教大佬怎么改成AMD显卡或者CPU,2023-04-10.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/48,把.cuda()去掉就行,3
4,输出answer的时间很长是否可以把文本向量化的部分提前做好存储起来,2023-04-10.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/50,GPU4090 24G显存,4
5,报错Use `repo_type` argument if needed.,2023-04-11.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/57,Traceback (most recent call last):,5
6,无法打开gradio的页面,2023-04-11.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/58,$ python webui.py,6
7,支持word那word里面的图片正常显示吗,2023-04-12.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/60,如题,刚刚从隔壁转过来的,想先了解下,7
8,detectron2 is not installed. Cannot use the hi_res partitioning strategy. Falling back to partitioning with the fast strategy.,2023-04-12.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/63,能够正常的跑起来在加载content文件夹中的文件时每加载一个文件都会提示,8
9,cpu上运行webuistep3 asking时报错,2023-04-12.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/66,web运行文件加载都正常asking时报错,9
10,建议弄一个插件系统,2023-04-13.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/67,如题弄成stable-diffusion-webui那种能装插件再开一个存储库给使用者或插件开发存储或下载插件。,10
11,请教加载模型出错!?,2023-04-13.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/75,AttributeError: module 'transformers_modules.chatglm-6b.configuration_chatglm' has no attribute 'ChatGLMConfig 怎么解决呀,11
12,从本地知识检索内容的时候是否可以设置相似度阈值小于这个阈值的内容不返回即使会小于设置的VECTOR_SEARCH_TOP_K参数呢谢谢大佬,2023-04-13.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/76,比如 问一些 你好/你是谁 等一些跟本地知识库无关的问题,12
13,如何改成多卡推理?,2023-04-13.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/77,+1,13
14,能否弄个懒人包,可以一键体验?,2023-04-13.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/78,能否弄个懒人包,可以一键体验?,14
15,连续问问题会导致崩溃,2023-04-13.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/79,看上去不是爆内存的问题,连续问问题后,会出现如下报错,15
16,AttributeError: 'NoneType' object has no attribute 'as_retriever',2023-04-14.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/86,"环境windows 11, anaconda/python 3.8",16
17,FileNotFoundError: Could not find module 'nvcuda.dll' (or one of its dependencies). Try using the full path with constructor syntax.,2023-04-14.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/87,请检查一下cuda或cudnn是否存在安装问题,17
18,加载txt文件失败,2023-04-14.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/89,![JppHrGOWFa](https://user-images.githubusercontent.com/109277248/232009383-bf7c46d1-a01e-4e0a-9de6-5b5ed3e36158.jpg),18
19,NameError: name 'chatglm' is not defined,2023-04-14.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/90,"This share link expires in 72 hours. For free permanent hosting and GPU upgrades (NEW!), check out Spaces: https://huggingface.co/spaces",19
20,打不开地址?,2023-04-14.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/91,报错数据如下:,20
21,加载md文件出错,2023-04-14.00,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/98,运行 webui.py后能访问页面上传一个md文件后日志中有错误。等待后能加载完成提示可以提问了但提问没反应日志中有错误。 具体日志如下。,21
22,建议增加获取在线知识的能力,2023-04-15.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/101,建议增加获取在线知识的能力,22
23,txt 未能成功加载,2023-04-15.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/103,hinese. Creating a new one with MEAN pooling.,23
24,pdf加载失败,2023-04-15.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/105,e:\a.txt加载成功了e:\a.pdf加载就失败pdf文件里面前面几页是图片后面都是文字加载失败没有报更多错误请问该怎么排查,24
25,一直停在文本加载处,2023-04-15.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/108,一直停在文本加载处,25
26," File ""/root/.cache/huggingface/modules/transformers_modules/chatglm-6b/modeling_chatglm.py"", line 440, in forward new_tensor_shape = mixed_raw_layer.size()[:-1] + ( TypeError: torch.Size() takes an iterable of 'int' (item 2 is 'float')",2023-04-17.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/113,按照最新的代码,发现,26
27,后续会提供前后端分离的功能吗?,2023-04-17.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/114,类似这种https://github.com/lm-sys/FastChat/tree/main/fastchat/serve,27
28,安装依赖报错,2023-04-17.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/115,(test) C:\Users\linh\Desktop\langchain-ChatGLM-master>pip install -r requirements.txt,28
29,问特定问题会出现爆显存,2023-04-17.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/117,正常提问没问题。,29
30,Expecting value: line 1 column 1 (char 0),2023-04-17.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/118,运行后 第一步加载配置一直报错:,30
31,embedding https://huggingface.co/GanymedeNil/text2vec-large-chinese/tree/main是免费的效果比对openai的如何,2023-04-17.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/119,-------------------------------------------------------------------------------,31
32,这是什么错误在Colab上运行的。,2023-04-17.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/120,libcuda.so.1: cannot open shared object file: No such file or directory,32
33,只想用自己的lora微调后的模型进行对话不想加载任何本地文档该如何调整,2023-04-18.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/121,能出一个单独的教程吗,33
34,"租的gpu,Running on local URL: http://0.0.0.0:7860 To create a public link, set `share=True` in `launch()`. 浏览器上访问不了???",2023-04-18.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/122,(chatglm20230401) root@autodl-container-e82d11963c-10ece0d7:~/autodl-tmp/chatglm/langchain-ChatGLM-20230418# python3.9 webui.py,34
35,本地部署中的报错请教,2023-04-18.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/124,"您好在本地运行langchain-ChatGLM过程中环境及依赖的包都已经满足条件但是运行webui.py,报错如下运行cli_demo.py报错类似请问是哪里出了错呢盼望您的回复谢谢",35
36,报错。The dtype of attention mask (torch.int64) is not bool,2023-04-18.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/131,The dtype of attention mask (torch.int64) is not bool,36
37,[求助] pip install -r requirements.txt 的时候出现以下报错。。。有大佬帮忙看看怎么搞么下的release里面的包,2023-04-18.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/134,$ pip install -r requirements.txt,37
38,如何提升根据问题搜索到对应知识的准确率,2023-04-19.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/136,外链知识库最大的问题在于问题是短文本,知识是中长文本。如何根据问题精准的搜索到对应的知识是个最大的问题。这类本地化项目不像百度,由无数的网页,基本上每个问题都可以找到对应的页面。,38
39,是否可以增加向量召回的阈值设定,有些召回内容相关性太低,导致模型胡言乱语,2023-04-20.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/140,如题,39
40,输入长度问题,2023-04-20.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/141,感谢作者支持ptuning微调模型。,40
41,已有部署好的chatGLM-6b如何通过接口接入,2023-04-20.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/144,已有部署好的chatGLM-6b如何通过接口接入而不是重新加载一个模型,41
42,执行web_demo.py后显示Killed就退出了是不是配置不足呢,2023-04-20.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/146,![图片](https://user-images.githubusercontent.com/26102866/233256425-c7aab999-11d7-4de9-867b-23ef18d519e4.png),42
43,执行python cli_demo1.py,2023-04-20.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/147,Traceback (most recent call last):,43
44,报错ImportError: cannot import name 'GENERATION_CONFIG_NAME' from 'transformers.utils',2023-04-20.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/149,(mychatGLM) PS D:\Users\admin3\zrh\langchain-ChatGLM> python cli_demo.py,44
45,上传文件并加载知识库时,会不停地出现临时文件,2023-04-21.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/153,环境ubuntu 18.04,45
46,向知识库中添加文件后点击”上传文件并加载知识库“后Segmentation fault报错。,2023-04-23.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/161,运行服务后的提示如下:,46
47,langchain-serve 集成,2023-04-24.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/162,Hey 我是来自 [langchain-serve](https://github.com/jina-ai/langchain-serve) 的dev,47
48,大佬们wsl的ubuntu怎么配置用cuda加速装了运行后发现是cpu在跑,2023-04-24.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/164,大佬们wsl的ubuntu怎么配置用cuda加速装了运行后发现是cpu在跑,48
49,在github codespaces docker运行出错,2023-04-24.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/165,docker run -d --restart=always --name chatglm -p 7860:7860 -v /www/wwwroot/code/langchain-ChatGLM:/chatGLM chatglm,49
50,有计划接入Moss模型嘛,2023-04-24.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/166,后续会开展测试目前主要在优化langchain部分效果如果有兴趣也欢迎提PR,50
51,怎么实现 API 部署?,2023-04-24.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/168,利用 fastapi 实现 API 部署方式,具体怎么实现,有方法说明吗?,51
52, 'NoneType' object has no attribute 'message_types_by_name'报错,2023-04-24.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/169,_HISTOGRAMPROTO = DESCRIPTOR.message_types_by_name['HistogramProto'],52
53,能否指定自己训练的text2vector模型,2023-04-25.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/172,请问大佬:,53
54,关于项目支持的模型以及quantization_bit潜在的影响的问题,2023-04-26.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/176,作者您好~,54
55,运行python3.9 api.py WARNING: You must pass the application as an import string to enable 'reload' or 'workers'.,2023-04-26.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/179,api.py文件最下面改成这样试试,55
56,ValidationError: 1 validation error for HuggingFaceEmbeddings model_kwargs extra fields not permitted (type=value_error.extra),2023-04-26.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/180,ValidationError: 1 validation error for HuggingFaceEmbeddings,56
57,如果没有检索到相关性比较高的,回答“我不知道”,2023-04-26.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/181,如果通过设计system_template让模型在搜索到的文档都不太相关的情况下回答“我不知道”,57
58,请问如果不能联网6B之类的文件从本地上传需要放到哪里,2023-04-26.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/182,感谢大佬的项目,很有启发~,58
59,知识库问答--输入新的知识库名称是中文的话会报error,2023-04-27.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/184,知识库问答--输入新的知识库名称是中文的话会报error选择要加载的知识库那里也不显示之前添加的知识库,59
60,现在能通过问题匹配的相似度值,来直接返回文档中的文段,而不经过模型吗?因为有些答案在文档中,模型自己回答,不能回答文档中的答案,2023-04-27.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/186,现在能通过问题匹配的相似度值,来直接返回文档中的文段,而不经过模型吗?因为有些答案在文档中,模型自己回答,不能回答文档中的答案。也就是说,提供向量检索回答+模型回答相结合的策略。如果相似度值高于一定数值,直接返回文档中的文本,没有高于就返回模型的回答或者不知道,60
61,"TypeError: The type of ChatGLM.callback_manager differs from the new default value; if you wish to change the type of this field, please use a type annotation",2023-04-27.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/188,"Mac 运行 python3 ./webui.py 报 TypeError: The type of ChatGLM.callback_manager differs from the new default value; if you wish to change the type of this field, please use a type annotation",61
62,Not Enough Memory,2023-04-27.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/190,"运行命令行程序python cli_demo.py 已经成功加载pdf文件, 报“DefaultCPUAllocator: not enough memory: you tried to allocate 458288380900 bytes”错误请问哪里可以配置default memory",62
63,参与开发问题,2023-04-27.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/191,1.是否需要进专门的开发群,63
64,对话框中代码片段格式需改进,2023-04-27.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/192,最好能改进下输出代码片段的格式,目前输出的格式还不友好。,64
65,请问未来有可能支持belle吗,2023-04-28.01,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/195,如题,谢谢大佬,65
66,TypeError: cannot unpack non-iterable NoneType object,2023-04-28.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/200,"When i tried to change the knowledge vector store through `init_knowledge_vector_store`, the error `TypeError: cannot unpack non-iterable NoneType object` came out.",66
67,生成结果,2023-04-28.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/202,你好想问一下langchain+chatglm-6B找到相似匹配的prompt是直接返回prompt对应的答案信息还是chatglm-6B在此基础上自己优化答案,67
68,在win、ubuntu下都出现这个错误attributeerror: 't5forconditionalgeneration' object has no attribute 'stream_chat',2023-04-29.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/207,在win、ubuntu。下载完模型后没办法修改代码以执行本地模型每次都要重新输入路径 LLM 模型、Embedding 模型支持也都在官网下的在其他项目wenda下可以使用,68
69,[FEATURE] knowledge_based_chatglm.py: renamed or missing?,2023-04-30.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/210,"Not found. Was it renamed? Or, is it missing? How can I get it?",69
70,sudo apt-get install -y nvidia-container-toolkit-base执行报错,2023-05-01.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/211,**问题描述 / Problem Description**,70
71,效果不佳几乎答不上来,2023-05-01.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/212,提供了50条问答的docx文件,71
72,有没有可能新增一个基于chatglm api调用的方式构建langchain,2023-05-02.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/218,我有两台8G GPU/40G内存的服务器一个台做成了chatglm的api 想基于另外一台服务器部署langchain网上好像没有类似的代码。,72
73,电脑是intel的集成显卡 运行时告知我找不到nvcuda.dll模型无法运行,2023-05-02.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/219,您好我的电脑是intel的集成显卡不过CPU是i5-11400 @ 2.60GHz 内存64G,73
74,根据langchain官方的文档和使用模式是否可以改Faiss为Elasticsearch会需要做哪些额外调整求解,2023-05-03.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/221,本人新手小白由于业务模式的原因有一些自己的场景和优化希望利用Elasticsearch做这个体系内部的检索机制不知道是否可以替换同时还会涉及到哪些地方的改动或者说可能会有哪些其他影响希望作者和大佬们不吝赐教,74
75,请问未来有可能支持t5吗,2023-05-04.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/224,请问可能支持基於t5的模型吗?,75
76,[BUG] 内存溢出 / torch.cuda.OutOfMemoryError:,2023-05-04.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/229,**问题描述 / Problem Description**,76
77,报错 No module named 'chatglm_llm',2023-05-04.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/230,明明已经安装了包却在python里吊不出来,77
78,能出一个api部署的描述文档吗,2023-05-04.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/233,**功能描述 / Feature Description**,78
79,使用docs/API.md 出错,2023-05-04.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/234,使用API.md文档2种方法出错,79
80,加载pdf文档报错,2023-05-05.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/238,ew one with MEAN pooling.,80
81,上传的本地知识文件后再次上传不能显示,只显示成功了一个,别的上传成功后再次刷新就没了,2023-05-05.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/239,您好,项目有很大启发,感谢~,81
82,创建了新的虚拟环境安装了相关包并且自动下载了相关的模型但是仍旧出现OSError: Unable to load weights from pytorch checkpoint file for,2023-05-05.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/240,![78ac8e663fdc312d0e9d78da95925c4](https://user-images.githubusercontent.com/34124260/236378728-9ea4424f-0f7f-4013-9d33-820b723de321.png),82
83,[BUG] 数据加载不进来,2023-05-05.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/243,使用的.txt格式utf-8编码报以下错误,83
84,不能读取pdf,2023-05-05.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/244,请问是webui还是cli_demo,84
85,本地txt文件有500M加载的时候很慢如何提高速度,2023-05-06.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/251,![yayRzxSYHP](https://user-images.githubusercontent.com/109277248/236592902-f5ab338d-c1e9-43dc-ae16-9df2cd3c1378.jpg),85
86,[BUG] gradio上传知识库后刷新之后 知识库就不见了 只有重启才能看到之前的上传的知识库,2023-05-06.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/253,gradio上传知识库后刷新之后 知识库就不见了 只有重启才能看到之前的上传的知识库,86
87,[FEATURE] 可以支持 OpenAI 的模型嘛?比如 GPT-3、GPT-3.5、GPT-4embedding 增加 text-embedding-ada-002,2023-05-06.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/254,**功能描述 / Feature Description**,87
88,[FEATURE] 能否增加对于milvus向量数据库的支持 / Concise description of the feature,2023-05-06.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/256,**功能描述 / Feature Description**,88
89,CPU和GPU上跑除了速度有区别准确率效果回答上有区别吗,2023-05-06.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/259,理论上没有区别,89
90,m1请问在生成回答时怎么看是否使用了mps or cpu,2023-05-06.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/260,m1请问在生成回答时怎么看是否使用了mps or cpu,90
91,知识库一刷新就没了,2023-05-07.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/263,知识库上传后刷新就没了,91
92,本地部署报没有模型,2023-05-07.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/267,建议在下载llm和embedding模型至本地后在configs/model_config中写入模型本地存储路径后再运行,92
93,[BUG] python3: can't open file 'webui.py': [Errno 2] No such file or directory,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/269,**问题描述 / Problem Description**,93
94,模块缺失提示,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/271,因为已有自己使用的docker环境直接启动webui.py提示,94
95,"运行api.py后执行curl -X POST ""http://127.0.0.1:7861"" 报错?",2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/272,"执行curl -X POST ""http://127.0.0.1:7861"" \ -H 'Content-Type: application/json' \ -d '{""prompt"": ""你好"", ""history"": []}',报错怎么解决",95
96,[BUG] colab安装requirements提示protobuf版本问题,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/273,pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.,96
97,请问项目里面向量相似度使用了什么方法计算呀?,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/275,基本按照langchain里的FAISS.similarity_search_with_score_by_vector实现,97
98,[BUG] 安装detectron2后pdf无法加载,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/276,**问题描述 / Problem Description**,98
99,[BUG] 使用ChatYuan-V2模型无法流式输出会报错,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/277,一方面好像是ChatYuan本身不支持stream_chat有人在clueai那边提了issue他们说还没开发所以估计这个attribute调不起来但是另一方面看报错好像是T5模型本身就不是decoder-only模型所以不能流式输出吧个人理解,99
100,[BUG] 无法加载text2vec模型,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/278,**问题描述 / Problem Description**,100
101,请问能否增加网络搜索功能,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/281,请问能否增加网络搜索功能,101
102,[FEATURE] 结构化数据sql、excel、csv啥时会支持呐。,2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/283,**功能描述 / Feature Description**,102
103,TypeError: ChatGLM._call() got an unexpected keyword argument 'stop',2023-05-08.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/284,No sentence-transformers model found with name D:\DevProject\langchain-ChatGLM\GanymedeNil\text2vec-large-chinese. Creating a new one with MEAN pooling.,103
104,关于api.py的一些bug和设计逻辑问题,2023-05-09.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/285,首先冒昧的问一下这个api.py开发者大佬们是在自己电脑上测试后确实没问题吗,104
105,有没有租用的算力平台上运行api.py后浏览器http://localhost:7861/报错,2023-05-09.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/287,是不是租用的gpu平台上都会出现这个问题,105
106,请问一下项目中有用到文档段落切割方法吗?,2023-05-09.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/288,text_load中的文档切割方法用上了吗在代码中看好像没有用到,106
107,"报错 raise ValueError(f""Knowledge base {knowledge_base_id} not found"") ValueError: Knowledge base ./vector_store not found",2023-05-09.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/289,"File ""/root/autodl-tmp/chatglm/langchain-ChatGLM-master/api.py"", line 183, in chat",107
108,能接入vicuna模型吗,2023-05-09.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/290,目前本地已经有了vicuna模型能直接接入吗,108
109,[BUG] 提问公式相关问题大概率爆显存,2023-05-09.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/291,**问题描述 / Problem Description**,109
110,安装pycocotools失败找了好多方法都不能解决。,2023-05-10.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/292,**问题描述 / Problem Description**,110
111,使用requirements安装PyTorch安装的是CPU版本,2023-05-10.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/294,如题目使用requirements安装PyTorch安装的是CPU版本运行程序的时候也是使用CPU在工作。,111
112,能不能给一个毛坯服务器的部署教程,2023-05-10.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/298,“开发部署”你当成服务器的部署教程用就行了。,112
113, Error(s) in loading state_dict for ChatGLMForConditionalGeneration:,2023-05-10.02,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/299,运行中出现的问题7860的端口页面显示不出来求助。,113
114,ChatYuan-large-v2模型加载失败,2023-05-10.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/300,**实际结果 / Actual Result**,114
115,新增摘要功能,2023-05-10.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/303,你好,后续会考虑新增对长文本信息进行推理和语音理解功能吗?比如生成摘要,115
116,[BUG] pip install -r requirements.txt 出错,2023-05-10.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/304,pip install langchain -i https://pypi.org/simple,116
117,[BUG] 上传知识库文件报错,2023-05-10.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/305,![19621e29eaa547d01213bee53d81e6a](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/84606552/7f6ceb46-e494-4b0e-939c-23b585a6d9d8),117
118,[BUG] AssertionError: <class 'gradio.layouts.Accordion'> Component with id 41 not a valid input component.,2023-05-10.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/306,**问题描述 / Problem Description**,118
119,[BUG] CUDA out of memory with container deployment,2023-05-10.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/310,**问题描述 / Problem Description**,119
120,[FEATURE] 增加微调训练功能,2023-05-11.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/311,**功能描述 / Feature Description**,120
121,如何使用多卡部署多个gpu,2023-05-11.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/315,"机器上有多个gpu,如何全使用了",121
122,请问这个知识库问答和chatglm的关系是什么,2023-05-11.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/319,这个知识库问答哪部分关联到了chatglm是不是没有这个chatglm知识库问答也可单单拎出来,122
123,[BUG] 运行的时候报错ImportError: libcudnn.so.8: cannot open shared object file: No such file or directory,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/324,**问题描述 / Problem Description**raceback (most recent call last):,123
124,webui启动成功但有报错,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/325,**问题描述 / Problem Description**,124
125,切换MOSS的时候报错,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/327,danshi但是发布的源码中,125
126,vicuna模型是否能接入,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/328,您好关于MOSS模型和vicuna模型都是AutoModelForCausalLM来加载模型的但是稍作更改模型路径这些会报这个错误。这个错误的造成是什么,126
127,你好请问一下在阿里云CPU服务器上跑可以吗可以的话比较理想的cpu配置是什么,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/330,你好请问一下在阿里云CPU服务器上跑可以吗可以的话比较理想的cpu配置是什么,127
128,你好请问8核32g的CPU可以跑多轮对话吗,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/331,什么样的cpu配置比较好呢我目前想部署CPU下的多轮对话,128
129,[BUG] 聊天内容输入超过10000个字符系统出现错误,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/332,聊天内容输入超过10000个字符系统出现错误如下图所示,129
130,能增加API的多用户访问接口部署吗,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/333,默认部署程序仅支持单用户访问多用户则需要排队访问。测试过相关的几个Github多用户工程但是其中一些仍然不满足要求。本节将系统介绍如何实现多用户同时访问ChatGLM的部署接口包括http、websocket流式输出stream和web页面等方式主要目录如下所示。,130
131,多卡部署,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/334,用单机多卡或多机多卡fastapi部署模型怎样提高并发,131
132,WEBUI能否指定知识库目录,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/335,**功能描述 / Feature Description**,132
133,[BUG] Cannot read properties of undefined (reading 'error'),2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/336,**问题描述 / Problem Description**,133
134,[BUG] 1 validation error for HuggingFaceEmbeddings model_kwargs extra fields not permitted.,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/337,模型加载到 100% 后出现问题:,134
135,上传知识库需要重启能不能修复一下,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/338,挺严重的这个问题,135
136,[BUG] 4块v100卡爆显存在LLM会话模式也一样,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/339,**问题描述 / Problem Description**,136
137,针对上传的文件配置不同的TextSpliter,2023-05-12.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/341,1. 目前的ChineseTextSpliter切分对英文尤其是代码文件不友好而且限制固定长度导致对话结果不如人意,137
138,[FEATURE] 未来可增加Bloom系列模型吗根据甲骨易的测试这系列中文评测效果不错,2023-05-13.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/346,**功能描述 / Feature Description**,138
139,[BUG] v0.1.12打包镜像后启动webui.py失败 / Concise description of the issue,2023-05-13.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/347,**问题描述 / Problem Description**,139
140,切换MOSS模型时报错,2023-05-13.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/349,昨天问了下说是transformers版本不对需要4.30.0发现没有这个版本今天更新到4.29.1,依旧报错,错误如下,140
141,[BUG] pdf文档加载失败,2023-05-13.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/350,**问题描述 / Problem Description**,141
142,建议可以在后期增强一波注释这样也有助于更多人跟进提PR,2023-05-13.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/351,知道作者和团队在疯狂更新审查代码,只是建议后续稳定后可以把核心代码进行一些注释的补充,从而能帮助更多人了解各个模块作者的思路从而提出更好的优化。,142
143,[FEATURE] MOSS 量化版支援,2023-05-13.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/353,**功能描述 / Feature Description**,143
144,[BUG] moss模型无法加载,2023-05-13.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/356,**问题描述 / Problem Description**,144
145,[BUG] load_doc_qa.py 中的 load_file 函数有bug,2023-05-14.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/358,原函数为:,145
146,[FEATURE] API模式知识库加载优化,2023-05-14.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/359,如题,当前版本,每次调用本地知识库接口,都将加载一次知识库,是否有更好的方式?,146
147,运行Python api.py脚本后端部署后怎么使用curl命令调用,2023-05-15.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/361,也就是说,我现在想做个对话机器人,想和公司的前后端联调?怎么与前后端相互调用呢?可私信,有偿解答!!!,147
148,上传知识库需要重启能不能修复一下,2023-05-15.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/363,上传知识库需要重启能不能修复一下,148
149,[BUG] pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple,2023-05-15.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/364,我的python是3.8.5的,149
150,pip install gradio 报错,2023-05-15.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/367,大佬帮我一下,150
151,[BUG] pip install gradio 一直卡不动,2023-05-15.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/369,![aba82742dd9d4d242181662eb5027a7](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/84606552/cd9600d9-f6e7-46b7-b1be-30ed8b99f76b),151
152,[BUG] 简洁阐述问题 / Concise description of the issue,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/370,初次加载本地知识库成功,但提问后,就无法重写加载本地知识库,152
153,[FEATURE] 简洁阐述功能 / Concise description of the feature,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/371,**功能描述 / Feature Description**,153
154,在windows上模型文件默认会安装到哪,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/372,-------------------------------------------------------------------------------,154
155,[FEATURE] 兼顾对话管理,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/374,如何在知识库检索的情况下,兼顾对话管理?,155
156,llm device: cpu embedding device: cpu,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/376,**问题描述 / Problem Description**,156
157,[FEATURE] 简洁阐述功能 /文本文件的知识点之间使用什么分隔符可以分割?,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/377,**功能描述 / Feature Description**,157
158,[BUG] 上传文件失败PermissionError: [WinError 32] 另一个程序正在使用此文件,进程无法访问。,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/379,**问题描述 / Problem Description**,158
159,[BUG] 执行python api.py 报错,2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/383,错误信息,159
160,model_kwargs extra fields not permitted (type=value_error.extra),2023-05-16.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/384,"大家好,请问这个有遇到的么,",160
161,[BUG] 简洁阐述问题 / Concise description of the issue,2023-05-17.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/385,执行的时候出现了ls1 = [ls[0]],161
162,[FEATURE] 性能优化,2023-05-17.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/388,**功能描述 / Feature Description**,162
163,"[BUG] Moss模型问答RuntimeError: probability tensor contains either inf, nan or element < 0",2023-05-17.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/390,**问题描述 / Problem Description**,163
164,有没有人知道v100GPU的32G显存会报错吗支持V100GPU吗,2023-05-17.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/392,**问题描述 / Problem Description**,164
165,针对于编码问题比如'gbk' codec can't encode character '\xab' in position 14: illegal multibyte sequence粗浅的解决方法,2023-05-17.03,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/397,**功能描述 / Feature Description**,165
166,Could not import sentence_transformers python package. Please install it with `pip install sentence_transformers`.,2023-05-18.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/400,**问题描述 / Problem Description**,166
167,支持模型问答与检索问答,2023-05-18.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/401,不同的query根据意图不一致回答也应该不一样。,167
168,文本分割的时候能不能按照txt文件的每行进行分割也就是按照换行符号\n进行分割,2023-05-18.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/403,下面的代码应该怎么修改?,168
169,local_doc_qa/local_doc_chat 接口响应是串行,2023-05-18.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/405,**问题描述 / Problem Description**,169
170,"为什么找到出处了,但是还是无法回答该问题?",2023-05-18.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/406,![图片](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/3349611/1fc81d61-2409-4330-9065-fdda1a27c86a),170
171,"请问下:知识库测试中的:添加单条内容,如果换成文本导入是是怎样的格式?我发现添加单条内容测试效果很好.",2023-05-18.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/412,"我发现在知识库测试中`添加单条内容`,并且勾选`禁止内容分句入库`,即使 `不开启上下文关联`的测试效果都非常好.",171
172,[BUG] 无法配置知识库,2023-05-18.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/413,**问题描述 / Problem Description**,172
173,[BUG] 部署在阿里PAI平台的EAS上访问页面是白屏,2023-05-19.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/414,**问题描述 / Problem Description**,173
174,API部署后调用/local_doc_qa/local_doc_chat 返回Knowledge base samples not found,2023-05-19.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/416,入参,174
175,[FEATURE] 上传word另存为的txt文件报 'ascii' codec can't decode byte 0xb9 in position 6: ordinal not in range(128),2023-05-20.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/421,上传word另存为的txt文件报,175
176,创建保存的知识库刷新后没有出来,这个知识库是永久保存的吗?可以连外部的 向量知识库吗?,2023-05-21.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/422,创建保存的知识库刷新后没有出来,这个知识库是永久保存的吗?可以连外部的 向量知识库吗?,176
177,[BUG] 用colab运行无法加载模型报错'NoneType' object has no attribute 'message_types_by_name',2023-05-21.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/423,**问题描述 / Problem Description**,177
178,请问是否需要用到向量数据库?以及什么时候需要用到向量数据库?,2023-05-21.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/424,目前用的是 text2vec 请问是否需要用到向量数据库?以及什么时候需要用到向量数据库?,178
179,huggingface模型引用问题,2023-05-22.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/427,它最近似乎变成了一个Error,179
180,你好加载本地txt文件出现这个killed错误TXT文件有100M左右大小。原因是谢谢。,2023-05-22.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/429,"<img width=""677"" alt=""929aca3b22b8cd74e997a87b61d241b"" src=""https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/24024522-c884-4170-b5cf-a498491bd8bc"">",180
181,想请问一下关于对本地知识的管理是如何管理例如通过http API接口添加数据 或者 删除某条数据,2023-05-22.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/430,例如通过http API接口添加、删除、修改 某条数据。,181
182,[FEATURE] 双栏pdf识别问题,2023-05-22.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/432,试了一下模型感觉对单栏pdf识别的准确性较高但是由于使用的基本是ocr的技术对一些双栏pdf论文识别出来有很多问题请问有什么办法改善吗,182
183,部署启动小问题,小弟初学求大佬解答,2023-05-22.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/433,1.python loader/image_loader.py时提示ModuleNotFoundError: No module named 'configs'但是跑python webui.py还是还能跑,183
184,能否支持检测到目录下文档有增加而去增量加载文档,不影响前台对话,其实就是支持读写分离。如果能支持查询哪些文档向量化了,删除过时文档等就更好了,谢谢。,2023-05-22.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/434,**功能描述 / Feature Description**,184
185,[BUG] 简洁阐述问题 / windows 下cuda错误请用https://github.com/Keith-Hon/bitsandbytes-windows.git,2023-05-22.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/435,pip install git+https://github.com/Keith-Hon/bitsandbytes-windows.git,185
186,"[BUG] from commit 33bbb47, Required library version not found: libbitsandbytes_cuda121_nocublaslt.so. Maybe you need to compile it from source?",2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/438,**问题描述 / Problem Description**,186
187,[BUG] 简洁阐述问题 / Concise description of the issue上传60m的txt文件报错显示超时请问这个能上传的文件大小有限制吗,2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/439,"ERROR 2023-05-23 11:13:09,627-1d: Timeout reached while detecting encoding for ./docs/GLM模型格式数据.txt",187
188,[BUG] TypeError: issubclass() arg 1 must be a class,2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/440,**问题描述**,188
189,"执行python3 webui.py后一直提示”模型未成功加载请到页面左上角""模型配置""选项卡中重新选择后点击""加载模型""按钮“",2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/441,**问题描述 / Problem Description**,189
190,是否能提供网页文档得导入支持,2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/444,现在很多都是在线文档作为协作得工具所以通过URL导入在线文档需求更大,190
191,[BUG] history 索引问题,2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/445,在比较对话框的history和模型chat function 中的history时 发现并不匹配,在传入 llm._call 时history用的索引是不是有点问题导致上一轮对话的内容并不输入给模型。,191
192,[BUG] moss_llm没有实现,2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/447,有些方法没支持如history_len,192
193,请问langchain-ChatGLM如何删除一条本地知识库的数据,2023-05-23.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/448,例如:用户刚刚提交了一条错误的数据到本地知识库中了,现在如何在本地知识库从找到,并且对此删除。,193
194,[BUG] 简洁阐述问题 / UnboundLocalError: local variable 'resp' referenced before assignment,2023-05-24.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/450,"在最新一版的代码中, 运行api.py 出现了以上错误UnboundLocalError: local variable 'resp' referenced before assignment 通过debug的方式观察到local_doc_qa.llm.generatorAnswer(prompt=question, history=history,streaming=True)可能不返回任何值。",194
195,请问有没有 PROMPT_TEMPLATE 能让模型不回答敏感问题,2023-05-24.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/452,## PROMPT_TEMPLATE问题,195
196,[BUG] 测试环境 Python 版本有误,2023-05-24.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/456,**问题描述 / Problem Description**,196
197,[BUG] webui 部署后样式不正确,2023-05-24.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/458,**问题描述 / Problem Description**,197
198,配置默认LLM模型的问题,2023-05-24.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/459,**问题描述 / Problem Description**,198
199,[FEATURE]是时候更新一下autoDL的镜像了,2023-05-24.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/460,如题跑了下autoDL的镜像发现是4.27号的git pull新版本的代码功能+老的依赖环境,各种奇奇怪怪的问题。,199
200,[BUG] tag:0.1.13 以cpu模式下想使用本地模型无法跑起来各种路径参数问题,2023-05-24.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/462,-------------------------------------------------------------------------------,200
201,[BUG] 有没有同学遇到过这个错加载本地txt文件出现这个killed错误TXT文件有100M左右大小。,2023-05-25.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/463,运行cli_demo.py。是本地的txt文件太大了吗100M左右。,201
202,API版本能否提供WEBSOCKET的流式接口,2023-05-25.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/464,webui 版本中采用了WS的流式输出整体感知反应很快,202
203,[BUG] 安装bug记录,2023-05-25.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/465,按照[install文档](https://github.com/imClumsyPanda/langchain-ChatGLM/blob/master/docs/INSTALL.md)安装的,,203
204,VUE的pnmp i执行失败的修复-用npm i命令即可,2023-05-25.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/466,感谢作者!非常棒的应用,用的很开心。,204
205,请教个问题有没有人知道cuda11.4是否支持???,2023-05-25.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/467,请教个问题有没有人知道cuda11.4是否支持???,205
206,请问有实现多轮问答中基于问题的搜索上下文关联么,2023-05-25.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/468,在基于知识库的多轮问答中,第一个问题讲述了一个主题,后续的问题描述没有包含这个主题的关键词,但又存在上下文的关联。如果用后续问题去搜索知识库有可能会搜索出无关的信息,从而导致大模型无法正确回答问题。请问这个项目要考虑这种情况吗?,206
207,[BUG] 内存不足的问题,2023-05-26.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/470,我用了本地的chatglm-6b-int4模型然后显示了内存不足win10+32G内存+1080ti11G一般需要多少内存才足够这个bug应该如何解决,207
208,[BUG] 纯内网环境安装pycocotools失败,2023-05-26.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/472,**问题描述 / Problem Description**,208
209,[BUG] webui.py 重新加载模型会导致 KeyError,2023-05-26.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/473,**问题描述 / Problem Description**,209
210,chatyuan无法使用,2023-05-26.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/475,**问题描述 / Problem Description**,210
211,[BUG] 文本分割模型AliTextSplitter存在bug会把“.”作为分割符,2023-05-26.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/476,"阿里达摩院的语义分割模型存在bug默认会把"".”作为分割符进行分割而不管上下文语义。是否还有其他分割符则未知。建议的修改方案:把“.”统一替换为其他字符,分割后再替换回来。或者添加其他分割模型。",211
212,[BUG] RuntimeError: Error in faiss::FileIOReader::FileIOReader(const char*) a,2023-05-27.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/479,**问题描述 / Problem Description**,212
213,[FEATURE] 安装为什么conda create要额外指定路径 用-p ,而不是默认的/envs下面,2023-05-28.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/481,##**功能描述 / Feature Description**,213
214,[小白求助] 通过Anaconda执行webui.py后无法打开web链接,2023-05-28.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/485,在执行webui.py命令后http://0.0.0.0:7860复制到浏览器后无法打开显示“无法访问此网站”。,214
215,[BUG] 使用 p-tuningv2后的模型重新加载报错,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/486,把p-tunningv2训练完后的相关文件放到了p-tunningv2文件夹下勾选使用p-tuningv2点重新加载模型控制台输错错误信息,215
216,[小白求助] 服务器上执行webui.py后在本地无法打开web链接,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/487,此项目执行在xxx.xx.xxx.xxx服务器上我在webui.py上的代码为 (demo,216
217,[FEATURE] 能不能支持VisualGLM-6B,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/488,**功能描述 / Feature Description**,217
218,你好问一下各位后端api部署的时候支持多用户同时问答吗,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/489,支持多用户的话,最多支持多少用户问答?根据硬件而定吧?,218
219,V100GPU显存占满而利用率却为0这是为什么,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/491,"<img width=""731"" alt=""de45fe2b6cb76fa091b6e8f76a3de60"" src=""https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/c32efd52-7dbf-4e9b-bd4d-0944d73d0b8b"">",219
220,[求助] 如果在公司内部搭建产品知识库使用INT-4模型200人规模需要配置多少显存的服务器,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/492,如题,计划给公司搭一个在线知识库。,220
221,你好请教个问题目前问答回复需要20秒左右如何提高速度V10032G服务器。,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/493,**问题描述 / Problem Description**,221
222,[FEATURE] 如何实现只匹配下文,而不要上文的结果,2023-05-29.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/494,在构建自己的知识库时主要采用问答对的形式那么也就是我需要的回答是在我的问题下面的内容但是目前设置了chunk_size的值以后匹配的是上下文的内容但我实际并不需要上文的。为了实现更完整的展示下面的答案我只能调大chunk_size的值但实际上上文的一半内容都是我不需要的。也就是扔了一半没用的东西给prompt在faiss.py中我也没找到这块的一些描述请问该如何进行修改呢,222
223,你好问一下我调用api.py部署为什么用ip加端口可以使用postman调用而改为域名使用postman无法调用,2023-05-30.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/497,![5ufBSWxLyF](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/70e2fbac-5699-48d0-b0d1-3dc84fd042c2),223
224,调用api.py中的stream_chat返回source_documents中出现中文乱码。,2023-05-30.04,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/498,-------------------------------------------------------------------------------,224
225,[BUG] 捉个虫api.py中的stream_chat解析json问题,2023-05-30.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/501,**问题描述 / Problem Description**,225
226,windows本地部署遇到了omp错误,2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/502,**问题描述 / Problem Description**,226
227,"[BUG] bug14 ,""POST /local_doc_qa/upload_file HTTP/1.1"" 422 Unprocessable Entity",2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/503,上传的文件报错返回错误api.py,227
228,你好请教个问题api.py部署的时候如何改为多线程调用谢谢,2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/505,目前的api.py脚本不支持多线程,228
229,你好请教一下。api.py部署的时候能不能提供给后端流失返回结果。,2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/507,curl -X 'POST' \,229
230,流式输出流式接口使用server-sent events技术。,2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/508,想这样一样https://blog.csdn.net/weixin_43228814/article/details/130063010,230
231,计划增加流式输出功能吗ChatGLM模型通过api方式调用响应时间慢怎么破Fastapi流式接口来解惑能快速提升响应速度,2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/509,**问题描述 / Problem Description**,231
232,[BUG] 知识库上传时发生ERROR (could not open xxx for reading: No such file or directory),2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/510,**问题描述 / Problem Description**,232
233,api.py脚本打算增加SSE流式输出吗,2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/511,curl调用的时候可以检测第一个字从而提升回复的体验,233
234,[BUG] 使用tornado实现webSocket可以多个客户端同时连接并且实现流式回复但是多个客户端同时使用答案就很乱是模型不支持多线程吗,2023-05-31.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/512,import asyncio,234
235,支持 chinese_alpaca_plus_lora 吗 基于llama的,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/514,支持 chinese_alpaca_plus_lora 吗 基于llama的https://github.com/ymcui/Chinese-LLaMA-Alpaca这个项目的,235
236,[BUG] 现在能读图片的pdf了但是文字的pdf反而读不了了什么情况,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/515,**问题描述 / Problem Description**,236
237,在推理的过程中卡住不动,进程无法正常结束,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/516,**问题描述 / Problem Description**,237
238,curl调用的时候从第二轮开始curl如何传参可以实现多轮对话,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/517,第一轮调用:,238
239,建议添加api.py部署后的日志管理功能,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/518,-------------------------------------------------------------------------------,239
240,有大佬知道怎么多线程部署api.py脚本吗,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/519,api.py部署后使用下面的请求时间较慢好像是单线程如何改为多线程部署api.py,240
241,[BUG] 上传文件到知识库 任何格式与内容都永远失败,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/520,上传知识库的时候传txt无法解析就算是穿content/sample里的样例txt也无法解析上传md、pdf等都无法加载会持续性等待等到了超过30分钟也不行。,241
242,关于prompt_template的问题,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/521,请问这段prompt_template是什么意思要怎么使用可以给一个具体模板参考下吗,242
243,[BUG] 简洁阐述问题 / Concise description of the issue,2023-06-01.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/522,**问题描述 / Problem Description**,243
244,"中文分词句号处理(关于表达金额之间的"".""",2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/523,建议处理12.6亿元的这样的分词最好别分成12 和6亿这样的需要放到一起,244
245,ImportError: cannot import name 'inference' from 'paddle',2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/526,在网上找了一圈有说升级paddle的我做了还是没有用有说安装paddlepaddle的我找了豆瓣的镜像源但安装报错cannot detect archive format,245
246,[BUG] webscoket 接口串行问题(/local_doc_qa/stream-chat/{knowledge_base_id},2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/527,**问题描述 / Problem Description**,246
247,[FEATURE] 刷新页面更新知识库列表,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/528,**功能描述以及改进方案**,247
248,[BUG] 使用ptuning微调模型后问答效果并不好,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/530,### 未调用ptuning,248
249,[BUG] 多轮对话效果不佳,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/532,在进行多轮对话的时候无论设置的history_len是多少效果都不好。事实上我将其设置成了最大值10但在对话中仍然无法实现多轮对话,249
250,"RuntimeError: MPS backend out of memory (MPS allocated: 18.00 GB, other allocations: 4.87 MB, max allowed: 18.13 GB)",2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/533,**问题描述**,250
251, 请大家重视这个issue真正使用肯定是多用户并发问答希望增加此功能,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/534,这得看你有多少显卡,251
252,在启动项目的时候如何使用到多张gpu啊,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/535,**在启动项目的时候如何使用到多张gpu啊**,252
253, 使用流式输出的时候curl调用的格式是什么,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/536,"app.websocket(""/local_doc_qa/stream-chat/{knowledge_base_id}"")(stream_chat)中的knowledge_base_id应该填什么",253
254,使用本地 vicuna-7b模型启动错误,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/538,环境: ubuntu 22.04 cuda 12.1 没有安装nccl使用rtx2080与m60显卡并行计算,254
255,为什么会不调用GPU直接调用CPU呢,2023-06-02.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/539,我的阿里云配置是16G显存用默认代码跑webui.py时提示,255
256,上传多个文件时会互相覆盖,2023-06-03.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/541,1、在同一个知识库中上传多个文件时会互相覆盖无法结合多个文档的知识有大佬知道怎么解决吗,256
257,[BUG] gcc不是内部或外部命令/LLM对话只能持续一轮,2023-06-03.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/542,No compiled kernel found.,257
258,以API模式启动项目却没有知识库的接口列表,2023-06-04.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/544,请问如何获取知识库的接口列表?如果没有需要自行编写的话,可不可以提供相关的获取方式,感谢,258
259,程序以API模式启动的时候如何才能让接口以stream模式被调用呢,2023-06-05.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/546,作者您好我在以API模式进行程序启动后我发现接口响应时间很长怎么样才能让接口以stream模式被调用呢我想实现像webui模式的回答那样,259
260,关于原文中表格转为文本后数据相关度问题。,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/547,原文中表格数据转换为文本,以 X-Y... 的格式每一行组织成一句话,但这样做后发现相关度较低,效果很差,有何好的方案吗?,260
261,启动后LLM和知识库问答模式均只有最后一轮记录,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/548,拉取最新代码,问答时,每次页面只显示最后一次问答记录,需要修改什么参数才可以保留历史记录?,261
262,提供system message配置以便于让回答不要超出知识库范围,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/549,**功能描述 / Feature Description**,262
263,[BUG] 使用p-tunningv2报错,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/551,按照readme的指示把p-tunningv2训练完后的文件放到了p-tunningv2文件夹下勾选使用p-tuningv2点重新加载模型控制台提示错误信息,263
264,[BUG] 智障,这么多问题,也好意思放出来,浪费时间,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/553,。。。,264
265,[FEATURE] 我看代码文件中有一个ali_text_splitter.py为什么不用他这个文本分割器了,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/554,我看代码文件中有一个ali_text_splitter.py为什么不用他这个文本分割器了,265
266,加载文档函数报错,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/557,"def load_file(filepath, sentence_size=SENTENCE_SIZE):",266
267,参考指引安装docker后运行cli_demo.py提示killed,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/558,root@b3d1bd08095c:/chatGLM# python3 cli_demo.py,267
268,注意:如果安装错误,注意这两个包的版本 wandb==0.11.0 protobuf==3.18.3,2023-06-06.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/559,Error1: 如果启动异常报错 `protobuf` 需要更新到 `protobuf==3.18.3 `,268
269,知识库对长文的知识相关度匹配不太理想有何优化方向,2023-06-07.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/563,我们可能录入一个文章有 1W 字,里面涉及这个文章主题的很多角度问题,我们针对他提问,他相关度匹配的内容和实际我们需要的答案相差很大怎么办。,269
270,使用stream-chat函数进行流式输出的时候能使用curl调用吗,2023-06-07.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/565,为什么下面这样调用会报错???,270
271,有大佬实践过 并行 或者 多线程 的部署方案吗?,2023-06-07.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/566,+1,271
272,多线程部署遇到问题?,2023-06-07.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/567,"<img width=""615"" alt=""3d87bf74f0cf1a4820cc9e46b245859"" src=""https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/8787570d-88bd-434e-aaa4-cb9276d1aa50"">",272
273,[BUG] 用fastchat加载vicuna-13b模型进行知识库的问答有token的限制错误,2023-06-07.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/569,当我开启fastchat的vicuna-13b的api服务然后config那里配置好(api本地测试过可以返回结果),然后知识库加载好之后(知识库大概有1000多个文档用chatGLM可以正常推理)进行问答时出现token超过限制就问了一句hello,273
274,现在的添加知识库,文件多了总是报错,也不知道自己加载了哪些文件,报错后也不知道是全部失败还是一部分成功;希望能有个加载指定文件夹作为知识库的功能,2023-06-07.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/574,**功能描述 / Feature Description**,274
275,[BUG] moss模型本地加载报错,2023-06-08.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/577,moss模型本地加载报错,275
276,加载本地moss模型报错Can't instantiate abstract class MOSSLLM with abstract methods _history_len,2023-06-08.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/578,(vicuna) ps@ps[13:56:20]:/data/chat/langchain-ChatGLM2/langchain-ChatGLM-0.1.13$ python webui.py --model-dir local_models --model moss --no-remote-model,276
277,[FEATURE] 能增加在前端页面控制prompt_template吗或是能支持前端页面选择使用哪个prompt,2023-06-08.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/579,目前只能在config里修改一个prompt想在多个不同场景切换比较麻烦,277
278,[BUG] streamlit ui的bug在增加知识库时会报错,2023-06-08.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/580,**问题描述 / Problem Description**,278
279,[FEATURE] webui/webui_st可以支持history吗目前仅能一次对话,2023-06-08.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/581,试了下webui和webui_st都不支持历史对话啊只能对话一次不能默认开启所有history吗,279
280,启动python cli_demo.py --model chatglm-6b-int4-qe报错,2023-06-09.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/585,下载好模型,和相关依赖环境,之间运行`python cli_demo.py --model chatglm-6b-int4-qe`报错了:,280
281,重新构建知识库报错,2023-06-09.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/586,**问题描述 / Problem Description**,281
282,[FEATURE] 能否屏蔽paddle我不需要OCR效果差依赖环境还很复杂,2023-06-09.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/587,希望能不依赖paddle,282
283,question :文档向量化这个可以自己手动实现么?,2023-06-09.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/589,现有公司级数据500G+,需要使用这个功能,请问如何手动实现这个向量化,然后并加载,283
284,view前端能进行流式的返回吗,2023-06-09.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/590,view前端能进行流式的返回吗,284
285,"[BUG] Load parallel cpu kernel failed, using default cpu kernel code",2023-06-11.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/594,**问题描述 / Problem Description**,285
286,[BUG] 简洁阐述问题 / Concise description of the issue,2023-06-11.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/595,**问题描述 / Problem Description**,286
287,我在上传本地知识库时提示KeyError: 'name'错误,本地知识库都是.txt文件文件数量大约是2000+。,2023-06-12.05,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/597,"<img width=""649"" alt=""KError"" src=""https://github.com/imClumsyPanda/langchain-ChatGLM/assets/59411575/1ecc8182-aeee-4a0a-bbc3-74c2f1373f2d"">",287
288,model_config.py中有vicuna-13b-hf模型的配置信息但是好像还是不可用,2023-06-12.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/600,@dongyihua543,288
289,"ImportError: Using SOCKS proxy, but the 'socksio' package is not installed. Make sure to install httpx using `pip install httpx[socks]`.",2023-06-12.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/605,应该代理问题,但是尝试了好多方法都解决不了,,289
290,[BUG] similarity_search_with_score_by_vector在找不到匹配的情况下出错,2023-06-12.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/607,在设置匹配阈值 VECTOR_SEARCH_SCORE_THRESHOLD 的情况下vectorstore会返回空此时上述处理函数会出错,290
291,[FEATURE] 请问如何搭建英文知识库呢,2023-06-12.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/609,**功能描述 / Feature Description**,291
292,谁有vicuna权重llama转换之后的,2023-06-13.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/611,**问题描述 / Problem Description**,292
293,[FEATURE] API能实现上传文件夹的功能么,2023-06-13.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/612,用户懒得全选所有的文件就想上传个文件夹请问下API能实现这个功能么,293
294,请问在多卡部署后,上传单个文件作为知识库,用的是单卡在生成向量还是多卡?,2023-06-13.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/614,目前我检测我本地多卡部署的,好像生成知识库向量的时候用的还是单卡,294
295,[BUG] python webui.py提示非法指令,2023-06-13.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/615,(/data/conda-langchain [root@chatglm langchain-ChatGLM]# python webui.py,295
296,知识库文件跨行切分问题,2023-06-13.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/616,我的知识库文件txt文件是一行一条知识用\n分行。,296
297,[FEATURE] bing搜索问答有流式的API么,2023-06-13.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/617,web端是有这个bing搜索回答但api接口没有发现大佬能给个提示么,297
298,希望出一个macos m2的安装教程,2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/620,mac m2安装模型加载成功了知识库文件也上传成功了但是一问答就会报错报错内容如下,298
299,为【出处】提供高亮显示,2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/621,具体出处里面,对相关的内容高亮显示,不包含前后文。,299
300,[BUG] CPU运行cli_demo.py不回答hang住,2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/622,没有GPU32G内存的ubuntu机器。,300
301,关于删除知识库里面的文档后LLM知识库对话的时候还是会返回该被删除文档的内容,2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/623,如题在vue前端成功执行删除知识库里面文档A.txt后未能也在faiss索引中也删除该文档LLM还是会返回这个A.txt的内容并且以A.txt为出处未能达到删除的效果,301
302,"[BUG] 调用知识库进行问答,显存会一直叠加",2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/625,"14G的显存,调用的chatglm-6b-int8模型,进行知识库问答时,最多问答四次就会爆显存了,观察了一下显存使用情况,每一次使用就会增加一次显存,请问这样是正常的吗?是否有什么配置需要开启可以解决这个问题?例如进行一次知识库问答清空上次问题的显存?",302
303,[BUG] web页面 重新构建数据库 失败,导致 原来的上传的数据库都没了,2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/626,web页面 重新构建数据库 失败,导致 原来的上传的数据库都没了,303
304,在CPU上运行webui.py报错Tensor on device cpu is not on the expected device meta!,2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/627,在CPU上运行python webui.py能启动但最后有RuntimeError: Tensor on device cpu is not on the expected device meta!,304
305,"OSError: [WinError 1114] 动态链接库(DLL)初始化例程失败。 Error loading ""E:\xxx\envs\langchain\lib\site-packages\torch\lib\caffe2_nvrtc.dll"" or one of its dependencies.哪位大佬知道如何解决吗?",2023-06-14.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/629,**问题描述 / Problem Description**,305
306,[BUG] WEBUI删除知识库文档会导致知识库问答失败,2023-06-15.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/632,如题,从知识库已有文件中选择要删除的文件,点击删除后,在问答框输入内容回车报错,306
307,更新后的版本中删除知识库中的文件再提问出现error错误,2023-06-15.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/634,针对更新版本,识别到一个问题,过程如下:,307
308,我配置好了环境,想要实现本地知识库的问答?可是它返回给我的,2023-06-15.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/637,没有总结,只有相关度的回复,但是我看演示里面表现的,回复是可以实现总结的,我去查询代码,308
309,[BUG] NPM run dev can not successfully start the VUE frontend,2023-06-15.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/638,**问题描述 / Problem Description**,309
310,[BUG] 简洁阐述问题 / Concise description of the issue,2023-06-15.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/639,**问题描述 / Problem Description**,310
311,提一个模型加载的bug我在截图中修复了你们有空可以看一下。,2023-06-15.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/642,![model_load_bug](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/59411575/4432adc4-ccdd-45d9-aafc-5f2d1963403b),311
312,[求助]关于设置embedding model路径的问题,2023-06-16.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/643,如题,我之前成功跑起来过一次,但因环境丢失重新配置 再运行webui就总是报错,312
313,Lora微调后的模型可以直接使用吗,2023-06-16.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/646,看model_config.py里是有USE_LORA这个参数的但是在cli_demo.py和webui.py这两个里面都没有用到实际测试下来模型没有微调的效果想问问现在这个功能实现了吗,313
314,write_check_file在tmp_files目录下生成的load_file.txt是否需要一直保留占用空间很大在建完索引后能否删除,2023-06-16.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/647,**功能描述 / Feature Description**,314
315,[BUG] /local_doc_qa/list_files?knowledge_base_id=test删除知识库bug,2023-06-16.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/649,1.新建test知识库并上传文件在vue前端完成并检查后端发现确实生成了test文件夹以及下面的content和vec_store,315
316,[BUG] vue webui无法加载知识库,2023-06-16.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/650,拉取了最新的代码分别运行了后端api和前端web点击知识库始终只能显示simple无法加载知识库,316
317,不能本地加载moss模型吗,2023-06-16.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/652,手动下载模型设置local_model_path路径依旧提示缺少文件该如何正确配置,317
318,macos m2 pro docker 安装失败,2023-06-17.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/654,macos m2 pro docker 安装失败,318
319, [BUG] mac m1 pro 运行提示 zsh: segmentation fault,2023-06-17.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/655,运行: python webui.py,319
320,安装 requirements 报错,2023-06-17.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/656,(langchainchatglm) D:\github\langchain-ChatGLM>pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple/,320
321,[BUG] AssertionError,2023-06-17.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/658,**问题描述 / Problem Description**,321
322,[FEATURE] 支持AMD win10 本地部署吗?,2023-06-18.06,https://github.com/imClumsyPanda/langchain-ChatGLM/issues/660,**功能描述 / Feature Description**,322
title file url detail id
0 效果如何优化 2023-04-04.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/14 如图所示,将该项目的README.md和该项目结合后,回答效果并不理想,请问可以从哪些方面进行优化 0
1 怎么让模型严格根据检索的数据进行回答,减少胡说八道的回答呢 2023-04-04.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/15 举个例子: 1
2 When I try to run the `python knowledge_based_chatglm.py`, I got this error in macOS(M1 Max, OS 13.2) 2023-04-07.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/32 ```python 2
3 萌新求教大佬怎么改成AMD显卡或者CPU? 2023-04-10.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/48 把.cuda()去掉就行 3
4 输出answer的时间很长,是否可以把文本向量化的部分提前做好存储起来? 2023-04-10.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/50 GPU:4090 24G显存 4
5 报错Use `repo_type` argument if needed. 2023-04-11.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/57 Traceback (most recent call last): 5
6 无法打开gradio的页面 2023-04-11.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/58 $ python webui.py 6
7 支持word,那word里面的图片正常显示吗? 2023-04-12.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/60 如题,刚刚从隔壁转过来的,想先了解下 7
8 detectron2 is not installed. Cannot use the hi_res partitioning strategy. Falling back to partitioning with the fast strategy. 2023-04-12.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/63 能够正常的跑起来,在加载content文件夹中的文件时,每加载一个文件都会提示: 8
9 cpu上运行webui,step3 asking时报错 2023-04-12.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/66 web运行,文件加载都正常,asking时报错 9
10 建议弄一个插件系统 2023-04-13.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/67 如题弄成stable-diffusion-webui那种能装插件,再开一个存储库给使用者或插件开发,存储或下载插件。 10
11 请教加载模型出错!? 2023-04-13.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/75 AttributeError: module 'transformers_modules.chatglm-6b.configuration_chatglm' has no attribute 'ChatGLMConfig 怎么解决呀 11
12 从本地知识检索内容的时候,是否可以设置相似度阈值,小于这个阈值的内容不返回,即使会小于设置的VECTOR_SEARCH_TOP_K参数呢?谢谢大佬 2023-04-13.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/76 比如 问一些 你好/你是谁 等一些跟本地知识库无关的问题 12
13 如何改成多卡推理? 2023-04-13.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/77 +1 13
14 能否弄个懒人包,可以一键体验? 2023-04-13.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/78 能否弄个懒人包,可以一键体验? 14
15 连续问问题会导致崩溃 2023-04-13.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/79 看上去不是爆内存的问题,连续问问题后,会出现如下报错 15
16 AttributeError: 'NoneType' object has no attribute 'as_retriever' 2023-04-14.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/86 环境:windows 11, anaconda/python 3.8 16
17 FileNotFoundError: Could not find module 'nvcuda.dll' (or one of its dependencies). Try using the full path with constructor syntax. 2023-04-14.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/87 请检查一下cuda或cudnn是否存在安装问题 17
18 加载txt文件失败? 2023-04-14.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/89 ![JppHrGOWFa](https://user-images.githubusercontent.com/109277248/232009383-bf7c46d1-a01e-4e0a-9de6-5b5ed3e36158.jpg) 18
19 NameError: name 'chatglm' is not defined 2023-04-14.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/90 This share link expires in 72 hours. For free permanent hosting and GPU upgrades (NEW!), check out Spaces: https://huggingface.co/spaces 19
20 打不开地址? 2023-04-14.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/91 报错数据如下: 20
21 加载md文件出错 2023-04-14.00 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/98 运行 webui.py后能访问页面,上传一个md文件后,日志中有错误。等待后能加载完成,提示可以提问了,但提问没反应,日志中有错误。 具体日志如下。 21
22 建议增加获取在线知识的能力 2023-04-15.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/101 建议增加获取在线知识的能力 22
23 txt 未能成功加载 2023-04-15.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/103 hinese. Creating a new one with MEAN pooling. 23
24 pdf加载失败 2023-04-15.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/105 e:\a.txt加载成功了,e:\a.pdf加载就失败,pdf文件里面前面几页是图片,后面都是文字,加载失败没有报更多错误,请问该怎么排查? 24
25 一直停在文本加载处 2023-04-15.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/108 一直停在文本加载处 25
26 File "/root/.cache/huggingface/modules/transformers_modules/chatglm-6b/modeling_chatglm.py", line 440, in forward new_tensor_shape = mixed_raw_layer.size()[:-1] + ( TypeError: torch.Size() takes an iterable of 'int' (item 2 is 'float') 2023-04-17.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/113 按照最新的代码,发现 26
27 后续会提供前后端分离的功能吗? 2023-04-17.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/114 类似这种https://github.com/lm-sys/FastChat/tree/main/fastchat/serve 27
28 安装依赖报错 2023-04-17.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/115 (test) C:\Users\linh\Desktop\langchain-ChatGLM-master>pip install -r requirements.txt 28
29 问特定问题会出现爆显存 2023-04-17.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/117 正常提问没问题。 29
30 Expecting value: line 1 column 1 (char 0) 2023-04-17.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/118 运行后 第一步加载配置一直报错: 30
31 embedding https://huggingface.co/GanymedeNil/text2vec-large-chinese/tree/main是免费的,效果比对openai的如何? 2023-04-17.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/119 ------------------------------------------------------------------------------- 31
32 这是什么错误,在Colab上运行的。 2023-04-17.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/120 libcuda.so.1: cannot open shared object file: No such file or directory 32
33 只想用自己的lora微调后的模型进行对话,不想加载任何本地文档,该如何调整? 2023-04-18.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/121 能出一个单独的教程吗 33
34 租的gpu,Running on local URL: http://0.0.0.0:7860 To create a public link, set `share=True` in `launch()`. 浏览器上访问不了??? 2023-04-18.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/122 (chatglm20230401) root@autodl-container-e82d11963c-10ece0d7:~/autodl-tmp/chatglm/langchain-ChatGLM-20230418# python3.9 webui.py 34
35 本地部署中的报错请教 2023-04-18.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/124 您好,在本地运行langchain-ChatGLM过程中,环境及依赖的包都已经满足条件,但是运行webui.py,报错如下(运行cli_demo.py报错类似),请问是哪里出了错呢?盼望您的回复,谢谢! 35
36 报错。The dtype of attention mask (torch.int64) is not bool 2023-04-18.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/131 The dtype of attention mask (torch.int64) is not bool 36
37 [求助] pip install -r requirements.txt 的时候出现以下报错。。。有大佬帮忙看看怎么搞么,下的release里面的包 2023-04-18.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/134 $ pip install -r requirements.txt 37
38 如何提升根据问题搜索到对应知识的准确率 2023-04-19.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/136 外链知识库最大的问题在于问题是短文本,知识是中长文本。如何根据问题精准的搜索到对应的知识是个最大的问题。这类本地化项目不像百度,由无数的网页,基本上每个问题都可以找到对应的页面。 38
39 是否可以增加向量召回的阈值设定,有些召回内容相关性太低,导致模型胡言乱语 2023-04-20.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/140 如题 39
40 输入长度问题 2023-04-20.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/141 感谢作者支持ptuning微调模型。 40
41 已有部署好的chatGLM-6b,如何通过接口接入? 2023-04-20.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/144 已有部署好的chatGLM-6b,如何通过接口接入,而不是重新加载一个模型; 41
42 执行web_demo.py后,显示Killed,就退出了,是不是配置不足呢? 2023-04-20.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/146 ![图片](https://user-images.githubusercontent.com/26102866/233256425-c7aab999-11d7-4de9-867b-23ef18d519e4.png) 42
43 执行python cli_demo1.py 2023-04-20.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/147 Traceback (most recent call last): 43
44 报错:ImportError: cannot import name 'GENERATION_CONFIG_NAME' from 'transformers.utils' 2023-04-20.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/149 (mychatGLM) PS D:\Users\admin3\zrh\langchain-ChatGLM> python cli_demo.py 44
45 上传文件并加载知识库时,会不停地出现临时文件 2023-04-21.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/153 环境:ubuntu 18.04 45
46 向知识库中添加文件后点击”上传文件并加载知识库“后Segmentation fault报错。 2023-04-23.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/161 运行服务后的提示如下: 46
47 langchain-serve 集成 2023-04-24.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/162 Hey 我是来自 [langchain-serve](https://github.com/jina-ai/langchain-serve) 的dev! 47
48 大佬们,wsl的ubuntu怎么配置用cuda加速,装了运行后发现是cpu在跑 2023-04-24.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/164 大佬们,wsl的ubuntu怎么配置用cuda加速,装了运行后发现是cpu在跑 48
49 在github codespaces docker运行出错 2023-04-24.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/165 docker run -d --restart=always --name chatglm -p 7860:7860 -v /www/wwwroot/code/langchain-ChatGLM:/chatGLM chatglm 49
50 有计划接入Moss模型嘛 2023-04-24.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/166 后续会开展测试,目前主要在优化langchain部分效果,如果有兴趣也欢迎提PR 50
51 怎么实现 API 部署? 2023-04-24.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/168 利用 fastapi 实现 API 部署方式,具体怎么实现,有方法说明吗? 51
52 'NoneType' object has no attribute 'message_types_by_name'报错 2023-04-24.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/169 _HISTOGRAMPROTO = DESCRIPTOR.message_types_by_name['HistogramProto'] 52
53 能否指定自己训练的text2vector模型? 2023-04-25.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/172 请问大佬: 53
54 关于项目支持的模型以及quantization_bit潜在的影响的问题 2023-04-26.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/176 作者您好~ 54
55 运行python3.9 api.py WARNING: You must pass the application as an import string to enable 'reload' or 'workers'. 2023-04-26.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/179 api.py文件最下面改成这样试试: 55
56 ValidationError: 1 validation error for HuggingFaceEmbeddings model_kwargs extra fields not permitted (type=value_error.extra) 2023-04-26.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/180 ValidationError: 1 validation error for HuggingFaceEmbeddings 56
57 如果没有检索到相关性比较高的,回答“我不知道” 2023-04-26.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/181 如果通过设计system_template,让模型在搜索到的文档都不太相关的情况下回答“我不知道” 57
58 请问如果不能联网,6B之类的文件从本地上传需要放到哪里 2023-04-26.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/182 感谢大佬的项目,很有启发~ 58
59 知识库问答--输入新的知识库名称是中文的话,会报error 2023-04-27.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/184 知识库问答--输入新的知识库名称是中文的话,会报error,选择要加载的知识库那里也不显示之前添加的知识库 59
60 现在能通过问题匹配的相似度值,来直接返回文档中的文段,而不经过模型吗?因为有些答案在文档中,模型自己回答,不能回答文档中的答案 2023-04-27.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/186 现在能通过问题匹配的相似度值,来直接返回文档中的文段,而不经过模型吗?因为有些答案在文档中,模型自己回答,不能回答文档中的答案。也就是说,提供向量检索回答+模型回答相结合的策略。如果相似度值高于一定数值,直接返回文档中的文本,没有高于就返回模型的回答或者不知道 60
61 TypeError: The type of ChatGLM.callback_manager differs from the new default value; if you wish to change the type of this field, please use a type annotation 2023-04-27.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/188 Mac 运行 python3 ./webui.py 报 TypeError: The type of ChatGLM.callback_manager differs from the new default value; if you wish to change the type of this field, please use a type annotation 61
62 Not Enough Memory 2023-04-27.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/190 运行命令行程序python cli_demo.py, 已经成功加载pdf文件, 报“DefaultCPUAllocator: not enough memory: you tried to allocate 458288380900 bytes”错误,请问哪里可以配置default memory 62
63 参与开发问题 2023-04-27.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/191 1.是否需要进专门的开发群 63
64 对话框中代码片段格式需改进 2023-04-27.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/192 最好能改进下输出代码片段的格式,目前输出的格式还不友好。 64
65 请问未来有可能支持belle吗 2023-04-28.01 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/195 如题,谢谢大佬 65
66 TypeError: cannot unpack non-iterable NoneType object 2023-04-28.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/200 When i tried to change the knowledge vector store through `init_knowledge_vector_store`, the error `TypeError: cannot unpack non-iterable NoneType object` came out. 66
67 生成结果 2023-04-28.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/202 你好,想问一下langchain+chatglm-6B,找到相似匹配的prompt,是直接返回prompt对应的答案信息,还是chatglm-6B在此基础上自己优化答案? 67
68 在win、ubuntu下都出现这个错误:attributeerror: 't5forconditionalgeneration' object has no attribute 'stream_chat' 2023-04-29.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/207 在win、ubuntu。下载完模型后,没办法修改代码以执行本地模型,每次都要重新输入路径; LLM 模型、Embedding 模型支持也都在官网下的,在其他项目(wenda)下可以使用 68
69 [FEATURE] knowledge_based_chatglm.py: renamed or missing? 2023-04-30.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/210 Not found. Was it renamed? Or, is it missing? How can I get it? 69
70 sudo apt-get install -y nvidia-container-toolkit-base执行报错 2023-05-01.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/211 **问题描述 / Problem Description** 70
71 效果不佳几乎答不上来 2023-05-01.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/212 提供了50条问答的docx文件 71
72 有没有可能新增一个基于chatglm api调用的方式构建langchain 2023-05-02.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/218 我有两台8G GPU/40G内存的服务器,一个台做成了chatglm的api ;想基于另外一台服务器部署langchain;网上好像没有类似的代码。 72
73 电脑是intel的集成显卡; 运行时告知我找不到nvcuda.dll,模型无法运行 2023-05-02.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/219 您好,我的电脑是intel的集成显卡,不过CPU是i5-11400 @ 2.60GHz ,内存64G; 73
74 根据langchain官方的文档和使用模式,是否可以改Faiss为Elasticsearch?会需要做哪些额外调整?求解 2023-05-03.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/221 本人新手小白,由于业务模式的原因(有一些自己的场景和优化),希望利用Elasticsearch做这个体系内部的检索机制,不知道是否可以替换,同时,还会涉及到哪些地方的改动?或者说可能会有哪些其他影响,希望作者和大佬们不吝赐教! 74
75 请问未来有可能支持t5吗 2023-05-04.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/224 请问可能支持基於t5的模型吗? 75
76 [BUG] 内存溢出 / torch.cuda.OutOfMemoryError: 2023-05-04.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/229 **问题描述 / Problem Description** 76
77 报错 No module named 'chatglm_llm' 2023-05-04.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/230 明明已经安装了包,却在python里吊不出来 77
78 能出一个api部署的描述文档吗 2023-05-04.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/233 **功能描述 / Feature Description** 78
79 使用docs/API.md 出错 2023-05-04.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/234 使用API.md文档2种方法,出错 79
80 加载pdf文档报错? 2023-05-05.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/238 ew one with MEAN pooling. 80
81 上传的本地知识文件后再次上传不能显示,只显示成功了一个,别的上传成功后再次刷新就没了 2023-05-05.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/239 您好,项目有很大启发,感谢~ 81
82 创建了新的虚拟环境,安装了相关包,并且自动下载了相关的模型,但是仍旧出现:OSError: Unable to load weights from pytorch checkpoint file for 2023-05-05.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/240 ![78ac8e663fdc312d0e9d78da95925c4](https://user-images.githubusercontent.com/34124260/236378728-9ea4424f-0f7f-4013-9d33-820b723de321.png) 82
83 [BUG] 数据加载不进来 2023-05-05.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/243 使用的.txt格式,utf-8编码,报以下错误 83
84 不能读取pdf 2023-05-05.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/244 请问是webui还是cli_demo 84
85 本地txt文件有500M,加载的时候很慢,如何提高速度? 2023-05-06.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/251 ![yayRzxSYHP](https://user-images.githubusercontent.com/109277248/236592902-f5ab338d-c1e9-43dc-ae16-9df2cd3c1378.jpg) 85
86 [BUG] gradio上传知识库后刷新之后 知识库就不见了 只有重启才能看到之前的上传的知识库 2023-05-06.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/253 gradio上传知识库后刷新之后 知识库就不见了 只有重启才能看到之前的上传的知识库 86
87 [FEATURE] 可以支持 OpenAI 的模型嘛?比如 GPT-3、GPT-3.5、GPT-4;embedding 增加 text-embedding-ada-002 2023-05-06.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/254 **功能描述 / Feature Description** 87
88 [FEATURE] 能否增加对于milvus向量数据库的支持 / Concise description of the feature 2023-05-06.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/256 **功能描述 / Feature Description** 88
89 CPU和GPU上跑,除了速度有区别,准确率效果回答上有区别吗? 2023-05-06.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/259 理论上没有区别 89
90 m1,请问在生成回答时怎么看是否使用了mps or cpu? 2023-05-06.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/260 m1,请问在生成回答时怎么看是否使用了mps or cpu? 90
91 知识库一刷新就没了 2023-05-07.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/263 知识库上传后刷新就没了 91
92 本地部署报没有模型 2023-05-07.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/267 建议在下载llm和embedding模型至本地后在configs/model_config中写入模型本地存储路径后再运行 92
93 [BUG] python3: can't open file 'webui.py': [Errno 2] No such file or directory 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/269 **问题描述 / Problem Description** 93
94 模块缺失提示 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/271 因为已有自己使用的docker环境,直接启动webui.py,提示 94
95 运行api.py后,执行curl -X POST "http://127.0.0.1:7861" 报错? 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/272 执行curl -X POST "http://127.0.0.1:7861" \ -H 'Content-Type: application/json' \ -d '{"prompt": "你好", "history": []}',报错怎么解决 95
96 [BUG] colab安装requirements提示protobuf版本问题? 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/273 pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 96
97 请问项目里面向量相似度使用了什么方法计算呀? 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/275 基本按照langchain里的FAISS.similarity_search_with_score_by_vector实现 97
98 [BUG] 安装detectron2后,pdf无法加载 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/276 **问题描述 / Problem Description** 98
99 [BUG] 使用ChatYuan-V2模型无法流式输出,会报错 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/277 一方面好像是ChatYuan本身不支持stream_chat,有人在clueai那边提了issue他们说还没开发,所以估计这个attribute调不起来;但是另一方面看报错好像是T5模型本身就不是decoder-only模型,所以不能流式输出吧(个人理解) 99
100 [BUG] 无法加载text2vec模型 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/278 **问题描述 / Problem Description** 100
101 请问能否增加网络搜索功能 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/281 请问能否增加网络搜索功能 101
102 [FEATURE] 结构化数据sql、excel、csv啥时会支持呐。 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/283 **功能描述 / Feature Description** 102
103 TypeError: ChatGLM._call() got an unexpected keyword argument 'stop' 2023-05-08.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/284 No sentence-transformers model found with name D:\DevProject\langchain-ChatGLM\GanymedeNil\text2vec-large-chinese. Creating a new one with MEAN pooling. 103
104 关于api.py的一些bug和设计逻辑问题? 2023-05-09.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/285 首先冒昧的问一下,这个api.py,开发者大佬们是在自己电脑上测试后确实没问题吗? 104
105 有没有租用的算力平台上,运行api.py后,浏览器http://localhost:7861/报错 2023-05-09.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/287 是不是租用的gpu平台上都会出现这个问题??? 105
106 请问一下项目中有用到文档段落切割方法吗? 2023-05-09.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/288 text_load中的文档切割方法用上了吗?在代码中看好像没有用到? 106
107 报错 raise ValueError(f"Knowledge base {knowledge_base_id} not found") ValueError: Knowledge base ./vector_store not found 2023-05-09.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/289 File "/root/autodl-tmp/chatglm/langchain-ChatGLM-master/api.py", line 183, in chat 107
108 能接入vicuna模型吗 2023-05-09.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/290 目前本地已经有了vicuna模型能直接接入吗? 108
109 [BUG] 提问公式相关问题大概率爆显存 2023-05-09.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/291 **问题描述 / Problem Description** 109
110 安装pycocotools失败,找了好多方法都不能解决。 2023-05-10.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/292 **问题描述 / Problem Description** 110
111 使用requirements安装,PyTorch安装的是CPU版本 2023-05-10.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/294 如题目,使用requirements安装,PyTorch安装的是CPU版本,运行程序的时候,也是使用CPU在工作。 111
112 能不能给一个毛坯服务器的部署教程 2023-05-10.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/298 “开发部署”你当成服务器的部署教程用就行了。 112
113 Error(s) in loading state_dict for ChatGLMForConditionalGeneration: 2023-05-10.02 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/299 运行中出现的问题,7860的端口页面显示不出来,求助。 113
116 114 ChatYuan-large-v2模型加载失败 2023-05-10.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/300 **实际结果 / Actual Result** 114
117 115 新增摘要功能 2023-05-10.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/303 你好,后续会考虑新增对长文本信息进行推理和语音理解功能吗?比如生成摘要 115
118 116 [BUG] pip install -r requirements.txt 出错 2023-05-10.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/304 pip install langchain -i https://pypi.org/simple 116
119 117 [BUG] 上传知识库文件报错 2023-05-10.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/305 ![19621e29eaa547d01213bee53d81e6a](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/84606552/7f6ceb46-e494-4b0e-939c-23b585a6d9d8) 117
120 118 [BUG] AssertionError: <class 'gradio.layouts.Accordion'> Component with id 41 not a valid input component. 2023-05-10.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/306 **问题描述 / Problem Description** 118
121 119 [BUG] CUDA out of memory with container deployment 2023-05-10.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/310 **问题描述 / Problem Description** 119
122 120 [FEATURE] 增加微调训练功能 2023-05-11.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/311 **功能描述 / Feature Description** 120
123 121 如何使用多卡部署,多个gpu 2023-05-11.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/315 机器上有多个gpu,如何全使用了 121
124 122 请问这个知识库问答,和chatglm的关系是什么 2023-05-11.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/319 这个知识库问答,哪部分关联到了chatglm,是不是没有这个chatglm,知识库问答也可单单拎出来 122
125 123 [BUG] 运行的时候报错ImportError: libcudnn.so.8: cannot open shared object file: No such file or directory 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/324 **问题描述 / Problem Description**raceback (most recent call last): 123
126 124 webui启动成功,但有报错 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/325 **问题描述 / Problem Description** 124
127 125 切换MOSS的时候报错 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/327 danshi但是发布的源码中, 125
128 126 vicuna模型是否能接入? 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/328 您好!关于MOSS模型和vicuna模型,都是AutoModelForCausalLM来加载模型的,但是稍作更改(模型路径这些)会报这个错误。这个错误的造成是什么 126
129 127 你好,请问一下在阿里云CPU服务器上跑可以吗?可以的话比较理想的cpu配置是什么? 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/330 你好,请问一下在阿里云CPU服务器上跑可以吗?可以的话比较理想的cpu配置是什么? 127
130 128 你好,请问8核32g的CPU可以跑多轮对话吗? 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/331 什么样的cpu配置比较好呢?我目前想部署CPU下的多轮对话? 128
131 129 [BUG] 聊天内容输入超过10000个字符系统出现错误 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/332 聊天内容输入超过10000个字符系统出现错误,如下图所示: 129
132 130 能增加API的多用户访问接口部署吗? 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/333 默认部署程序仅支持单用户访问,多用户则需要排队访问。测试过相关的几个Github多用户工程,但是其中一些仍然不满足要求。本节将系统介绍如何实现多用户同时访问ChatGLM的部署接口,包括http、websocket(流式输出,stream)和web页面等方式,主要目录如下所示。 130
133 131 多卡部署 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/334 用单机多卡或多机多卡,fastapi部署模型,怎样提高并发 131
134 132 WEBUI能否指定知识库目录? 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/335 **功能描述 / Feature Description** 132
135 133 [BUG] Cannot read properties of undefined (reading 'error') 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/336 **问题描述 / Problem Description** 133
136 134 [BUG] 1 validation error for HuggingFaceEmbeddings model_kwargs extra fields not permitted. 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/337 模型加载到 100% 后出现问题: 134
137 135 上传知识库需要重启能不能修复一下 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/338 挺严重的这个问题 135
138 136 [BUG] 4块v100卡爆显存,在LLM会话模式也一样 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/339 **问题描述 / Problem Description** 136
139 137 针对上传的文件配置不同的TextSpliter 2023-05-12.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/341 1. 目前的ChineseTextSpliter切分对英文尤其是代码文件不友好,而且限制固定长度;导致对话结果不如人意 137
140 138 [FEATURE] 未来可增加Bloom系列模型吗?根据甲骨易的测试,这系列中文评测效果不错 2023-05-13.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/346 **功能描述 / Feature Description** 138
141 139 [BUG] v0.1.12打包镜像后启动webui.py失败 / Concise description of the issue 2023-05-13.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/347 **问题描述 / Problem Description** 139
142 140 切换MOSS模型时报错 2023-05-13.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/349 昨天问了下,说是transformers版本不对,需要4.30.0,发现没有这个版本,今天更新到4.29.1,依旧报错,错误如下 140
143 141 [BUG] pdf文档加载失败 2023-05-13.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/350 **问题描述 / Problem Description** 141
144 142 建议可以在后期增强一波注释,这样也有助于更多人跟进提PR 2023-05-13.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/351 知道作者和团队在疯狂更新审查代码,只是建议后续稳定后可以把核心代码进行一些注释的补充,从而能帮助更多人了解各个模块作者的思路从而提出更好的优化。 142
145 143 [FEATURE] MOSS 量化版支援 2023-05-13.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/353 **功能描述 / Feature Description** 143
146 144 [BUG] moss模型无法加载 2023-05-13.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/356 **问题描述 / Problem Description** 144
147 145 [BUG] load_doc_qa.py 中的 load_file 函数有bug 2023-05-14.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/358 原函数为: 145
148 146 [FEATURE] API模式,知识库加载优化 2023-05-14.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/359 如题,当前版本,每次调用本地知识库接口,都将加载一次知识库,是否有更好的方式? 146
149 147 运行Python api.py脚本后端部署后,怎么使用curl命令调用? 2023-05-15.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/361 也就是说,我现在想做个对话机器人,想和公司的前后端联调?怎么与前后端相互调用呢?可私信,有偿解答!!! 147
150 148 上传知识库需要重启能不能修复一下 2023-05-15.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/363 上传知识库需要重启能不能修复一下 148
151 149 [BUG] pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple 2023-05-15.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/364 我的python是3.8.5的 149
152 150 pip install gradio 报错 2023-05-15.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/367 大佬帮我一下 150
153 151 [BUG] pip install gradio 一直卡不动 2023-05-15.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/369 ![aba82742dd9d4d242181662eb5027a7](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/84606552/cd9600d9-f6e7-46b7-b1be-30ed8b99f76b) 151
154 152 [BUG] 简洁阐述问题 / Concise description of the issue 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/370 初次加载本地知识库成功,但提问后,就无法重写加载本地知识库 152
155 153 [FEATURE] 简洁阐述功能 / Concise description of the feature 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/371 **功能描述 / Feature Description** 153
156 154 在windows上,模型文件默认会安装到哪 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/372 ------------------------------------------------------------------------------- 154
157 155 [FEATURE] 兼顾对话管理 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/374 如何在知识库检索的情况下,兼顾对话管理? 155
158 156 llm device: cpu embedding device: cpu 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/376 **问题描述 / Problem Description** 156
159 157 [FEATURE] 简洁阐述功能 /文本文件的知识点之间使用什么分隔符可以分割? 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/377 **功能描述 / Feature Description** 157
160 158 [BUG] 上传文件失败:PermissionError: [WinError 32] 另一个程序正在使用此文件,进程无法访问。 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/379 **问题描述 / Problem Description** 158
161 159 [BUG] 执行python api.py 报错 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/383 错误信息 159
162 160 model_kwargs extra fields not permitted (type=value_error.extra) 2023-05-16.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/384 大家好,请问这个有遇到的么,? 160
163 161 [BUG] 简洁阐述问题 / Concise description of the issue 2023-05-17.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/385 执行的时候出现了ls1 = [ls[0]] 161
164 162 [FEATURE] 性能优化 2023-05-17.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/388 **功能描述 / Feature Description** 162
165 163 [BUG] Moss模型问答,RuntimeError: probability tensor contains either inf, nan or element < 0 2023-05-17.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/390 **问题描述 / Problem Description** 163
166 164 有没有人知道v100GPU的32G显存,会报错吗?支持V100GPU吗? 2023-05-17.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/392 **问题描述 / Problem Description** 164
167 165 针对于编码问题比如'gbk' codec can't encode character '\xab' in position 14: illegal multibyte sequence粗浅的解决方法 2023-05-17.03 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/397 **功能描述 / Feature Description** 165
168 166 Could not import sentence_transformers python package. Please install it with `pip install sentence_transformers`. 2023-05-18.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/400 **问题描述 / Problem Description** 166
169 167 支持模型问答与检索问答 2023-05-18.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/401 不同的query,根据意图不一致,回答也应该不一样。 167
170 168 文本分割的时候,能不能按照txt文件的每行进行分割,也就是按照换行符号\n进行分割??? 2023-05-18.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/403 下面的代码应该怎么修改? 168
171 169 local_doc_qa/local_doc_chat 接口响应是串行 2023-05-18.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/405 **问题描述 / Problem Description** 169
172 170 为什么找到出处了,但是还是无法回答该问题? 2023-05-18.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/406 ![图片](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/3349611/1fc81d61-2409-4330-9065-fdda1a27c86a) 170
173 171 请问下:知识库测试中的:添加单条内容,如果换成文本导入是是怎样的格式?我发现添加单条内容测试效果很好. 2023-05-18.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/412 我发现在知识库测试中`添加单条内容`,并且勾选`禁止内容分句入库`,即使 `不开启上下文关联`的测试效果都非常好. 171
174 172 [BUG] 无法配置知识库 2023-05-18.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/413 **问题描述 / Problem Description** 172
175 173 [BUG] 部署在阿里PAI平台的EAS上访问页面是白屏 2023-05-19.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/414 **问题描述 / Problem Description** 173
176 174 API部署后调用/local_doc_qa/local_doc_chat 返回Knowledge base samples not found 2023-05-19.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/416 入参 174
177 175 [FEATURE] 上传word另存为的txt文件报 'ascii' codec can't decode byte 0xb9 in position 6: ordinal not in range(128) 2023-05-20.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/421 上传word另存为的txt文件报 175
178 176 创建保存的知识库刷新后没有出来,这个知识库是永久保存的吗?可以连外部的 向量知识库吗? 2023-05-21.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/422 创建保存的知识库刷新后没有出来,这个知识库是永久保存的吗?可以连外部的 向量知识库吗? 176
179 177 [BUG] 用colab运行,无法加载模型,报错:'NoneType' object has no attribute 'message_types_by_name' 2023-05-21.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/423 **问题描述 / Problem Description** 177
180 178 请问是否需要用到向量数据库?以及什么时候需要用到向量数据库? 2023-05-21.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/424 目前用的是 text2vec , 请问是否需要用到向量数据库?以及什么时候需要用到向量数据库? 178
181 179 huggingface模型引用问题 2023-05-22.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/427 它最近似乎变成了一个Error? 179
182 180 你好,加载本地txt文件出现这个killed错误,TXT文件有100M左右大小。原因是?谢谢。 2023-05-22.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/429 <img width="677" alt="929aca3b22b8cd74e997a87b61d241b" src="https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/24024522-c884-4170-b5cf-a498491bd8bc"> 180
183 181 想请问一下,关于对本地知识的管理是如何管理?例如:通过http API接口添加数据 或者 删除某条数据 2023-05-22.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/430 例如:通过http API接口添加、删除、修改 某条数据。 181
184 182 [FEATURE] 双栏pdf识别问题 2023-05-22.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/432 试了一下模型,感觉对单栏pdf识别的准确性较高,但是由于使用的基本是ocr的技术,对一些双栏pdf论文识别出来有很多问题,请问有什么办法改善吗? 182
185 183 部署启动小问题,小弟初学求大佬解答 2023-05-22.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/433 1.python loader/image_loader.py时,提示ModuleNotFoundError: No module named 'configs',但是跑python webui.py还是还能跑 183
186 184 能否支持检测到目录下文档有增加而去增量加载文档,不影响前台对话,其实就是支持读写分离。如果能支持查询哪些文档向量化了,删除过时文档等就更好了,谢谢。 2023-05-22.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/434 **功能描述 / Feature Description** 184
187 185 [BUG] 简洁阐述问题 / windows 下cuda错误,请用https://github.com/Keith-Hon/bitsandbytes-windows.git 2023-05-22.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/435 pip install git+https://github.com/Keith-Hon/bitsandbytes-windows.git 185
188 186 [BUG] from commit 33bbb47, Required library version not found: libbitsandbytes_cuda121_nocublaslt.so. Maybe you need to compile it from source? 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/438 **问题描述 / Problem Description** 186
189 187 [BUG] 简洁阐述问题 / Concise description of the issue上传60m的txt文件报错,显示超时,请问这个能上传的文件大小有限制吗 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/439 ERROR 2023-05-23 11:13:09,627-1d: Timeout reached while detecting encoding for ./docs/GLM模型格式数据.txt 187
190 188 [BUG] TypeError: issubclass() arg 1 must be a class 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/440 **问题描述** 188
191 189 执行python3 webui.py后,一直提示”模型未成功加载,请到页面左上角"模型配置"选项卡中重新选择后点击"加载模型"按钮“ 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/441 **问题描述 / Problem Description** 189
192 190 是否能提供网页文档得导入支持 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/444 现在很多都是在线文档作为协作得工具,所以通过URL导入在线文档需求更大 190
193 191 [BUG] history 索引问题 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/445 在比较对话框的history和模型chat function 中的history时, 发现并不匹配,在传入 llm._call 时,history用的索引是不是有点问题,导致上一轮对话的内容并不输入给模型。 191
194 192 [BUG] moss_llm没有实现 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/447 有些方法没支持,如history_len 192
195 193 请问langchain-ChatGLM如何删除一条本地知识库的数据? 2023-05-23.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/448 例如:用户刚刚提交了一条错误的数据到本地知识库中了,现在如何在本地知识库从找到,并且对此删除。 193
196 194 [BUG] 简洁阐述问题 / UnboundLocalError: local variable 'resp' referenced before assignment 2023-05-24.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/450 在最新一版的代码中, 运行api.py 出现了以上错误(UnboundLocalError: local variable 'resp' referenced before assignment), 通过debug的方式观察到local_doc_qa.llm.generatorAnswer(prompt=question, history=history,streaming=True)可能不返回任何值。 194
197 195 请问有没有 PROMPT_TEMPLATE 能让模型不回答敏感问题 2023-05-24.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/452 ## PROMPT_TEMPLATE问题 195
198 196 [BUG] 测试环境 Python 版本有误 2023-05-24.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/456 **问题描述 / Problem Description** 196
199 197 [BUG] webui 部署后样式不正确 2023-05-24.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/458 **问题描述 / Problem Description** 197
200 198 配置默认LLM模型的问题 2023-05-24.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/459 **问题描述 / Problem Description** 198
201 199 [FEATURE]是时候更新一下autoDL的镜像了 2023-05-24.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/460 如题,跑了下autoDL的镜像,发现是4.27号的,git pull新版本的代码功能+老的依赖环境,各种奇奇怪怪的问题。 199
202 200 [BUG] tag:0.1.13 以cpu模式下,想使用本地模型无法跑起来,各种路径参数问题 2023-05-24.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/462 ------------------------------------------------------------------------------- 200
203 201 [BUG] 有没有同学遇到过这个错!!!加载本地txt文件出现这个killed错误,TXT文件有100M左右大小。 2023-05-25.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/463 运行cli_demo.py。是本地的txt文件太大了吗?100M左右。 201
204 202 API版本能否提供WEBSOCKET的流式接口 2023-05-25.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/464 webui 版本中,采用了WS的流式输出,整体感知反应很快 202
205 203 [BUG] 安装bug记录 2023-05-25.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/465 按照[install文档](https://github.com/imClumsyPanda/langchain-ChatGLM/blob/master/docs/INSTALL.md)安装的, 203
206 204 VUE的pnmp i执行失败的修复-用npm i命令即可 2023-05-25.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/466 感谢作者!非常棒的应用,用的很开心。 204
207 205 请教个问题,有没有人知道cuda11.4是否支持??? 2023-05-25.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/467 请教个问题,有没有人知道cuda11.4是否支持??? 205
208 206 请问有实现多轮问答中基于问题的搜索上下文关联么 2023-05-25.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/468 在基于知识库的多轮问答中,第一个问题讲述了一个主题,后续的问题描述没有包含这个主题的关键词,但又存在上下文的关联。如果用后续问题去搜索知识库有可能会搜索出无关的信息,从而导致大模型无法正确回答问题。请问这个项目要考虑这种情况吗? 206
209 207 [BUG] 内存不足的问题 2023-05-26.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/470 我用了本地的chatglm-6b-int4模型,然后显示了内存不足(win10+32G内存+1080ti11G),一般需要多少内存才足够?这个bug应该如何解决? 207
210 208 [BUG] 纯内网环境安装pycocotools失败 2023-05-26.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/472 **问题描述 / Problem Description** 208
211 209 [BUG] webui.py 重新加载模型会导致 KeyError 2023-05-26.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/473 **问题描述 / Problem Description** 209
212 210 chatyuan无法使用 2023-05-26.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/475 **问题描述 / Problem Description** 210
213 211 [BUG] 文本分割模型AliTextSplitter存在bug,会把“.”作为分割符 2023-05-26.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/476 阿里达摩院的语义分割模型存在bug,默认会把".”作为分割符进行分割而不管上下文语义。是否还有其他分割符则未知。建议的修改方案:把“.”统一替换为其他字符,分割后再替换回来。或者添加其他分割模型。 211
214 212 [BUG] RuntimeError: Error in faiss::FileIOReader::FileIOReader(const char*) a 2023-05-27.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/479 **问题描述 / Problem Description** 212
215 213 [FEATURE] 安装,为什么conda create要额外指定路径 用-p ,而不是默认的/envs下面 2023-05-28.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/481 ##**功能描述 / Feature Description** 213
216 214 [小白求助] 通过Anaconda执行webui.py后,无法打开web链接 2023-05-28.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/485 在执行webui.py命令后,http://0.0.0.0:7860复制到浏览器后无法打开,显示“无法访问此网站”。 214
217 215 [BUG] 使用 p-tuningv2后的模型,重新加载报错 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/486 把p-tunningv2训练完后的相关文件放到了p-tunningv2文件夹下,勾选使用p-tuningv2点重新加载模型,控制台输错错误信息: 215
218 216 [小白求助] 服务器上执行webui.py后,在本地无法打开web链接 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/487 此项目执行在xxx.xx.xxx.xxx服务器上,我在webui.py上的代码为 (demo 216
219 217 [FEATURE] 能不能支持VisualGLM-6B 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/488 **功能描述 / Feature Description** 217
220 218 你好,问一下各位,后端api部署的时候,支持多用户同时问答吗??? 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/489 支持多用户的话,最多支持多少用户问答?根据硬件而定吧? 218
221 219 V100GPU显存占满,而利用率却为0,这是为什么? 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/491 <img width="731" alt="de45fe2b6cb76fa091b6e8f76a3de60" src="https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/c32efd52-7dbf-4e9b-bd4d-0944d73d0b8b"> 219
222 220 [求助] 如果在公司内部搭建产品知识库,使用INT-4模型,200人规模需要配置多少显存的服务器? 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/492 如题,计划给公司搭一个在线知识库。 220
223 221 你好,请教个问题,目前问答回复需要20秒左右,如何提高速度?V10032G服务器。 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/493 **问题描述 / Problem Description** 221
224 222 [FEATURE] 如何实现只匹配下文,而不要上文的结果 2023-05-29.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/494 在构建自己的知识库时,主要采用问答对的形式,那么也就是我需要的回答是在我的问题下面的内容,但是目前设置了chunk_size的值以后匹配的是上下文的内容,但我实际并不需要上文的。为了实现更完整的展示下面的答案,我只能调大chunk_size的值,但实际上上文的一半内容都是我不需要的。也就是扔了一半没用的东西给prompt,在faiss.py中我也没找到这块的一些描述,请问该如何进行修改呢? 222
225 223 你好,问一下,我调用api.py部署,为什么用ip加端口可以使用postman调用,而改为域名使用postman无法调用? 2023-05-30.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/497 ![5ufBSWxLyF](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/70e2fbac-5699-48d0-b0d1-3dc84fd042c2) 223
226 224 调用api.py中的stream_chat,返回source_documents中出现中文乱码。 2023-05-30.04 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/498 ------------------------------------------------------------------------------- 224
227 225 [BUG] 捉个虫,api.py中的stream_chat解析json问题 2023-05-30.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/501 **问题描述 / Problem Description** 225
228 226 windows本地部署遇到了omp错误 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/502 **问题描述 / Problem Description** 226
229 227 [BUG] bug14 ,"POST /local_doc_qa/upload_file HTTP/1.1" 422 Unprocessable Entity 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/503 上传的文件报错,返回错误,api.py 227
230 228 你好,请教个问题,api.py部署的时候,如何改为多线程调用?谢谢 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/505 目前的api.py脚本不支持多线程 228
231 229 你好,请教一下。api.py部署的时候,能不能提供给后端流失返回结果。 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/507 curl -X 'POST' \ 229
232 230 流式输出,流式接口,使用server-sent events技术。 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/508 想这样一样,https://blog.csdn.net/weixin_43228814/article/details/130063010 230
233 231 计划增加流式输出功能吗?ChatGLM模型通过api方式调用响应时间慢怎么破,Fastapi流式接口来解惑,能快速提升响应速度 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/509 **问题描述 / Problem Description** 231
234 232 [BUG] 知识库上传时发生ERROR (could not open xxx for reading: No such file or directory) 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/510 **问题描述 / Problem Description** 232
235 233 api.py脚本打算增加SSE流式输出吗? 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/511 curl调用的时候可以检测第一个字,从而提升回复的体验 233
236 234 [BUG] 使用tornado实现webSocket,可以多个客户端同时连接,并且实现流式回复,但是多个客户端同时使用,答案就很乱,是模型不支持多线程吗 2023-05-31.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/512 import asyncio 234
237 235 支持 chinese_alpaca_plus_lora 吗 基于llama的 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/514 支持 chinese_alpaca_plus_lora 吗 基于llama的,https://github.com/ymcui/Chinese-LLaMA-Alpaca这个项目的 235
238 236 [BUG] 现在能读图片的pdf了,但是文字的pdf反而读不了了,什么情况??? 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/515 **问题描述 / Problem Description** 236
239 237 在推理的过程中卡住不动,进程无法正常结束 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/516 **问题描述 / Problem Description** 237
240 238 curl调用的时候,从第二轮开始,curl如何传参可以实现多轮对话? 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/517 第一轮调用: 238
241 239 建议添加api.py部署后的日志管理功能? 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/518 ------------------------------------------------------------------------------- 239
242 240 有大佬知道,怎么多线程部署api.py脚本吗? 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/519 api.py部署后,使用下面的请求,时间较慢,好像是单线程,如何改为多线程部署api.py: 240
243 241 [BUG] 上传文件到知识库 任何格式与内容都永远失败 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/520 上传知识库的时候,传txt无法解析,就算是穿content/sample里的样例txt也无法解析,上传md、pdf等都无法加载,会持续性等待,等到了超过30分钟也不行。 241
244 242 关于prompt_template的问题 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/521 请问这段prompt_template是什么意思,要怎么使用?可以给一个具体模板参考下吗? 242
245 243 [BUG] 简洁阐述问题 / Concise description of the issue 2023-06-01.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/522 **问题描述 / Problem Description** 243
246 244 中文分词句号处理(关于表达金额之间的".") 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/523 建议处理12.6亿元的这样的分词,最好别分成12 和6亿这样的,需要放到一起 244
247 245 ImportError: cannot import name 'inference' from 'paddle' 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/526 在网上找了一圈,有说升级paddle的,我做了还是没有用,有说安装paddlepaddle的,我找了豆瓣的镜像源,但安装报错cannot detect archive format 245
248 246 [BUG] webscoket 接口串行问题(/local_doc_qa/stream-chat/{knowledge_base_id}) 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/527 **问题描述 / Problem Description** 246
249 247 [FEATURE] 刷新页面更新知识库列表 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/528 **功能描述以及改进方案** 247
250 248 [BUG] 使用ptuning微调模型后,问答效果并不好 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/530 ### 未调用ptuning 248
251 249 [BUG] 多轮对话效果不佳 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/532 在进行多轮对话的时候,无论设置的history_len是多少,效果都不好。事实上我将其设置成了最大值10,但在对话中,仍然无法实现多轮对话: 249
252 250 RuntimeError: MPS backend out of memory (MPS allocated: 18.00 GB, other allocations: 4.87 MB, max allowed: 18.13 GB) 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/533 **问题描述** 250
253 251 请大家重视这个issue!真正使用肯定是多用户并发问答,希望增加此功能!!! 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/534 这得看你有多少显卡 251
254 252 在启动项目的时候如何使用到多张gpu啊? 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/535 **在启动项目的时候如何使用到多张gpu啊?** 252
255 253 使用流式输出的时候,curl调用的格式是什么? 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/536 app.websocket("/local_doc_qa/stream-chat/{knowledge_base_id}")(stream_chat)中的knowledge_base_id应该填什么??? 253
256 254 使用本地 vicuna-7b模型启动错误 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/538 环境: ubuntu 22.04 cuda 12.1 没有安装nccl,使用rtx2080与m60显卡并行计算 254
257 255 为什么会不调用GPU直接调用CPU呢 2023-06-02.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/539 我的阿里云配置是16G显存,用默认代码跑webui.py时提示 255
258 256 上传多个文件时会互相覆盖 2023-06-03.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/541 1、在同一个知识库中上传多个文件时会互相覆盖,无法结合多个文档的知识,有大佬知道怎么解决吗? 256
259 257 [BUG] ‘gcc’不是内部或外部命令/LLM对话只能持续一轮 2023-06-03.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/542 No compiled kernel found. 257
260 258 以API模式启动项目却没有知识库的接口列表? 2023-06-04.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/544 请问如何获取知识库的接口列表?如果没有需要自行编写的话,可不可以提供相关的获取方式,感谢 258
261 259 程序以API模式启动的时候,如何才能让接口以stream模式被调用呢? 2023-06-05.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/546 作者您好,我在以API模式进行程序启动后,我发现接口响应时间很长,怎么样才能让接口以stream模式被调用呢?我想实现像webui模式的回答那样 259
262 260 关于原文中表格转为文本后数据相关度问题。 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/547 原文中表格数据转换为文本,以 (X-Y:值;...) 的格式每一行组织成一句话,但这样做后发现相关度较低,效果很差,有何好的方案吗? 260
263 261 启动后LLM和知识库问答模式均只有最后一轮记录 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/548 拉取最新代码,问答时,每次页面只显示最后一次问答记录,需要修改什么参数才可以保留历史记录? 261
264 262 提供system message配置,以便于让回答不要超出知识库范围 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/549 **功能描述 / Feature Description** 262
265 263 [BUG] 使用p-tunningv2报错 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/551 按照readme的指示把p-tunningv2训练完后的文件放到了p-tunningv2文件夹下,勾选使用p-tuningv2点重新加载模型,控制台提示错误信息: 263
266 264 [BUG] 智障,这么多问题,也好意思放出来,浪费时间 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/553 。。。 264
267 265 [FEATURE] 我看代码文件中有一个ali_text_splitter.py,为什么不用他这个文本分割器了? 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/554 我看代码文件中有一个ali_text_splitter.py,为什么不用他这个文本分割器了? 265
268 266 加载文档函数报错 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/557 def load_file(filepath, sentence_size=SENTENCE_SIZE): 266
269 267 参考指引安装docker后,运行cli_demo.py,提示killed 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/558 root@b3d1bd08095c:/chatGLM# python3 cli_demo.py 267
270 268 注意:如果安装错误,注意这两个包的版本 wandb==0.11.0 protobuf==3.18.3 2023-06-06.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/559 Error1: 如果启动异常报错 `protobuf` 需要更新到 `protobuf==3.18.3 ` 268
271 269 知识库对长文的知识相关度匹配不太理想有何优化方向 2023-06-07.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/563 我们可能录入一个文章有 1W 字,里面涉及这个文章主题的很多角度问题,我们针对他提问,他相关度匹配的内容和实际我们需要的答案相差很大怎么办。 269
272 270 使用stream-chat函数进行流式输出的时候,能使用curl调用吗? 2023-06-07.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/565 为什么下面这样调用会报错??? 270
273 271 有大佬实践过 并行 或者 多线程 的部署方案吗? 2023-06-07.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/566 +1 271
274 272 多线程部署遇到问题? 2023-06-07.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/567 <img width="615" alt="3d87bf74f0cf1a4820cc9e46b245859" src="https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/8787570d-88bd-434e-aaa4-cb9276d1aa50"> 272
275 273 [BUG] 用fastchat加载vicuna-13b模型进行知识库的问答有token的限制错误 2023-06-07.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/569 当我开启fastchat的vicuna-13b的api服务,然后config那里配置好(api本地测试过可以返回结果),然后知识库加载好之后(知识库大概有1000多个文档,用chatGLM可以正常推理),进行问答时出现token超过限制,就问了一句hello; 273
276 274 现在的添加知识库,文件多了总是报错,也不知道自己加载了哪些文件,报错后也不知道是全部失败还是一部分成功;希望能有个加载指定文件夹作为知识库的功能 2023-06-07.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/574 **功能描述 / Feature Description** 274
277 275 [BUG] moss模型本地加载报错 2023-06-08.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/577 moss模型本地加载报错: 275
278 276 加载本地moss模型报错Can't instantiate abstract class MOSSLLM with abstract methods _history_len 2023-06-08.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/578 (vicuna) ps@ps[13:56:20]:/data/chat/langchain-ChatGLM2/langchain-ChatGLM-0.1.13$ python webui.py --model-dir local_models --model moss --no-remote-model 276
279 277 [FEATURE] 能增加在前端页面控制prompt_template吗?或是能支持前端页面选择使用哪个prompt? 2023-06-08.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/579 目前只能在config里修改一个prompt,想在多个不同场景切换比较麻烦 277
280 278 [BUG] streamlit ui的bug,在增加知识库时会报错 2023-06-08.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/580 **问题描述 / Problem Description** 278
281 279 [FEATURE] webui/webui_st可以支持history吗?目前仅能一次对话 2023-06-08.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/581 试了下webui和webui_st都不支持历史对话啊,只能对话一次,不能默认开启所有history吗? 279
282 280 启动python cli_demo.py --model chatglm-6b-int4-qe报错 2023-06-09.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/585 下载好模型,和相关依赖环境,之间运行`python cli_demo.py --model chatglm-6b-int4-qe`报错了: 280
283 281 重新构建知识库报错 2023-06-09.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/586 **问题描述 / Problem Description** 281
284 282 [FEATURE] 能否屏蔽paddle,我不需要OCR,效果差依赖环境还很复杂 2023-06-09.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/587 希望能不依赖paddle 282
285 283 question :文档向量化这个可以自己手动实现么? 2023-06-09.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/589 现有公司级数据500G+,需要使用这个功能,请问如何手动实现这个向量化,然后并加载 283
286 284 view前端能进行流式的返回吗?? 2023-06-09.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/590 view前端能进行流式的返回吗?? 284
287 285 [BUG] Load parallel cpu kernel failed, using default cpu kernel code 2023-06-11.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/594 **问题描述 / Problem Description** 285
288 286 [BUG] 简洁阐述问题 / Concise description of the issue 2023-06-11.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/595 **问题描述 / Problem Description** 286
289 287 我在上传本地知识库时提示KeyError: 'name'错误,本地知识库都是.txt文件,文件数量大约是2000+。 2023-06-12.05 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/597 <img width="649" alt="KError" src="https://github.com/imClumsyPanda/langchain-ChatGLM/assets/59411575/1ecc8182-aeee-4a0a-bbc3-74c2f1373f2d"> 287
290 288 model_config.py中有vicuna-13b-hf模型的配置信息,但是好像还是不可用? 2023-06-12.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/600 @dongyihua543 288
291 289 ImportError: Using SOCKS proxy, but the 'socksio' package is not installed. Make sure to install httpx using `pip install httpx[socks]`. 2023-06-12.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/605 应该代理问题,但是尝试了好多方法都解决不了, 289
292 290 [BUG] similarity_search_with_score_by_vector在找不到匹配的情况下出错 2023-06-12.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/607 在设置匹配阈值 VECTOR_SEARCH_SCORE_THRESHOLD 的情况下,vectorstore会返回空,此时上述处理函数会出错 290
293 291 [FEATURE] 请问如何搭建英文知识库呢 2023-06-12.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/609 **功能描述 / Feature Description** 291
294 292 谁有vicuna权重?llama转换之后的 2023-06-13.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/611 **问题描述 / Problem Description** 292
295 293 [FEATURE] API能实现上传文件夹的功能么? 2023-06-13.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/612 用户懒得全选所有的文件,就想上传个文件夹,请问下API能实现这个功能么? 293
296 294 请问在多卡部署后,上传单个文件作为知识库,用的是单卡在生成向量还是多卡? 2023-06-13.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/614 目前我检测我本地多卡部署的,好像生成知识库向量的时候用的还是单卡 294
297 295 [BUG] python webui.py提示非法指令 2023-06-13.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/615 (/data/conda-langchain [root@chatglm langchain-ChatGLM]# python webui.py 295
298 296 知识库文件跨行切分问题 2023-06-13.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/616 我的知识库文件txt文件,是一行一条知识,用\n分行。 296
299 297 [FEATURE] bing搜索问答有流式的API么? 2023-06-13.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/617 web端是有这个bing搜索回答,但api接口没有发现,大佬能给个提示么? 297
300 298 希望出一个macos m2的安装教程 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/620 mac m2安装,模型加载成功了,知识库文件也上传成功了,但是一问答就会报错,报错内容如下 298
301 299 为【出处】提供高亮显示 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/621 具体出处里面,对相关的内容高亮显示,不包含前后文。 299
302 300 [BUG] CPU运行cli_demo.py,不回答,hang住 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/622 没有GPU;32G内存的ubuntu机器。 300
303 301 关于删除知识库里面的文档后,LLM知识库对话的时候还是会返回该被删除文档的内容 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/623 如题,在vue前端成功执行删除知识库里面文档A.txt后,未能也在faiss索引中也删除该文档,LLM还是会返回这个A.txt的内容,并且以A.txt为出处,未能达到删除的效果 301
304 302 [BUG] 调用知识库进行问答,显存会一直叠加 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/625 14G的显存,调用的chatglm-6b-int8模型,进行知识库问答时,最多问答四次就会爆显存了,观察了一下显存使用情况,每一次使用就会增加一次显存,请问这样是正常的吗?是否有什么配置需要开启可以解决这个问题?例如进行一次知识库问答清空上次问题的显存? 302
305 303 [BUG] web页面 重新构建数据库 失败,导致 原来的上传的数据库都没了 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/626 web页面 重新构建数据库 失败,导致 原来的上传的数据库都没了 303
306 304 在CPU上运行webui.py报错Tensor on device cpu is not on the expected device meta! 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/627 在CPU上运行python webui.py能启动,但最后有:RuntimeError: Tensor on device cpu is not on the expected device meta! 304
307 305 OSError: [WinError 1114] 动态链接库(DLL)初始化例程失败。 Error loading "E:\xxx\envs\langchain\lib\site-packages\torch\lib\caffe2_nvrtc.dll" or one of its dependencies.哪位大佬知道如何解决吗? 2023-06-14.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/629 **问题描述 / Problem Description** 305
308 306 [BUG] WEBUI删除知识库文档,会导致知识库问答失败 2023-06-15.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/632 如题,从知识库已有文件中选择要删除的文件,点击删除后,在问答框输入内容回车报错 306
309 307 更新后的版本中,删除知识库中的文件,再提问出现error错误 2023-06-15.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/634 针对更新版本,识别到一个问题,过程如下: 307
310 308 我配置好了环境,想要实现本地知识库的问答?可是它返回给我的 2023-06-15.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/637 没有总结,只有相关度的回复,但是我看演示里面表现的,回复是可以实现总结的,我去查询代码 308
311 309 [BUG] NPM run dev can not successfully start the VUE frontend 2023-06-15.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/638 **问题描述 / Problem Description** 309
312 310 [BUG] 简洁阐述问题 / Concise description of the issue 2023-06-15.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/639 **问题描述 / Problem Description** 310
313 311 提一个模型加载的bug,我在截图中修复了,你们有空可以看一下。 2023-06-15.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/642 ![model_load_bug](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/59411575/4432adc4-ccdd-45d9-aafc-5f2d1963403b) 311
314 312 [求助]关于设置embedding model路径的问题 2023-06-16.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/643 如题,我之前成功跑起来过一次,但因环境丢失重新配置 再运行webui就总是报错 312
315 313 Lora微调后的模型可以直接使用吗 2023-06-16.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/646 看model_config.py里是有USE_LORA这个参数的,但是在cli_demo.py和webui.py这两个里面都没有用到,实际测试下来模型没有微调的效果,想问问现在这个功能实现了吗 313
316 314 write_check_file在tmp_files目录下生成的load_file.txt是否需要一直保留,占用空间很大,在建完索引后能否删除 2023-06-16.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/647 **功能描述 / Feature Description** 314
317 315 [BUG] /local_doc_qa/list_files?knowledge_base_id=test删除知识库bug 2023-06-16.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/649 1.新建test知识库并上传文件(在vue前端完成并检查后端发现确实生成了test文件夹以及下面的content和vec_store 315
318 316 [BUG] vue webui无法加载知识库 2023-06-16.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/650 拉取了最新的代码,分别运行了后端api和前端web,点击知识库,始终只能显示simple,无法加载知识库 316
319 317 不能本地加载moss模型吗? 2023-06-16.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/652 手动下载模型设置local_model_path路径依旧提示缺少文件,该如何正确配置? 317
320 318 macos m2 pro docker 安装失败 2023-06-17.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/654 macos m2 pro docker 安装失败 318
321 319 [BUG] mac m1 pro 运行提示 zsh: segmentation fault 2023-06-17.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/655 运行: python webui.py 319
322 320 安装 requirements 报错 2023-06-17.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/656 (langchainchatglm) D:\github\langchain-ChatGLM>pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple/ 320
323 321 [BUG] AssertionError 2023-06-17.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/658 **问题描述 / Problem Description** 321
324 322 [FEATURE] 支持AMD win10 本地部署吗? 2023-06-18.06 https://github.com/imClumsyPanda/langchain-ChatGLM/issues/660 **功能描述 / Feature Description** 322
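The rows above are the CSV form of the issue index (fields: title, file, url, detail, id); the file that follows repeats the same records in JSON Lines form, one object per line. As a minimal sketch of how this index could be queried, assuming the JSONL file is saved locally as `issues.jsonl` (a hypothetical name; the repo's actual filename may differ):

```python
# Minimal sketch (not part of the repo): load the JSONL issue index and filter by keyword.
# Assumes the file is saved as "issues.jsonl" (hypothetical name) and each line is a
# JSON object with the keys title, file, url, detail, id, as in the records below.
import json

def load_issue_index(path):
    # One JSON object per non-empty line.
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

def search_issues(issues, keyword):
    # Return records whose title or detail mentions the keyword.
    return [item for item in issues if keyword in item["title"] or keyword in item["detail"]]

if __name__ == "__main__":
    issues = load_issue_index("issues.jsonl")
    for item in search_issues(issues, "知识库"):
        print(item["id"], item["title"], item["url"])
```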


@ -0,0 +1,323 @@
{"title": "效果如何优化", "file": "2023-04-04.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/14", "detail": "如图所示将该项目的README.md和该项目结合后回答效果并不理想请问可以从哪些方面进行优化", "id": 0}
{"title": "怎么让模型严格根据检索的数据进行回答,减少胡说八道的回答呢", "file": "2023-04-04.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/15", "detail": "举个例子:", "id": 1}
{"title": "When I try to run the `python knowledge_based_chatglm.py`, I got this error in macOS(M1 Max, OS 13.2)", "file": "2023-04-07.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/32", "detail": "```python", "id": 2}
{"title": "萌新求教大佬怎么改成AMD显卡或者CPU", "file": "2023-04-10.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/48", "detail": "把.cuda()去掉就行", "id": 3}
{"title": "输出answer的时间很长是否可以把文本向量化的部分提前做好存储起来", "file": "2023-04-10.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/50", "detail": "GPU4090 24G显存", "id": 4}
{"title": "报错Use `repo_type` argument if needed.", "file": "2023-04-11.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/57", "detail": "Traceback (most recent call last):", "id": 5}
{"title": "无法打开gradio的页面", "file": "2023-04-11.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/58", "detail": "$ python webui.py", "id": 6}
{"title": "支持word那word里面的图片正常显示吗", "file": "2023-04-12.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/60", "detail": "如题,刚刚从隔壁转过来的,想先了解下", "id": 7}
{"title": "detectron2 is not installed. Cannot use the hi_res partitioning strategy. Falling back to partitioning with the fast strategy.", "file": "2023-04-12.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/63", "detail": "能够正常的跑起来在加载content文件夹中的文件时每加载一个文件都会提示", "id": 8}
{"title": "cpu上运行webuistep3 asking时报错", "file": "2023-04-12.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/66", "detail": "web运行文件加载都正常asking时报错", "id": 9}
{"title": "建议弄一个插件系统", "file": "2023-04-13.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/67", "detail": "如题弄成stable-diffusion-webui那种能装插件再开一个存储库给使用者或插件开发存储或下载插件。", "id": 10}
{"title": "请教加载模型出错!?", "file": "2023-04-13.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/75", "detail": "AttributeError: module 'transformers_modules.chatglm-6b.configuration_chatglm' has no attribute 'ChatGLMConfig 怎么解决呀", "id": 11}
{"title": "从本地知识检索内容的时候是否可以设置相似度阈值小于这个阈值的内容不返回即使会小于设置的VECTOR_SEARCH_TOP_K参数呢谢谢大佬", "file": "2023-04-13.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/76", "detail": "比如 问一些 你好/你是谁 等一些跟本地知识库无关的问题", "id": 12}
{"title": "如何改成多卡推理?", "file": "2023-04-13.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/77", "detail": "+1", "id": 13}
{"title": "能否弄个懒人包,可以一键体验?", "file": "2023-04-13.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/78", "detail": "能否弄个懒人包,可以一键体验?", "id": 14}
{"title": "连续问问题会导致崩溃", "file": "2023-04-13.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/79", "detail": "看上去不是爆内存的问题,连续问问题后,会出现如下报错", "id": 15}
{"title": "AttributeError: 'NoneType' object has no attribute 'as_retriever'", "file": "2023-04-14.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/86", "detail": "环境windows 11, anaconda/python 3.8", "id": 16}
{"title": "FileNotFoundError: Could not find module 'nvcuda.dll' (or one of its dependencies). Try using the full path with constructor syntax.", "file": "2023-04-14.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/87", "detail": "请检查一下cuda或cudnn是否存在安装问题", "id": 17}
{"title": "加载txt文件失败", "file": "2023-04-14.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/89", "detail": "![JppHrGOWFa](https://user-images.githubusercontent.com/109277248/232009383-bf7c46d1-a01e-4e0a-9de6-5b5ed3e36158.jpg)", "id": 18}
{"title": "NameError: name 'chatglm' is not defined", "file": "2023-04-14.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/90", "detail": "This share link expires in 72 hours. For free permanent hosting and GPU upgrades (NEW!), check out Spaces: https://huggingface.co/spaces", "id": 19}
{"title": "打不开地址?", "file": "2023-04-14.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/91", "detail": "报错数据如下:", "id": 20}
{"title": "加载md文件出错", "file": "2023-04-14.00", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/98", "detail": "运行 webui.py后能访问页面上传一个md文件后日志中有错误。等待后能加载完成提示可以提问了但提问没反应日志中有错误。 具体日志如下。", "id": 21}
{"title": "建议增加获取在线知识的能力", "file": "2023-04-15.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/101", "detail": "建议增加获取在线知识的能力", "id": 22}
{"title": "txt 未能成功加载", "file": "2023-04-15.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/103", "detail": "hinese. Creating a new one with MEAN pooling.", "id": 23}
{"title": "pdf加载失败", "file": "2023-04-15.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/105", "detail": "e:\\a.txt加载成功了e:\\a.pdf加载就失败pdf文件里面前面几页是图片后面都是文字加载失败没有报更多错误请问该怎么排查", "id": 24}
{"title": "一直停在文本加载处", "file": "2023-04-15.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/108", "detail": "一直停在文本加载处", "id": 25}
{"title": " File \"/root/.cache/huggingface/modules/transformers_modules/chatglm-6b/modeling_chatglm.py\", line 440, in forward new_tensor_shape = mixed_raw_layer.size()[:-1] + ( TypeError: torch.Size() takes an iterable of 'int' (item 2 is 'float')", "file": "2023-04-17.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/113", "detail": "按照最新的代码,发现", "id": 26}
{"title": "后续会提供前后端分离的功能吗?", "file": "2023-04-17.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/114", "detail": "类似这种https://github.com/lm-sys/FastChat/tree/main/fastchat/serve", "id": 27}
{"title": "安装依赖报错", "file": "2023-04-17.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/115", "detail": "(test) C:\\Users\\linh\\Desktop\\langchain-ChatGLM-master>pip install -r requirements.txt", "id": 28}
{"title": "问特定问题会出现爆显存", "file": "2023-04-17.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/117", "detail": "正常提问没问题。", "id": 29}
{"title": "Expecting value: line 1 column 1 (char 0)", "file": "2023-04-17.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/118", "detail": "运行后 第一步加载配置一直报错:", "id": 30}
{"title": "embedding https://huggingface.co/GanymedeNil/text2vec-large-chinese/tree/main是免费的效果比对openai的如何", "file": "2023-04-17.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/119", "detail": "-------------------------------------------------------------------------------", "id": 31}
{"title": "这是什么错误在Colab上运行的。", "file": "2023-04-17.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/120", "detail": "libcuda.so.1: cannot open shared object file: No such file or directory", "id": 32}
{"title": "只想用自己的lora微调后的模型进行对话不想加载任何本地文档该如何调整", "file": "2023-04-18.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/121", "detail": "能出一个单独的教程吗", "id": 33}
{"title": "租的gpu,Running on local URL: http://0.0.0.0:7860 To create a public link, set `share=True` in `launch()`. 浏览器上访问不了???", "file": "2023-04-18.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/122", "detail": "(chatglm20230401) root@autodl-container-e82d11963c-10ece0d7:~/autodl-tmp/chatglm/langchain-ChatGLM-20230418# python3.9 webui.py", "id": 34}
{"title": "本地部署中的报错请教", "file": "2023-04-18.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/124", "detail": "您好在本地运行langchain-ChatGLM过程中环境及依赖的包都已经满足条件但是运行webui.py,报错如下运行cli_demo.py报错类似请问是哪里出了错呢盼望您的回复谢谢", "id": 35}
{"title": "报错。The dtype of attention mask (torch.int64) is not bool", "file": "2023-04-18.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/131", "detail": "The dtype of attention mask (torch.int64) is not bool", "id": 36}
{"title": "[求助] pip install -r requirements.txt 的时候出现以下报错。。。有大佬帮忙看看怎么搞么下的release里面的包", "file": "2023-04-18.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/134", "detail": "$ pip install -r requirements.txt", "id": 37}
{"title": "如何提升根据问题搜索到对应知识的准确率", "file": "2023-04-19.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/136", "detail": "外链知识库最大的问题在于问题是短文本,知识是中长文本。如何根据问题精准的搜索到对应的知识是个最大的问题。这类本地化项目不像百度,由无数的网页,基本上每个问题都可以找到对应的页面。", "id": 38}
{"title": "是否可以增加向量召回的阈值设定,有些召回内容相关性太低,导致模型胡言乱语", "file": "2023-04-20.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/140", "detail": "如题", "id": 39}
{"title": "输入长度问题", "file": "2023-04-20.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/141", "detail": "感谢作者支持ptuning微调模型。", "id": 40}
{"title": "已有部署好的chatGLM-6b如何通过接口接入", "file": "2023-04-20.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/144", "detail": "已有部署好的chatGLM-6b如何通过接口接入而不是重新加载一个模型", "id": 41}
{"title": "执行web_demo.py后显示Killed就退出了是不是配置不足呢", "file": "2023-04-20.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/146", "detail": "![图片](https://user-images.githubusercontent.com/26102866/233256425-c7aab999-11d7-4de9-867b-23ef18d519e4.png)", "id": 42}
{"title": "执行python cli_demo1.py", "file": "2023-04-20.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/147", "detail": "Traceback (most recent call last):", "id": 43}
{"title": "报错ImportError: cannot import name 'GENERATION_CONFIG_NAME' from 'transformers.utils'", "file": "2023-04-20.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/149", "detail": "(mychatGLM) PS D:\\Users\\admin3\\zrh\\langchain-ChatGLM> python cli_demo.py", "id": 44}
{"title": "上传文件并加载知识库时,会不停地出现临时文件", "file": "2023-04-21.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/153", "detail": "环境ubuntu 18.04", "id": 45}
{"title": "向知识库中添加文件后点击”上传文件并加载知识库“后Segmentation fault报错。", "file": "2023-04-23.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/161", "detail": "运行服务后的提示如下:", "id": 46}
{"title": "langchain-serve 集成", "file": "2023-04-24.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/162", "detail": "Hey 我是来自 [langchain-serve](https://github.com/jina-ai/langchain-serve) 的dev", "id": 47}
{"title": "大佬们wsl的ubuntu怎么配置用cuda加速装了运行后发现是cpu在跑", "file": "2023-04-24.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/164", "detail": "大佬们wsl的ubuntu怎么配置用cuda加速装了运行后发现是cpu在跑", "id": 48}
{"title": "在github codespaces docker运行出错", "file": "2023-04-24.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/165", "detail": "docker run -d --restart=always --name chatglm -p 7860:7860 -v /www/wwwroot/code/langchain-ChatGLM:/chatGLM chatglm", "id": 49}
{"title": "有计划接入Moss模型嘛", "file": "2023-04-24.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/166", "detail": "后续会开展测试目前主要在优化langchain部分效果如果有兴趣也欢迎提PR", "id": 50}
{"title": "怎么实现 API 部署?", "file": "2023-04-24.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/168", "detail": "利用 fastapi 实现 API 部署方式,具体怎么实现,有方法说明吗?", "id": 51}
{"title": " 'NoneType' object has no attribute 'message_types_by_name'报错", "file": "2023-04-24.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/169", "detail": "_HISTOGRAMPROTO = DESCRIPTOR.message_types_by_name['HistogramProto']", "id": 52}
{"title": "能否指定自己训练的text2vector模型", "file": "2023-04-25.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/172", "detail": "请问大佬:", "id": 53}
{"title": "关于项目支持的模型以及quantization_bit潜在的影响的问题", "file": "2023-04-26.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/176", "detail": "作者您好~", "id": 54}
{"title": "运行python3.9 api.py WARNING: You must pass the application as an import string to enable 'reload' or 'workers'.", "file": "2023-04-26.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/179", "detail": "api.py文件最下面改成这样试试", "id": 55}
{"title": "ValidationError: 1 validation error for HuggingFaceEmbeddings model_kwargs extra fields not permitted (type=value_error.extra)", "file": "2023-04-26.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/180", "detail": "ValidationError: 1 validation error for HuggingFaceEmbeddings", "id": 56}
{"title": "如果没有检索到相关性比较高的,回答“我不知道”", "file": "2023-04-26.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/181", "detail": "如果通过设计system_template让模型在搜索到的文档都不太相关的情况下回答“我不知道”", "id": 57}
{"title": "请问如果不能联网6B之类的文件从本地上传需要放到哪里", "file": "2023-04-26.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/182", "detail": "感谢大佬的项目,很有启发~", "id": 58}
{"title": "知识库问答--输入新的知识库名称是中文的话会报error", "file": "2023-04-27.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/184", "detail": "知识库问答--输入新的知识库名称是中文的话会报error选择要加载的知识库那里也不显示之前添加的知识库", "id": 59}
{"title": "现在能通过问题匹配的相似度值,来直接返回文档中的文段,而不经过模型吗?因为有些答案在文档中,模型自己回答,不能回答文档中的答案", "file": "2023-04-27.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/186", "detail": "现在能通过问题匹配的相似度值,来直接返回文档中的文段,而不经过模型吗?因为有些答案在文档中,模型自己回答,不能回答文档中的答案。也就是说,提供向量检索回答+模型回答相结合的策略。如果相似度值高于一定数值,直接返回文档中的文本,没有高于就返回模型的回答或者不知道", "id": 60}
{"title": "TypeError: The type of ChatGLM.callback_manager differs from the new default value; if you wish to change the type of this field, please use a type annotation", "file": "2023-04-27.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/188", "detail": "Mac 运行 python3 ./webui.py 报 TypeError: The type of ChatGLM.callback_manager differs from the new default value; if you wish to change the type of this field, please use a type annotation", "id": 61}
{"title": "Not Enough Memory", "file": "2023-04-27.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/190", "detail": "运行命令行程序python cli_demo.py 已经成功加载pdf文件, 报“DefaultCPUAllocator: not enough memory: you tried to allocate 458288380900 bytes”错误请问哪里可以配置default memory", "id": 62}
{"title": "参与开发问题", "file": "2023-04-27.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/191", "detail": "1.是否需要进专门的开发群", "id": 63}
{"title": "对话框中代码片段格式需改进", "file": "2023-04-27.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/192", "detail": "最好能改进下输出代码片段的格式,目前输出的格式还不友好。", "id": 64}
{"title": "请问未来有可能支持belle吗", "file": "2023-04-28.01", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/195", "detail": "如题,谢谢大佬", "id": 65}
{"title": "TypeError: cannot unpack non-iterable NoneType object", "file": "2023-04-28.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/200", "detail": "When i tried to change the knowledge vector store through `init_knowledge_vector_store`, the error `TypeError: cannot unpack non-iterable NoneType object` came out.", "id": 66}
{"title": "生成结果", "file": "2023-04-28.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/202", "detail": "你好想问一下langchain+chatglm-6B找到相似匹配的prompt是直接返回prompt对应的答案信息还是chatglm-6B在此基础上自己优化答案", "id": 67}
{"title": "在win、ubuntu下都出现这个错误attributeerror: 't5forconditionalgeneration' object has no attribute 'stream_chat'", "file": "2023-04-29.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/207", "detail": "在win、ubuntu。下载完模型后没办法修改代码以执行本地模型每次都要重新输入路径 LLM 模型、Embedding 模型支持也都在官网下的在其他项目wenda下可以使用", "id": 68}
{"title": "[FEATURE] knowledge_based_chatglm.py: renamed or missing?", "file": "2023-04-30.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/210", "detail": "Not found. Was it renamed? Or, is it missing? How can I get it?", "id": 69}
{"title": "sudo apt-get install -y nvidia-container-toolkit-base执行报错", "file": "2023-05-01.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/211", "detail": "**问题描述 / Problem Description**", "id": 70}
{"title": "效果不佳几乎答不上来", "file": "2023-05-01.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/212", "detail": "提供了50条问答的docx文件", "id": 71}
{"title": "有没有可能新增一个基于chatglm api调用的方式构建langchain", "file": "2023-05-02.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/218", "detail": "我有两台8G GPU/40G内存的服务器一个台做成了chatglm的api 想基于另外一台服务器部署langchain网上好像没有类似的代码。", "id": 72}
{"title": "电脑是intel的集成显卡 运行时告知我找不到nvcuda.dll模型无法运行", "file": "2023-05-02.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/219", "detail": "您好我的电脑是intel的集成显卡不过CPU是i5-11400 @ 2.60GHz 内存64G", "id": 73}
{"title": "根据langchain官方的文档和使用模式是否可以改Faiss为Elasticsearch会需要做哪些额外调整求解", "file": "2023-05-03.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/221", "detail": "本人新手小白由于业务模式的原因有一些自己的场景和优化希望利用Elasticsearch做这个体系内部的检索机制不知道是否可以替换同时还会涉及到哪些地方的改动或者说可能会有哪些其他影响希望作者和大佬们不吝赐教", "id": 74}
{"title": "请问未来有可能支持t5吗", "file": "2023-05-04.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/224", "detail": "请问可能支持基於t5的模型吗?", "id": 75}
{"title": "[BUG] 内存溢出 / torch.cuda.OutOfMemoryError:", "file": "2023-05-04.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/229", "detail": "**问题描述 / Problem Description**", "id": 76}
{"title": "报错 No module named 'chatglm_llm'", "file": "2023-05-04.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/230", "detail": "明明已经安装了包却在python里吊不出来", "id": 77}
{"title": "能出一个api部署的描述文档吗", "file": "2023-05-04.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/233", "detail": "**功能描述 / Feature Description**", "id": 78}
{"title": "使用docs/API.md 出错", "file": "2023-05-04.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/234", "detail": "使用API.md文档2种方法出错", "id": 79}
{"title": "加载pdf文档报错", "file": "2023-05-05.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/238", "detail": "ew one with MEAN pooling.", "id": 80}
{"title": "上传的本地知识文件后再次上传不能显示,只显示成功了一个,别的上传成功后再次刷新就没了", "file": "2023-05-05.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/239", "detail": "您好,项目有很大启发,感谢~", "id": 81}
{"title": "创建了新的虚拟环境安装了相关包并且自动下载了相关的模型但是仍旧出现OSError: Unable to load weights from pytorch checkpoint file for", "file": "2023-05-05.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/240", "detail": "![78ac8e663fdc312d0e9d78da95925c4](https://user-images.githubusercontent.com/34124260/236378728-9ea4424f-0f7f-4013-9d33-820b723de321.png)", "id": 82}
{"title": "[BUG] 数据加载不进来", "file": "2023-05-05.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/243", "detail": "使用的.txt格式utf-8编码报以下错误", "id": 83}
{"title": "不能读取pdf", "file": "2023-05-05.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/244", "detail": "请问是webui还是cli_demo", "id": 84}
{"title": "本地txt文件有500M加载的时候很慢如何提高速度", "file": "2023-05-06.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/251", "detail": "![yayRzxSYHP](https://user-images.githubusercontent.com/109277248/236592902-f5ab338d-c1e9-43dc-ae16-9df2cd3c1378.jpg)", "id": 85}
{"title": "[BUG] gradio上传知识库后刷新之后 知识库就不见了 只有重启才能看到之前的上传的知识库", "file": "2023-05-06.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/253", "detail": "gradio上传知识库后刷新之后 知识库就不见了 只有重启才能看到之前的上传的知识库", "id": 86}
{"title": "[FEATURE] 可以支持 OpenAI 的模型嘛?比如 GPT-3、GPT-3.5、GPT-4embedding 增加 text-embedding-ada-002", "file": "2023-05-06.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/254", "detail": "**功能描述 / Feature Description**", "id": 87}
{"title": "[FEATURE] 能否增加对于milvus向量数据库的支持 / Concise description of the feature", "file": "2023-05-06.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/256", "detail": "**功能描述 / Feature Description**", "id": 88}
{"title": "CPU和GPU上跑除了速度有区别准确率效果回答上有区别吗", "file": "2023-05-06.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/259", "detail": "理论上没有区别", "id": 89}
{"title": "m1请问在生成回答时怎么看是否使用了mps or cpu", "file": "2023-05-06.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/260", "detail": "m1请问在生成回答时怎么看是否使用了mps or cpu", "id": 90}
{"title": "知识库一刷新就没了", "file": "2023-05-07.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/263", "detail": "知识库上传后刷新就没了", "id": 91}
{"title": "本地部署报没有模型", "file": "2023-05-07.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/267", "detail": "建议在下载llm和embedding模型至本地后在configs/model_config中写入模型本地存储路径后再运行", "id": 92}
{"title": "[BUG] python3: can't open file 'webui.py': [Errno 2] No such file or directory", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/269", "detail": "**问题描述 / Problem Description**", "id": 93}
{"title": "模块缺失提示", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/271", "detail": "因为已有自己使用的docker环境直接启动webui.py提示", "id": 94}
{"title": "运行api.py后执行curl -X POST \"http://127.0.0.1:7861\" 报错?", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/272", "detail": "执行curl -X POST \"http://127.0.0.1:7861\" \\ -H 'Content-Type: application/json' \\ -d '{\"prompt\": \"你好\", \"history\": []}',报错怎么解决", "id": 95}
{"title": "[BUG] colab安装requirements提示protobuf版本问题", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/273", "detail": "pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.", "id": 96}
{"title": "请问项目里面向量相似度使用了什么方法计算呀?", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/275", "detail": "基本按照langchain里的FAISS.similarity_search_with_score_by_vector实现", "id": 97}
{"title": "[BUG] 安装detectron2后pdf无法加载", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/276", "detail": "**问题描述 / Problem Description**", "id": 98}
{"title": "[BUG] 使用ChatYuan-V2模型无法流式输出会报错", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/277", "detail": "一方面好像是ChatYuan本身不支持stream_chat有人在clueai那边提了issue他们说还没开发所以估计这个attribute调不起来但是另一方面看报错好像是T5模型本身就不是decoder-only模型所以不能流式输出吧个人理解", "id": 99}
{"title": "[BUG] 无法加载text2vec模型", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/278", "detail": "**问题描述 / Problem Description**", "id": 100}
{"title": "请问能否增加网络搜索功能", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/281", "detail": "请问能否增加网络搜索功能", "id": 101}
{"title": "[FEATURE] 结构化数据sql、excel、csv啥时会支持呐。", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/283", "detail": "**功能描述 / Feature Description**", "id": 102}
{"title": "TypeError: ChatGLM._call() got an unexpected keyword argument 'stop'", "file": "2023-05-08.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/284", "detail": "No sentence-transformers model found with name D:\\DevProject\\langchain-ChatGLM\\GanymedeNil\\text2vec-large-chinese. Creating a new one with MEAN pooling.", "id": 103}
{"title": "关于api.py的一些bug和设计逻辑问题", "file": "2023-05-09.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/285", "detail": "首先冒昧的问一下这个api.py开发者大佬们是在自己电脑上测试后确实没问题吗", "id": 104}
{"title": "有没有租用的算力平台上运行api.py后浏览器http://localhost:7861/报错", "file": "2023-05-09.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/287", "detail": "是不是租用的gpu平台上都会出现这个问题", "id": 105}
{"title": "请问一下项目中有用到文档段落切割方法吗?", "file": "2023-05-09.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/288", "detail": "text_load中的文档切割方法用上了吗在代码中看好像没有用到", "id": 106}
{"title": "报错 raise ValueError(f\"Knowledge base {knowledge_base_id} not found\") ValueError: Knowledge base ./vector_store not found", "file": "2023-05-09.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/289", "detail": "File \"/root/autodl-tmp/chatglm/langchain-ChatGLM-master/api.py\", line 183, in chat", "id": 107}
{"title": "能接入vicuna模型吗", "file": "2023-05-09.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/290", "detail": "目前本地已经有了vicuna模型能直接接入吗", "id": 108}
{"title": "[BUG] 提问公式相关问题大概率爆显存", "file": "2023-05-09.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/291", "detail": "**问题描述 / Problem Description**", "id": 109}
{"title": "安装pycocotools失败找了好多方法都不能解决。", "file": "2023-05-10.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/292", "detail": "**问题描述 / Problem Description**", "id": 110}
{"title": "使用requirements安装PyTorch安装的是CPU版本", "file": "2023-05-10.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/294", "detail": "如题目使用requirements安装PyTorch安装的是CPU版本运行程序的时候也是使用CPU在工作。", "id": 111}
{"title": "能不能给一个毛坯服务器的部署教程", "file": "2023-05-10.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/298", "detail": "“开发部署”你当成服务器的部署教程用就行了。", "id": 112}
{"title": " Error(s) in loading state_dict for ChatGLMForConditionalGeneration:", "file": "2023-05-10.02", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/299", "detail": "运行中出现的问题7860的端口页面显示不出来求助。", "id": 113}
{"title": "ChatYuan-large-v2模型加载失败", "file": "2023-05-10.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/300", "detail": "**实际结果 / Actual Result**", "id": 114}
{"title": "新增摘要功能", "file": "2023-05-10.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/303", "detail": "你好,后续会考虑新增对长文本信息进行推理和语音理解功能吗?比如生成摘要", "id": 115}
{"title": "[BUG] pip install -r requirements.txt 出错", "file": "2023-05-10.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/304", "detail": "pip install langchain -i https://pypi.org/simple", "id": 116}
{"title": "[BUG] 上传知识库文件报错", "file": "2023-05-10.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/305", "detail": "![19621e29eaa547d01213bee53d81e6a](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/84606552/7f6ceb46-e494-4b0e-939c-23b585a6d9d8)", "id": 117}
{"title": "[BUG] AssertionError: <class 'gradio.layouts.Accordion'> Component with id 41 not a valid input component.", "file": "2023-05-10.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/306", "detail": "**问题描述 / Problem Description**", "id": 118}
{"title": "[BUG] CUDA out of memory with container deployment", "file": "2023-05-10.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/310", "detail": "**问题描述 / Problem Description**", "id": 119}
{"title": "[FEATURE] 增加微调训练功能", "file": "2023-05-11.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/311", "detail": "**功能描述 / Feature Description**", "id": 120}
{"title": "如何使用多卡部署多个gpu", "file": "2023-05-11.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/315", "detail": "机器上有多个gpu,如何全使用了", "id": 121}
{"title": "请问这个知识库问答和chatglm的关系是什么", "file": "2023-05-11.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/319", "detail": "这个知识库问答哪部分关联到了chatglm是不是没有这个chatglm知识库问答也可单单拎出来", "id": 122}
{"title": "[BUG] 运行的时候报错ImportError: libcudnn.so.8: cannot open shared object file: No such file or directory", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/324", "detail": "**问题描述 / Problem Description**raceback (most recent call last):", "id": 123}
{"title": "webui启动成功但有报错", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/325", "detail": "**问题描述 / Problem Description**", "id": 124}
{"title": "切换MOSS的时候报错", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/327", "detail": "danshi但是发布的源码中", "id": 125}
{"title": "vicuna模型是否能接入", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/328", "detail": "您好关于MOSS模型和vicuna模型都是AutoModelForCausalLM来加载模型的但是稍作更改模型路径这些会报这个错误。这个错误的造成是什么", "id": 126}
{"title": "你好请问一下在阿里云CPU服务器上跑可以吗可以的话比较理想的cpu配置是什么", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/330", "detail": "你好请问一下在阿里云CPU服务器上跑可以吗可以的话比较理想的cpu配置是什么", "id": 127}
{"title": "你好请问8核32g的CPU可以跑多轮对话吗", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/331", "detail": "什么样的cpu配置比较好呢我目前想部署CPU下的多轮对话", "id": 128}
{"title": "[BUG] 聊天内容输入超过10000个字符系统出现错误", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/332", "detail": "聊天内容输入超过10000个字符系统出现错误如下图所示", "id": 129}
{"title": "能增加API的多用户访问接口部署吗", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/333", "detail": "默认部署程序仅支持单用户访问多用户则需要排队访问。测试过相关的几个Github多用户工程但是其中一些仍然不满足要求。本节将系统介绍如何实现多用户同时访问ChatGLM的部署接口包括http、websocket流式输出stream和web页面等方式主要目录如下所示。", "id": 130}
{"title": "多卡部署", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/334", "detail": "用单机多卡或多机多卡fastapi部署模型怎样提高并发", "id": 131}
{"title": "WEBUI能否指定知识库目录", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/335", "detail": "**功能描述 / Feature Description**", "id": 132}
{"title": "[BUG] Cannot read properties of undefined (reading 'error')", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/336", "detail": "**问题描述 / Problem Description**", "id": 133}
{"title": "[BUG] 1 validation error for HuggingFaceEmbeddings model_kwargs extra fields not permitted.", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/337", "detail": "模型加载到 100% 后出现问题:", "id": 134}
{"title": "上传知识库需要重启能不能修复一下", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/338", "detail": "挺严重的这个问题", "id": 135}
{"title": "[BUG] 4块v100卡爆显存在LLM会话模式也一样", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/339", "detail": "**问题描述 / Problem Description**", "id": 136}
{"title": "针对上传的文件配置不同的TextSpliter", "file": "2023-05-12.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/341", "detail": "1. 目前的ChineseTextSpliter切分对英文尤其是代码文件不友好而且限制固定长度导致对话结果不如人意", "id": 137}
{"title": "[FEATURE] 未来可增加Bloom系列模型吗根据甲骨易的测试这系列中文评测效果不错", "file": "2023-05-13.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/346", "detail": "**功能描述 / Feature Description**", "id": 138}
{"title": "[BUG] v0.1.12打包镜像后启动webui.py失败 / Concise description of the issue", "file": "2023-05-13.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/347", "detail": "**问题描述 / Problem Description**", "id": 139}
{"title": "切换MOSS模型时报错", "file": "2023-05-13.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/349", "detail": "昨天问了下说是transformers版本不对需要4.30.0发现没有这个版本今天更新到4.29.1,依旧报错,错误如下", "id": 140}
{"title": "[BUG] pdf文档加载失败", "file": "2023-05-13.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/350", "detail": "**问题描述 / Problem Description**", "id": 141}
{"title": "建议可以在后期增强一波注释这样也有助于更多人跟进提PR", "file": "2023-05-13.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/351", "detail": "知道作者和团队在疯狂更新审查代码,只是建议后续稳定后可以把核心代码进行一些注释的补充,从而能帮助更多人了解各个模块作者的思路从而提出更好的优化。", "id": 142}
{"title": "[FEATURE] MOSS 量化版支援", "file": "2023-05-13.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/353", "detail": "**功能描述 / Feature Description**", "id": 143}
{"title": "[BUG] moss模型无法加载", "file": "2023-05-13.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/356", "detail": "**问题描述 / Problem Description**", "id": 144}
{"title": "[BUG] load_doc_qa.py 中的 load_file 函数有bug", "file": "2023-05-14.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/358", "detail": "原函数为:", "id": 145}
{"title": "[FEATURE] API模式知识库加载优化", "file": "2023-05-14.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/359", "detail": "如题,当前版本,每次调用本地知识库接口,都将加载一次知识库,是否有更好的方式?", "id": 146}
{"title": "运行Python api.py脚本后端部署后怎么使用curl命令调用", "file": "2023-05-15.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/361", "detail": "也就是说,我现在想做个对话机器人,想和公司的前后端联调?怎么与前后端相互调用呢?可私信,有偿解答!!!", "id": 147}
{"title": "上传知识库需要重启能不能修复一下", "file": "2023-05-15.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/363", "detail": "上传知识库需要重启能不能修复一下", "id": 148}
{"title": "[BUG] pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple", "file": "2023-05-15.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/364", "detail": "我的python是3.8.5的", "id": 149}
{"title": "pip install gradio 报错", "file": "2023-05-15.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/367", "detail": "大佬帮我一下", "id": 150}
{"title": "[BUG] pip install gradio 一直卡不动", "file": "2023-05-15.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/369", "detail": "![aba82742dd9d4d242181662eb5027a7](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/84606552/cd9600d9-f6e7-46b7-b1be-30ed8b99f76b)", "id": 151}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/370", "detail": "初次加载本地知识库成功,但提问后,就无法重写加载本地知识库", "id": 152}
{"title": "[FEATURE] 简洁阐述功能 / Concise description of the feature", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/371", "detail": "**功能描述 / Feature Description**", "id": 153}
{"title": "在windows上模型文件默认会安装到哪", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/372", "detail": "-------------------------------------------------------------------------------", "id": 154}
{"title": "[FEATURE] 兼顾对话管理", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/374", "detail": "如何在知识库检索的情况下,兼顾对话管理?", "id": 155}
{"title": "llm device: cpu embedding device: cpu", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/376", "detail": "**问题描述 / Problem Description**", "id": 156}
{"title": "[FEATURE] 简洁阐述功能 /文本文件的知识点之间使用什么分隔符可以分割?", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/377", "detail": "**功能描述 / Feature Description**", "id": 157}
{"title": "[BUG] 上传文件失败PermissionError: [WinError 32] 另一个程序正在使用此文件,进程无法访问。", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/379", "detail": "**问题描述 / Problem Description**", "id": 158}
{"title": "[BUG] 执行python api.py 报错", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/383", "detail": "错误信息", "id": 159}
{"title": "model_kwargs extra fields not permitted (type=value_error.extra)", "file": "2023-05-16.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/384", "detail": "大家好,请问这个有遇到的么,", "id": 160}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-05-17.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/385", "detail": "执行的时候出现了ls1 = [ls[0]]", "id": 161}
{"title": "[FEATURE] 性能优化", "file": "2023-05-17.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/388", "detail": "**功能描述 / Feature Description**", "id": 162}
{"title": "[BUG] Moss模型问答RuntimeError: probability tensor contains either inf, nan or element < 0", "file": "2023-05-17.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/390", "detail": "**问题描述 / Problem Description**", "id": 163}
{"title": "有没有人知道v100GPU的32G显存会报错吗支持V100GPU吗", "file": "2023-05-17.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/392", "detail": "**问题描述 / Problem Description**", "id": 164}
{"title": "针对于编码问题比如'gbk' codec can't encode character '\\xab' in position 14: illegal multibyte sequence粗浅的解决方法", "file": "2023-05-17.03", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/397", "detail": "**功能描述 / Feature Description**", "id": 165}
{"title": "Could not import sentence_transformers python package. Please install it with `pip install sentence_transformers`.", "file": "2023-05-18.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/400", "detail": "**问题描述 / Problem Description**", "id": 166}
{"title": "支持模型问答与检索问答", "file": "2023-05-18.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/401", "detail": "不同的query根据意图不一致回答也应该不一样。", "id": 167}
{"title": "文本分割的时候能不能按照txt文件的每行进行分割也就是按照换行符号\\n进行分割", "file": "2023-05-18.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/403", "detail": "下面的代码应该怎么修改?", "id": 168}
{"title": "local_doc_qa/local_doc_chat 接口响应是串行", "file": "2023-05-18.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/405", "detail": "**问题描述 / Problem Description**", "id": 169}
{"title": "为什么找到出处了,但是还是无法回答该问题?", "file": "2023-05-18.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/406", "detail": "![图片](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/3349611/1fc81d61-2409-4330-9065-fdda1a27c86a)", "id": 170}
{"title": "请问下:知识库测试中的:添加单条内容,如果换成文本导入是是怎样的格式?我发现添加单条内容测试效果很好.", "file": "2023-05-18.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/412", "detail": "我发现在知识库测试中`添加单条内容`,并且勾选`禁止内容分句入库`,即使 `不开启上下文关联`的测试效果都非常好.", "id": 171}
{"title": "[BUG] 无法配置知识库", "file": "2023-05-18.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/413", "detail": "**问题描述 / Problem Description**", "id": 172}
{"title": "[BUG] 部署在阿里PAI平台的EAS上访问页面是白屏", "file": "2023-05-19.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/414", "detail": "**问题描述 / Problem Description**", "id": 173}
{"title": "API部署后调用/local_doc_qa/local_doc_chat 返回Knowledge base samples not found", "file": "2023-05-19.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/416", "detail": "入参", "id": 174}
{"title": "[FEATURE] 上传word另存为的txt文件报 'ascii' codec can't decode byte 0xb9 in position 6: ordinal not in range(128)", "file": "2023-05-20.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/421", "detail": "上传word另存为的txt文件报", "id": 175}
{"title": "创建保存的知识库刷新后没有出来,这个知识库是永久保存的吗?可以连外部的 向量知识库吗?", "file": "2023-05-21.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/422", "detail": "创建保存的知识库刷新后没有出来,这个知识库是永久保存的吗?可以连外部的 向量知识库吗?", "id": 176}
{"title": "[BUG] 用colab运行无法加载模型报错'NoneType' object has no attribute 'message_types_by_name'", "file": "2023-05-21.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/423", "detail": "**问题描述 / Problem Description**", "id": 177}
{"title": "请问是否需要用到向量数据库?以及什么时候需要用到向量数据库?", "file": "2023-05-21.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/424", "detail": "目前用的是 text2vec 请问是否需要用到向量数据库?以及什么时候需要用到向量数据库?", "id": 178}
{"title": "huggingface模型引用问题", "file": "2023-05-22.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/427", "detail": "它最近似乎变成了一个Error", "id": 179}
{"title": "你好加载本地txt文件出现这个killed错误TXT文件有100M左右大小。原因是谢谢。", "file": "2023-05-22.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/429", "detail": "<img width=\"677\" alt=\"929aca3b22b8cd74e997a87b61d241b\" src=\"https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/24024522-c884-4170-b5cf-a498491bd8bc\">", "id": 180}
{"title": "想请问一下关于对本地知识的管理是如何管理例如通过http API接口添加数据 或者 删除某条数据", "file": "2023-05-22.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/430", "detail": "例如通过http API接口添加、删除、修改 某条数据。", "id": 181}
{"title": "[FEATURE] 双栏pdf识别问题", "file": "2023-05-22.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/432", "detail": "试了一下模型感觉对单栏pdf识别的准确性较高但是由于使用的基本是ocr的技术对一些双栏pdf论文识别出来有很多问题请问有什么办法改善吗", "id": 182}
{"title": "部署启动小问题,小弟初学求大佬解答", "file": "2023-05-22.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/433", "detail": "1.python loader/image_loader.py时提示ModuleNotFoundError: No module named 'configs'但是跑python webui.py还是还能跑", "id": 183}
{"title": "能否支持检测到目录下文档有增加而去增量加载文档,不影响前台对话,其实就是支持读写分离。如果能支持查询哪些文档向量化了,删除过时文档等就更好了,谢谢。", "file": "2023-05-22.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/434", "detail": "**功能描述 / Feature Description**", "id": 184}
{"title": "[BUG] 简洁阐述问题 / windows 下cuda错误请用https://github.com/Keith-Hon/bitsandbytes-windows.git", "file": "2023-05-22.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/435", "detail": "pip install git+https://github.com/Keith-Hon/bitsandbytes-windows.git", "id": 185}
{"title": "[BUG] from commit 33bbb47, Required library version not found: libbitsandbytes_cuda121_nocublaslt.so. Maybe you need to compile it from source?", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/438", "detail": "**问题描述 / Problem Description**", "id": 186}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue上传60m的txt文件报错显示超时请问这个能上传的文件大小有限制吗", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/439", "detail": "ERROR 2023-05-23 11:13:09,627-1d: Timeout reached while detecting encoding for ./docs/GLM模型格式数据.txt", "id": 187}
{"title": "[BUG] TypeError: issubclass() arg 1 must be a class", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/440", "detail": "**问题描述**", "id": 188}
{"title": "执行python3 webui.py后一直提示”模型未成功加载请到页面左上角\"模型配置\"选项卡中重新选择后点击\"加载模型\"按钮“", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/441", "detail": "**问题描述 / Problem Description**", "id": 189}
{"title": "是否能提供网页文档得导入支持", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/444", "detail": "现在很多都是在线文档作为协作得工具所以通过URL导入在线文档需求更大", "id": 190}
{"title": "[BUG] history 索引问题", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/445", "detail": "在比较对话框的history和模型chat function 中的history时 发现并不匹配,在传入 llm._call 时history用的索引是不是有点问题导致上一轮对话的内容并不输入给模型。", "id": 191}
{"title": "[BUG] moss_llm没有实现", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/447", "detail": "有些方法没支持如history_len", "id": 192}
{"title": "请问langchain-ChatGLM如何删除一条本地知识库的数据", "file": "2023-05-23.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/448", "detail": "例如:用户刚刚提交了一条错误的数据到本地知识库中了,现在如何在本地知识库从找到,并且对此删除。", "id": 193}
{"title": "[BUG] 简洁阐述问题 / UnboundLocalError: local variable 'resp' referenced before assignment", "file": "2023-05-24.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/450", "detail": "在最新一版的代码中, 运行api.py 出现了以上错误UnboundLocalError: local variable 'resp' referenced before assignment 通过debug的方式观察到local_doc_qa.llm.generatorAnswer(prompt=question, history=history,streaming=True)可能不返回任何值。", "id": 194}
{"title": "请问有没有 PROMPT_TEMPLATE 能让模型不回答敏感问题", "file": "2023-05-24.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/452", "detail": "## PROMPT_TEMPLATE问题", "id": 195}
{"title": "[BUG] 测试环境 Python 版本有误", "file": "2023-05-24.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/456", "detail": "**问题描述 / Problem Description**", "id": 196}
{"title": "[BUG] webui 部署后样式不正确", "file": "2023-05-24.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/458", "detail": "**问题描述 / Problem Description**", "id": 197}
{"title": "配置默认LLM模型的问题", "file": "2023-05-24.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/459", "detail": "**问题描述 / Problem Description**", "id": 198}
{"title": "[FEATURE]是时候更新一下autoDL的镜像了", "file": "2023-05-24.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/460", "detail": "如题跑了下autoDL的镜像发现是4.27号的git pull新版本的代码功能+老的依赖环境,各种奇奇怪怪的问题。", "id": 199}
{"title": "[BUG] tag:0.1.13 以cpu模式下想使用本地模型无法跑起来各种路径参数问题", "file": "2023-05-24.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/462", "detail": "-------------------------------------------------------------------------------", "id": 200}
{"title": "[BUG] 有没有同学遇到过这个错加载本地txt文件出现这个killed错误TXT文件有100M左右大小。", "file": "2023-05-25.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/463", "detail": "运行cli_demo.py。是本地的txt文件太大了吗100M左右。", "id": 201}
{"title": "API版本能否提供WEBSOCKET的流式接口", "file": "2023-05-25.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/464", "detail": "webui 版本中采用了WS的流式输出整体感知反应很快", "id": 202}
{"title": "[BUG] 安装bug记录", "file": "2023-05-25.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/465", "detail": "按照[install文档](https://github.com/imClumsyPanda/langchain-ChatGLM/blob/master/docs/INSTALL.md)安装的,", "id": 203}
{"title": "VUE的pnmp i执行失败的修复-用npm i命令即可", "file": "2023-05-25.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/466", "detail": "感谢作者!非常棒的应用,用的很开心。", "id": 204}
{"title": "请教个问题有没有人知道cuda11.4是否支持???", "file": "2023-05-25.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/467", "detail": "请教个问题有没有人知道cuda11.4是否支持???", "id": 205}
{"title": "请问有实现多轮问答中基于问题的搜索上下文关联么", "file": "2023-05-25.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/468", "detail": "在基于知识库的多轮问答中,第一个问题讲述了一个主题,后续的问题描述没有包含这个主题的关键词,但又存在上下文的关联。如果用后续问题去搜索知识库有可能会搜索出无关的信息,从而导致大模型无法正确回答问题。请问这个项目要考虑这种情况吗?", "id": 206}
{"title": "[BUG] 内存不足的问题", "file": "2023-05-26.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/470", "detail": "我用了本地的chatglm-6b-int4模型然后显示了内存不足win10+32G内存+1080ti11G一般需要多少内存才足够这个bug应该如何解决", "id": 207}
{"title": "[BUG] 纯内网环境安装pycocotools失败", "file": "2023-05-26.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/472", "detail": "**问题描述 / Problem Description**", "id": 208}
{"title": "[BUG] webui.py 重新加载模型会导致 KeyError", "file": "2023-05-26.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/473", "detail": "**问题描述 / Problem Description**", "id": 209}
{"title": "chatyuan无法使用", "file": "2023-05-26.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/475", "detail": "**问题描述 / Problem Description**", "id": 210}
{"title": "[BUG] 文本分割模型AliTextSplitter存在bug会把“.”作为分割符", "file": "2023-05-26.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/476", "detail": "阿里达摩院的语义分割模型存在bug默认会把\".”作为分割符进行分割而不管上下文语义。是否还有其他分割符则未知。建议的修改方案:把“.”统一替换为其他字符,分割后再替换回来。或者添加其他分割模型。", "id": 211}
{"title": "[BUG] RuntimeError: Error in faiss::FileIOReader::FileIOReader(const char*) a", "file": "2023-05-27.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/479", "detail": "**问题描述 / Problem Description**", "id": 212}
{"title": "[FEATURE] 安装为什么conda create要额外指定路径 用-p ,而不是默认的/envs下面", "file": "2023-05-28.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/481", "detail": "##**功能描述 / Feature Description**", "id": 213}
{"title": "[小白求助] 通过Anaconda执行webui.py后无法打开web链接", "file": "2023-05-28.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/485", "detail": "在执行webui.py命令后http://0.0.0.0:7860复制到浏览器后无法打开显示“无法访问此网站”。", "id": 214}
{"title": "[BUG] 使用 p-tuningv2后的模型重新加载报错", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/486", "detail": "把p-tunningv2训练完后的相关文件放到了p-tunningv2文件夹下勾选使用p-tuningv2点重新加载模型控制台输错错误信息", "id": 215}
{"title": "[小白求助] 服务器上执行webui.py后在本地无法打开web链接", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/487", "detail": "此项目执行在xxx.xx.xxx.xxx服务器上我在webui.py上的代码为 (demo", "id": 216}
{"title": "[FEATURE] 能不能支持VisualGLM-6B", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/488", "detail": "**功能描述 / Feature Description**", "id": 217}
{"title": "你好问一下各位后端api部署的时候支持多用户同时问答吗", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/489", "detail": "支持多用户的话,最多支持多少用户问答?根据硬件而定吧?", "id": 218}
{"title": "V100GPU显存占满而利用率却为0这是为什么", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/491", "detail": "<img width=\"731\" alt=\"de45fe2b6cb76fa091b6e8f76a3de60\" src=\"https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/c32efd52-7dbf-4e9b-bd4d-0944d73d0b8b\">", "id": 219}
{"title": "[求助] 如果在公司内部搭建产品知识库使用INT-4模型200人规模需要配置多少显存的服务器", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/492", "detail": "如题,计划给公司搭一个在线知识库。", "id": 220}
{"title": "你好请教个问题目前问答回复需要20秒左右如何提高速度V10032G服务器。", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/493", "detail": "**问题描述 / Problem Description**", "id": 221}
{"title": "[FEATURE] 如何实现只匹配下文,而不要上文的结果", "file": "2023-05-29.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/494", "detail": "在构建自己的知识库时主要采用问答对的形式那么也就是我需要的回答是在我的问题下面的内容但是目前设置了chunk_size的值以后匹配的是上下文的内容但我实际并不需要上文的。为了实现更完整的展示下面的答案我只能调大chunk_size的值但实际上上文的一半内容都是我不需要的。也就是扔了一半没用的东西给prompt在faiss.py中我也没找到这块的一些描述请问该如何进行修改呢", "id": 222}
{"title": "你好问一下我调用api.py部署为什么用ip加端口可以使用postman调用而改为域名使用postman无法调用", "file": "2023-05-30.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/497", "detail": "![5ufBSWxLyF](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/70e2fbac-5699-48d0-b0d1-3dc84fd042c2)", "id": 223}
{"title": "调用api.py中的stream_chat返回source_documents中出现中文乱码。", "file": "2023-05-30.04", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/498", "detail": "-------------------------------------------------------------------------------", "id": 224}
{"title": "[BUG] 捉个虫api.py中的stream_chat解析json问题", "file": "2023-05-30.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/501", "detail": "**问题描述 / Problem Description**", "id": 225}
{"title": "windows本地部署遇到了omp错误", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/502", "detail": "**问题描述 / Problem Description**", "id": 226}
{"title": "[BUG] bug14 ,\"POST /local_doc_qa/upload_file HTTP/1.1\" 422 Unprocessable Entity", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/503", "detail": "上传的文件报错返回错误api.py", "id": 227}
{"title": "你好请教个问题api.py部署的时候如何改为多线程调用谢谢", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/505", "detail": "目前的api.py脚本不支持多线程", "id": 228}
{"title": "你好请教一下。api.py部署的时候能不能提供给后端流失返回结果。", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/507", "detail": "curl -X 'POST' \\", "id": 229}
{"title": "流式输出流式接口使用server-sent events技术。", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/508", "detail": "想这样一样https://blog.csdn.net/weixin_43228814/article/details/130063010", "id": 230}
{"title": "计划增加流式输出功能吗ChatGLM模型通过api方式调用响应时间慢怎么破Fastapi流式接口来解惑能快速提升响应速度", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/509", "detail": "**问题描述 / Problem Description**", "id": 231}
{"title": "[BUG] 知识库上传时发生ERROR (could not open xxx for reading: No such file or directory)", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/510", "detail": "**问题描述 / Problem Description**", "id": 232}
{"title": "api.py脚本打算增加SSE流式输出吗", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/511", "detail": "curl调用的时候可以检测第一个字从而提升回复的体验", "id": 233}
{"title": "[BUG] 使用tornado实现webSocket可以多个客户端同时连接并且实现流式回复但是多个客户端同时使用答案就很乱是模型不支持多线程吗", "file": "2023-05-31.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/512", "detail": "import asyncio", "id": 234}
{"title": "支持 chinese_alpaca_plus_lora 吗 基于llama的", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/514", "detail": "支持 chinese_alpaca_plus_lora 吗 基于llama的https://github.com/ymcui/Chinese-LLaMA-Alpaca这个项目的", "id": 235}
{"title": "[BUG] 现在能读图片的pdf了但是文字的pdf反而读不了了什么情况", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/515", "detail": "**问题描述 / Problem Description**", "id": 236}
{"title": "在推理的过程中卡住不动,进程无法正常结束", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/516", "detail": "**问题描述 / Problem Description**", "id": 237}
{"title": "curl调用的时候从第二轮开始curl如何传参可以实现多轮对话", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/517", "detail": "第一轮调用:", "id": 238}
{"title": "建议添加api.py部署后的日志管理功能", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/518", "detail": "-------------------------------------------------------------------------------", "id": 239}
{"title": "有大佬知道怎么多线程部署api.py脚本吗", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/519", "detail": "api.py部署后使用下面的请求时间较慢好像是单线程如何改为多线程部署api.py", "id": 240}
{"title": "[BUG] 上传文件到知识库 任何格式与内容都永远失败", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/520", "detail": "上传知识库的时候传txt无法解析就算是穿content/sample里的样例txt也无法解析上传md、pdf等都无法加载会持续性等待等到了超过30分钟也不行。", "id": 241}
{"title": "关于prompt_template的问题", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/521", "detail": "请问这段prompt_template是什么意思要怎么使用可以给一个具体模板参考下吗", "id": 242}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-06-01.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/522", "detail": "**问题描述 / Problem Description**", "id": 243}
{"title": "中文分词句号处理(关于表达金额之间的\".\"", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/523", "detail": "建议处理12.6亿元的这样的分词最好别分成12 和6亿这样的需要放到一起", "id": 244}
{"title": "ImportError: cannot import name 'inference' from 'paddle'", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/526", "detail": "在网上找了一圈有说升级paddle的我做了还是没有用有说安装paddlepaddle的我找了豆瓣的镜像源但安装报错cannot detect archive format", "id": 245}
{"title": "[BUG] webscoket 接口串行问题(/local_doc_qa/stream-chat/{knowledge_base_id}", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/527", "detail": "**问题描述 / Problem Description**", "id": 246}
{"title": "[FEATURE] 刷新页面更新知识库列表", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/528", "detail": "**功能描述以及改进方案**", "id": 247}
{"title": "[BUG] 使用ptuning微调模型后问答效果并不好", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/530", "detail": "### 未调用ptuning", "id": 248}
{"title": "[BUG] 多轮对话效果不佳", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/532", "detail": "在进行多轮对话的时候无论设置的history_len是多少效果都不好。事实上我将其设置成了最大值10但在对话中仍然无法实现多轮对话", "id": 249}
{"title": "RuntimeError: MPS backend out of memory (MPS allocated: 18.00 GB, other allocations: 4.87 MB, max allowed: 18.13 GB)", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/533", "detail": "**问题描述**", "id": 250}
{"title": " 请大家重视这个issue真正使用肯定是多用户并发问答希望增加此功能", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/534", "detail": "这得看你有多少显卡", "id": 251}
{"title": "在启动项目的时候如何使用到多张gpu啊", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/535", "detail": "**在启动项目的时候如何使用到多张gpu啊**", "id": 252}
{"title": " 使用流式输出的时候curl调用的格式是什么", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/536", "detail": "app.websocket(\"/local_doc_qa/stream-chat/{knowledge_base_id}\")(stream_chat)中的knowledge_base_id应该填什么", "id": 253}
{"title": "使用本地 vicuna-7b模型启动错误", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/538", "detail": "环境: ubuntu 22.04 cuda 12.1 没有安装nccl使用rtx2080与m60显卡并行计算", "id": 254}
{"title": "为什么会不调用GPU直接调用CPU呢", "file": "2023-06-02.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/539", "detail": "我的阿里云配置是16G显存用默认代码跑webui.py时提示", "id": 255}
{"title": "上传多个文件时会互相覆盖", "file": "2023-06-03.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/541", "detail": "1、在同一个知识库中上传多个文件时会互相覆盖无法结合多个文档的知识有大佬知道怎么解决吗", "id": 256}
{"title": "[BUG] gcc不是内部或外部命令/LLM对话只能持续一轮", "file": "2023-06-03.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/542", "detail": "No compiled kernel found.", "id": 257}
{"title": "以API模式启动项目却没有知识库的接口列表", "file": "2023-06-04.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/544", "detail": "请问如何获取知识库的接口列表?如果没有需要自行编写的话,可不可以提供相关的获取方式,感谢", "id": 258}
{"title": "程序以API模式启动的时候如何才能让接口以stream模式被调用呢", "file": "2023-06-05.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/546", "detail": "作者您好我在以API模式进行程序启动后我发现接口响应时间很长怎么样才能让接口以stream模式被调用呢我想实现像webui模式的回答那样", "id": 259}
{"title": "关于原文中表格转为文本后数据相关度问题。", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/547", "detail": "原文中表格数据转换为文本,以 X-Y... 的格式每一行组织成一句话,但这样做后发现相关度较低,效果很差,有何好的方案吗?", "id": 260}
{"title": "启动后LLM和知识库问答模式均只有最后一轮记录", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/548", "detail": "拉取最新代码,问答时,每次页面只显示最后一次问答记录,需要修改什么参数才可以保留历史记录?", "id": 261}
{"title": "提供system message配置以便于让回答不要超出知识库范围", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/549", "detail": "**功能描述 / Feature Description**", "id": 262}
{"title": "[BUG] 使用p-tunningv2报错", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/551", "detail": "按照readme的指示把p-tunningv2训练完后的文件放到了p-tunningv2文件夹下勾选使用p-tuningv2点重新加载模型控制台提示错误信息", "id": 263}
{"title": "[BUG] 智障,这么多问题,也好意思放出来,浪费时间", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/553", "detail": "。。。", "id": 264}
{"title": "[FEATURE] 我看代码文件中有一个ali_text_splitter.py为什么不用他这个文本分割器了", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/554", "detail": "我看代码文件中有一个ali_text_splitter.py为什么不用他这个文本分割器了", "id": 265}
{"title": "加载文档函数报错", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/557", "detail": "def load_file(filepath, sentence_size=SENTENCE_SIZE):", "id": 266}
{"title": "参考指引安装docker后运行cli_demo.py提示killed", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/558", "detail": "root@b3d1bd08095c:/chatGLM# python3 cli_demo.py", "id": 267}
{"title": "注意:如果安装错误,注意这两个包的版本 wandb==0.11.0 protobuf==3.18.3", "file": "2023-06-06.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/559", "detail": "Error1: 如果启动异常报错 `protobuf` 需要更新到 `protobuf==3.18.3 `", "id": 268}
{"title": "知识库对长文的知识相关度匹配不太理想有何优化方向", "file": "2023-06-07.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/563", "detail": "我们可能录入一个文章有 1W 字,里面涉及这个文章主题的很多角度问题,我们针对他提问,他相关度匹配的内容和实际我们需要的答案相差很大怎么办。", "id": 269}
{"title": "使用stream-chat函数进行流式输出的时候能使用curl调用吗", "file": "2023-06-07.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/565", "detail": "为什么下面这样调用会报错???", "id": 270}
{"title": "有大佬实践过 并行 或者 多线程 的部署方案吗?", "file": "2023-06-07.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/566", "detail": "+1", "id": 271}
{"title": "多线程部署遇到问题?", "file": "2023-06-07.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/567", "detail": "<img width=\"615\" alt=\"3d87bf74f0cf1a4820cc9e46b245859\" src=\"https://github.com/imClumsyPanda/langchain-ChatGLM/assets/109277248/8787570d-88bd-434e-aaa4-cb9276d1aa50\">", "id": 272}
{"title": "[BUG] 用fastchat加载vicuna-13b模型进行知识库的问答有token的限制错误", "file": "2023-06-07.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/569", "detail": "当我开启fastchat的vicuna-13b的api服务然后config那里配置好(api本地测试过可以返回结果),然后知识库加载好之后(知识库大概有1000多个文档用chatGLM可以正常推理)进行问答时出现token超过限制就问了一句hello", "id": 273}
{"title": "现在的添加知识库,文件多了总是报错,也不知道自己加载了哪些文件,报错后也不知道是全部失败还是一部分成功;希望能有个加载指定文件夹作为知识库的功能", "file": "2023-06-07.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/574", "detail": "**功能描述 / Feature Description**", "id": 274}
{"title": "[BUG] moss模型本地加载报错", "file": "2023-06-08.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/577", "detail": "moss模型本地加载报错", "id": 275}
{"title": "加载本地moss模型报错Can't instantiate abstract class MOSSLLM with abstract methods _history_len", "file": "2023-06-08.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/578", "detail": "(vicuna) ps@ps[13:56:20]:/data/chat/langchain-ChatGLM2/langchain-ChatGLM-0.1.13$ python webui.py --model-dir local_models --model moss --no-remote-model", "id": 276}
{"title": "[FEATURE] 能增加在前端页面控制prompt_template吗或是能支持前端页面选择使用哪个prompt", "file": "2023-06-08.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/579", "detail": "目前只能在config里修改一个prompt想在多个不同场景切换比较麻烦", "id": 277}
{"title": "[BUG] streamlit ui的bug在增加知识库时会报错", "file": "2023-06-08.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/580", "detail": "**问题描述 / Problem Description**", "id": 278}
{"title": "[FEATURE] webui/webui_st可以支持history吗目前仅能一次对话", "file": "2023-06-08.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/581", "detail": "试了下webui和webui_st都不支持历史对话啊只能对话一次不能默认开启所有history吗", "id": 279}
{"title": "启动python cli_demo.py --model chatglm-6b-int4-qe报错", "file": "2023-06-09.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/585", "detail": "下载好模型,和相关依赖环境,之间运行`python cli_demo.py --model chatglm-6b-int4-qe`报错了:", "id": 280}
{"title": "重新构建知识库报错", "file": "2023-06-09.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/586", "detail": "**问题描述 / Problem Description**", "id": 281}
{"title": "[FEATURE] 能否屏蔽paddle我不需要OCR效果差依赖环境还很复杂", "file": "2023-06-09.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/587", "detail": "希望能不依赖paddle", "id": 282}
{"title": "question :文档向量化这个可以自己手动实现么?", "file": "2023-06-09.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/589", "detail": "现有公司级数据500G+,需要使用这个功能,请问如何手动实现这个向量化,然后并加载", "id": 283}
{"title": "view前端能进行流式的返回吗", "file": "2023-06-09.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/590", "detail": "view前端能进行流式的返回吗", "id": 284}
{"title": "[BUG] Load parallel cpu kernel failed, using default cpu kernel code", "file": "2023-06-11.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/594", "detail": "**问题描述 / Problem Description**", "id": 285}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-06-11.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/595", "detail": "**问题描述 / Problem Description**", "id": 286}
{"title": "我在上传本地知识库时提示KeyError: 'name'错误,本地知识库都是.txt文件文件数量大约是2000+。", "file": "2023-06-12.05", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/597", "detail": "<img width=\"649\" alt=\"KError\" src=\"https://github.com/imClumsyPanda/langchain-ChatGLM/assets/59411575/1ecc8182-aeee-4a0a-bbc3-74c2f1373f2d\">", "id": 287}
{"title": "model_config.py中有vicuna-13b-hf模型的配置信息但是好像还是不可用", "file": "2023-06-12.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/600", "detail": "@dongyihua543", "id": 288}
{"title": "ImportError: Using SOCKS proxy, but the 'socksio' package is not installed. Make sure to install httpx using `pip install httpx[socks]`.", "file": "2023-06-12.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/605", "detail": "应该代理问题,但是尝试了好多方法都解决不了,", "id": 289}
{"title": "[BUG] similarity_search_with_score_by_vector在找不到匹配的情况下出错", "file": "2023-06-12.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/607", "detail": "在设置匹配阈值 VECTOR_SEARCH_SCORE_THRESHOLD 的情况下vectorstore会返回空此时上述处理函数会出错", "id": 290}
{"title": "[FEATURE] 请问如何搭建英文知识库呢", "file": "2023-06-12.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/609", "detail": "**功能描述 / Feature Description**", "id": 291}
{"title": "谁有vicuna权重llama转换之后的", "file": "2023-06-13.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/611", "detail": "**问题描述 / Problem Description**", "id": 292}
{"title": "[FEATURE] API能实现上传文件夹的功能么", "file": "2023-06-13.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/612", "detail": "用户懒得全选所有的文件就想上传个文件夹请问下API能实现这个功能么", "id": 293}
{"title": "请问在多卡部署后,上传单个文件作为知识库,用的是单卡在生成向量还是多卡?", "file": "2023-06-13.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/614", "detail": "目前我检测我本地多卡部署的,好像生成知识库向量的时候用的还是单卡", "id": 294}
{"title": "[BUG] python webui.py提示非法指令", "file": "2023-06-13.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/615", "detail": "(/data/conda-langchain [root@chatglm langchain-ChatGLM]# python webui.py", "id": 295}
{"title": "知识库文件跨行切分问题", "file": "2023-06-13.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/616", "detail": "我的知识库文件txt文件是一行一条知识用\\n分行。", "id": 296}
{"title": "[FEATURE] bing搜索问答有流式的API么", "file": "2023-06-13.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/617", "detail": "web端是有这个bing搜索回答但api接口没有发现大佬能给个提示么", "id": 297}
{"title": "希望出一个macos m2的安装教程", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/620", "detail": "mac m2安装模型加载成功了知识库文件也上传成功了但是一问答就会报错报错内容如下", "id": 298}
{"title": "为【出处】提供高亮显示", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/621", "detail": "具体出处里面,对相关的内容高亮显示,不包含前后文。", "id": 299}
{"title": "[BUG] CPU运行cli_demo.py不回答hang住", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/622", "detail": "没有GPU32G内存的ubuntu机器。", "id": 300}
{"title": "关于删除知识库里面的文档后LLM知识库对话的时候还是会返回该被删除文档的内容", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/623", "detail": "如题在vue前端成功执行删除知识库里面文档A.txt后未能也在faiss索引中也删除该文档LLM还是会返回这个A.txt的内容并且以A.txt为出处未能达到删除的效果", "id": 301}
{"title": "[BUG] 调用知识库进行问答,显存会一直叠加", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/625", "detail": "14G的显存,调用的chatglm-6b-int8模型,进行知识库问答时,最多问答四次就会爆显存了,观察了一下显存使用情况,每一次使用就会增加一次显存,请问这样是正常的吗?是否有什么配置需要开启可以解决这个问题?例如进行一次知识库问答清空上次问题的显存?", "id": 302}
{"title": "[BUG] web页面 重新构建数据库 失败,导致 原来的上传的数据库都没了", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/626", "detail": "web页面 重新构建数据库 失败,导致 原来的上传的数据库都没了", "id": 303}
{"title": "在CPU上运行webui.py报错Tensor on device cpu is not on the expected device meta!", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/627", "detail": "在CPU上运行python webui.py能启动但最后有RuntimeError: Tensor on device cpu is not on the expected device meta!", "id": 304}
{"title": "OSError: [WinError 1114] 动态链接库(DLL)初始化例程失败。 Error loading \"E:\\xxx\\envs\\langchain\\lib\\site-packages\\torch\\lib\\caffe2_nvrtc.dll\" or one of its dependencies.哪位大佬知道如何解决吗?", "file": "2023-06-14.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/629", "detail": "**问题描述 / Problem Description**", "id": 305}
{"title": "[BUG] WEBUI删除知识库文档会导致知识库问答失败", "file": "2023-06-15.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/632", "detail": "如题,从知识库已有文件中选择要删除的文件,点击删除后,在问答框输入内容回车报错", "id": 306}
{"title": "更新后的版本中删除知识库中的文件再提问出现error错误", "file": "2023-06-15.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/634", "detail": "针对更新版本,识别到一个问题,过程如下:", "id": 307}
{"title": "我配置好了环境,想要实现本地知识库的问答?可是它返回给我的", "file": "2023-06-15.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/637", "detail": "没有总结,只有相关度的回复,但是我看演示里面表现的,回复是可以实现总结的,我去查询代码", "id": 308}
{"title": "[BUG] NPM run dev can not successfully start the VUE frontend", "file": "2023-06-15.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/638", "detail": "**问题描述 / Problem Description**", "id": 309}
{"title": "[BUG] 简洁阐述问题 / Concise description of the issue", "file": "2023-06-15.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/639", "detail": "**问题描述 / Problem Description**", "id": 310}
{"title": "提一个模型加载的bug我在截图中修复了你们有空可以看一下。", "file": "2023-06-15.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/642", "detail": "![model_load_bug](https://github.com/imClumsyPanda/langchain-ChatGLM/assets/59411575/4432adc4-ccdd-45d9-aafc-5f2d1963403b)", "id": 311}
{"title": "[求助]关于设置embedding model路径的问题", "file": "2023-06-16.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/643", "detail": "如题,我之前成功跑起来过一次,但因环境丢失重新配置 再运行webui就总是报错", "id": 312}
{"title": "Lora微调后的模型可以直接使用吗", "file": "2023-06-16.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/646", "detail": "看model_config.py里是有USE_LORA这个参数的但是在cli_demo.py和webui.py这两个里面都没有用到实际测试下来模型没有微调的效果想问问现在这个功能实现了吗", "id": 313}
{"title": "write_check_file在tmp_files目录下生成的load_file.txt是否需要一直保留占用空间很大在建完索引后能否删除", "file": "2023-06-16.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/647", "detail": "**功能描述 / Feature Description**", "id": 314}
{"title": "[BUG] /local_doc_qa/list_files?knowledge_base_id=test删除知识库bug", "file": "2023-06-16.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/649", "detail": "1.新建test知识库并上传文件在vue前端完成并检查后端发现确实生成了test文件夹以及下面的content和vec_store", "id": 315}
{"title": "[BUG] vue webui无法加载知识库", "file": "2023-06-16.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/650", "detail": "拉取了最新的代码分别运行了后端api和前端web点击知识库始终只能显示simple无法加载知识库", "id": 316}
{"title": "不能本地加载moss模型吗", "file": "2023-06-16.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/652", "detail": "手动下载模型设置local_model_path路径依旧提示缺少文件该如何正确配置", "id": 317}
{"title": "macos m2 pro docker 安装失败", "file": "2023-06-17.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/654", "detail": "macos m2 pro docker 安装失败", "id": 318}
{"title": " [BUG] mac m1 pro 运行提示 zsh: segmentation fault", "file": "2023-06-17.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/655", "detail": "运行: python webui.py", "id": 319}
{"title": "安装 requirements 报错", "file": "2023-06-17.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/656", "detail": "(langchainchatglm) D:\\github\\langchain-ChatGLM>pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple/", "id": 320}
{"title": "[BUG] AssertionError", "file": "2023-06-17.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/658", "detail": "**问题描述 / Problem Description**", "id": 321}
{"title": "[FEATURE] 支持AMD win10 本地部署吗?", "file": "2023-06-18.06", "url": "https://github.com/imClumsyPanda/langchain-ChatGLM/issues/660", "detail": "**功能描述 / Feature Description**", "id": 322}