Mirrored from
https://github.com/binary-husky/gpt_academic.git
Synced 2025-12-08 23:46:48 +00:00
re-format code to with pre-commit
This commit is contained in:
@@ -32,4 +32,4 @@ P.S. If you successfully integrate a new large model following the steps below, feel free to submit a Pull R
 5. Once the tests pass, make the final changes in `request_llms/bridge_all.py` to fully wire your model into the framework (clever as you are, one glance at that file will tell you what to change)

-6. Change the `LLM_MODEL` setting, then run `python main.py` to test the final result
+6. Change the `LLM_MODEL` setting, then run `python main.py` to test the final result
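Step 5 above points at `request_llms/bridge_all.py`. As a rough, standalone sketch of what registering a new model there can look like: the dictionary keys, function signatures, and the name `my-new-model` are assumptions for illustration, not the file's exact schema, so check them against the real file.

```python
# Standalone sketch only -- not the real bridge_all.py. It illustrates the idea of
# registering a new model entry in a model_info-style registry; key names and
# signatures below are assumptions and should be checked against the actual file.

def my_model_without_ui(inputs, llm_kwargs, history, sys_prompt):
    """Hypothetical non-streaming entry point: takes a prompt, returns a string."""
    return f"[my-new-model] echo: {inputs}"

def my_model_with_ui(inputs, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt):
    """Hypothetical streaming/chatbot entry point (a generator in the real framework)."""
    chatbot.append((inputs, my_model_without_ui(inputs, llm_kwargs, history, system_prompt)))
    yield chatbot, history

# The framework keeps one registry mapping model names to their entry points.
model_info = {}  # in the real file this dict already contains the built-in models
model_info.update({
    "my-new-model": {
        "fn_with_ui": my_model_with_ui,        # used by the web UI
        "fn_without_ui": my_model_without_ui,  # used by plugins / batch calls
        "endpoint": None,                      # local model: no remote endpoint
        "max_token": 4096,                     # assumed context window
    },
})

if __name__ == "__main__":
    print(model_info["my-new-model"]["fn_without_ui"]("hello", {}, [], ""))
```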
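For step 6, the name registered above has to match the model selected in the configuration. A minimal sketch, assuming the `LLM_MODEL` and `AVAIL_LLM_MODELS` options live in `config.py` (or your `config_private.py` override):

```python
# config.py (sketch) -- "my-new-model" is the hypothetical name from the example above.
LLM_MODEL = "my-new-model"                            # model used by default
AVAIL_LLM_MODELS = ["gpt-3.5-turbo", "my-new-model"]  # models selectable in the UI
# then run: python main.py
```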
@@ -2,4 +2,4 @@ protobuf
 cpm_kernels
 torch>=1.10
 mdtex2html
-sentencepiece
+sentencepiece
@@ -3,4 +3,4 @@ jtorch >= 0.1.3
 torch
 torchvision
 pandas
-jieba
+jieba
@@ -5,4 +5,3 @@ accelerate
 matplotlib
 huggingface_hub
 triton
-
@@ -1 +1 @@
-dashscope
+dashscope
@@ -2,4 +2,4 @@ modelscope
 transformers_stream_generator
 auto-gptq
 optimum
-urllib3<2
+urllib3<2
@@ -1 +1 @@
-slack-sdk==3.21.3
+slack-sdk==3.21.3