Mirrored from https://github.com/binary-husky/gpt_academic.git (synced 2025-12-06 22:46:48 +00:00)
Compare commits: 16 commits (version3.3 ... version3.3)
Commit SHA1s:
6538c58b8e, e35eb9048e, a0fa64de47, e04946c816, 231c9c2e57, 48555f570c, 7c9195ddd2, 5500fbe682, 5a83b3b096, 4783fd6f37, 9a4b56277c, 5eea959103, 856df8fb62, 8e59412c47, 8f571ff68f, e4c4b28ddf
File: README.md (102)
@@ -7,7 +7,7 @@
 # <img src="docs/logo.png" width="40" > ChatGPT 学术优化
-**如果喜欢这个项目,请给它一个Star;如果你发明了更好用的快捷键或函数插件,欢迎发issue或者pull requests**
+**如果喜欢这个项目,请给它一个Star;如果你发明了更好用的快捷键或函数插件,欢迎发pull requests**
 If you like this project, please give it a Star. If you've come up with more useful academic shortcuts or functional plugins, feel free to open an issue or pull request. We also have a README in [English|](docs/README_EN.md)[日本語|](docs/README_JP.md)[Русский|](docs/README_RS.md)[Français](docs/README_FR.md) translated by this project itself.

@@ -27,7 +27,6 @@ If you like this project, please give it a Star. If you've come up with more use
 一键中英互译 | 一键中英互译
 一键代码解释 | 显示代码、解释代码、生成代码、给代码加注释
 [自定义快捷键](https://www.bilibili.com/video/BV14s4y1E7jN) | 支持自定义快捷键
-[配置代理服务器](https://www.bilibili.com/video/BV1rc411W7Dr) | 支持代理连接OpenAI/Google等,秒解锁ChatGPT互联网[实时信息聚合](https://www.bilibili.com/video/BV1om4y127ck/)能力
 模块化设计 | 支持自定义强大的[函数插件](https://github.com/binary-husky/chatgpt_academic/tree/master/crazy_functions),插件支持[热更新](https://github.com/binary-husky/chatgpt_academic/wiki/%E5%87%BD%E6%95%B0%E6%8F%92%E4%BB%B6%E6%8C%87%E5%8D%97)
 [自我程序剖析](https://www.bilibili.com/video/BV1cj411A7VW) | [函数插件] [一键读懂](https://github.com/binary-husky/chatgpt_academic/wiki/chatgpt-academic%E9%A1%B9%E7%9B%AE%E8%87%AA%E8%AF%91%E8%A7%A3%E6%8A%A5%E5%91%8A)本项目的源代码
 [程序剖析](https://www.bilibili.com/video/BV1cj411A7VW) | [函数插件] 一键可以剖析其他Python/C/C++/Java/Lua/...项目树

@@ -45,7 +44,6 @@ chat分析报告生成 | [函数插件] 运行后自动生成总结汇报
 启动暗色gradio[主题](https://github.com/binary-husky/chatgpt_academic/issues/173) | 在浏览器url后面添加```/?__dark-theme=true```可以切换dark主题
 [多LLM模型](https://www.bilibili.com/video/BV1wT411p7yf)支持,[API2D](https://api2d.com/)接口支持 | 同时被GPT3.5、GPT4和[清华ChatGLM](https://github.com/THUDM/ChatGLM-6B)伺候的感觉一定会很不错吧?
 更多LLM模型接入 | 新加入Newbing测试接口(新必应AI)
-huggingface免科学上网[在线体验](https://huggingface.co/spaces/qingxu98/gpt-academic) | 登陆huggingface后复制[此空间](https://huggingface.co/spaces/qingxu98/gpt-academic)
 …… | ……
 
 </div>

@@ -82,9 +80,6 @@ huggingface免科学上网[在线体验](https://huggingface.co/spaces/qingxu98/
 <img src="https://user-images.githubusercontent.com/96192199/232537274-deca0563-7aa6-4b5d-94a2-b7c453c47794.png" width="700" >
 </div>
 
-多种大语言模型混合调用[huggingface测试版](https://huggingface.co/spaces/qingxu98/academic-chatgpt-beta)(huggingface版不支持chatglm)
-
-
 ---
 
 ## 安装-方法1:直接运行 (Windows, Linux or MacOS)

@@ -95,14 +90,10 @@ git clone https://github.com/binary-husky/chatgpt_academic.git
 cd chatgpt_academic
 ```
 
-2. 配置API_KEY和代理设置
+2. 配置API_KEY
 
+在`config.py`中,配置API KEY等[设置](https://github.com/binary-husky/gpt_academic/issues/1) 。
 
-在`config.py`中,配置 海外Proxy 和 OpenAI API KEY,说明如下
-```
-1. 如果你在国内,需要设置海外代理才能够顺利使用OpenAI API,设置方法请仔细阅读config.py(1.修改其中的USE_PROXY为True; 2.按照说明修改其中的proxies)。
-2. 配置 OpenAI API KEY。支持任意数量的OpenAI的密钥和API2D的密钥共存/负载均衡,多个KEY用英文逗号分隔即可,例如输入 API_KEY="OpenAI密钥1,API2D密钥2,OpenAI密钥3,OpenAI密钥4"
-3. 与代理网络有关的issue(网络超时、代理不起作用)汇总到 https://github.com/binary-husky/chatgpt_academic/issues/1
-```
 (P.S. 程序运行时会优先检查是否存在名为`config_private.py`的私密配置文件,并用其中的配置覆盖`config.py`的同名配置。因此,如果您能理解我们的配置读取逻辑,我们强烈建议您在`config.py`旁边创建一个名为`config_private.py`的新配置文件,并把`config.py`中的配置转移(复制)到`config_private.py`中。`config_private.py`不受git管控,可以让您的隐私信息更加安全。)
 
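To make the simplified API_KEY step above concrete, here is a minimal sketch of a `config_private.py` placed next to `config.py`. It only uses options named in this diff (comma-separated OpenAI/API2D keys for load balancing, plus the USE_PROXY/proxies switches from the removed instructions); the key values and the proxy address are placeholders, not real settings.

```python
# config_private.py — private overrides, not tracked by git.
# At startup the program prefers values defined here over the same-named
# entries in config.py (see the P.S. note in the diff above).

# Any number of OpenAI and API2D keys may coexist for load balancing;
# separate them with English commas. These are placeholder values.
API_KEY = "sk-openai-key-1,fk-api2d-key-2,sk-openai-key-3"

# Only needed if the OpenAI API is unreachable without a proxy.
USE_PROXY = True
proxies = {
    # Assumed format: protocol -> local proxy address; adjust to your own setup.
    "http":  "socks5h://localhost:11284",
    "https": "socks5h://localhost:11284",
}
```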
@@ -130,14 +121,8 @@ python main.py
 
 5. 测试函数插件
 ```
-- 测试Python项目分析
-(选择1)input区域 输入 `./crazy_functions/test_project/python/dqn` , 然后点击 "解析整个Python项目"
-(选择2)展开文件上传区,将python文件/包含python文件的压缩包拖拽进去,在出现反馈提示后, 然后点击 "解析整个Python项目"
-- 测试自我代码解读(本项目自译解)
-点击 "[多线程Demo] 解析此项目本身(源码自译解)"
 - 测试函数插件模板函数(要求gpt回答历史上的今天发生了什么),您可以根据此函数为模板,实现更复杂的功能
 点击 "[函数插件模板Demo] 历史上的今天"
-- 函数插件区下拉菜单中有更多功能可供选择
 ```
 
 ## 安装-方法2:使用Docker

@@ -148,7 +133,7 @@ python main.py
 # 下载项目
 git clone https://github.com/binary-husky/chatgpt_academic.git
 cd chatgpt_academic
-# 配置 “海外Proxy”, “API_KEY” 以及 “WEB_PORT” (例如50923) 等
+# 配置 “Proxy”, “API_KEY” 以及 “WEB_PORT” (例如50923) 等
 用任意文本编辑器编辑 config.py
 # 安装
 docker build -t gpt-academic .

@@ -171,7 +156,6 @@ docker run --rm -it --net=host --gpus=all gpt-academic
 docker run --rm -it --net=host --gpus=all gpt-academic bash
 ```
 
-
 ## 安装-方法3:其他部署方式(需要云服务器知识与经验)
 
 1. 远程云服务器部署

@@ -183,14 +167,6 @@ docker run --rm -it --net=host --gpus=all gpt-academic bash
 3. 如何在二级网址(如`http://localhost/subpath`)下运行
 请访问[FastAPI运行说明](docs/WithFastapi.md)
 
-## 安装-代理配置
-1. 常规方法
-[配置代理](https://github.com/binary-husky/chatgpt_academic/issues/1)
-
-2. 纯新手教程
-[纯新手教程](https://github.com/binary-husky/chatgpt_academic/wiki/%E4%BB%A3%E7%90%86%E8%BD%AF%E4%BB%B6%E9%97%AE%E9%A2%98%E7%9A%84%E6%96%B0%E6%89%8B%E8%A7%A3%E5%86%B3%E6%96%B9%E6%B3%95%EF%BC%88%E6%96%B9%E6%B3%95%E5%8F%AA%E9%80%82%E7%94%A8%E4%BA%8E%E6%96%B0%E6%89%8B%EF%BC%89)
-
-
 ---
 
 ## 自定义新的便捷按钮 / 自定义函数插件

@@ -218,73 +194,8 @@ docker run --rm -it --net=host --gpus=all gpt-academic bash
 详情请参考[函数插件指南](https://github.com/binary-husky/chatgpt_academic/wiki/%E5%87%BD%E6%95%B0%E6%8F%92%E4%BB%B6%E6%8C%87%E5%8D%97)。
 
 
----
-
+## 版本:
-## 部分功能展示
-
-1. 图片显示:
-
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/228737599-bf0a9d9c-1808-4f43-ae15-dfcc7af0f295.png" width="800" >
-</div>
-
-2. 本项目的代码自译解(如果一个程序能够读懂并剖析自己):
-
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/226936850-c77d7183-0749-4c1c-9875-fd4891842d0c.png" width="800" >
-</div>
-
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/226936618-9b487e4b-ab5b-4b6e-84c6-16942102e917.png" width="800" >
-</div>
-
-3. 其他任意Python/Cpp/Java/Go/Rect/...项目剖析:
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/226935232-6b6a73ce-8900-4aee-93f9-733c7e6fef53.png" width="800" >
-</div>
-
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/226969067-968a27c1-1b9c-486b-8b81-ab2de8d3f88a.png" width="800" >
-</div>
-
-4. Latex论文一键阅读理解与摘要生成
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/227504406-86ab97cd-f208-41c3-8e4a-7000e51cf980.png" width="800" >
-</div>
-
-5. 自动报告生成
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/227503770-fe29ce2c-53fd-47b0-b0ff-93805f0c2ff4.png" height="300" >
-<img src="https://user-images.githubusercontent.com/96192199/227504617-7a497bb3-0a2a-4b50-9a8a-95ae60ea7afd.png" height="300" >
-<img src="https://user-images.githubusercontent.com/96192199/227504005-efeaefe0-b687-49d0-bf95-2d7b7e66c348.png" height="300" >
-</div>
-
-6. 模块化功能设计
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/229288270-093643c1-0018-487a-81e6-1d7809b6e90f.png" height="400" >
-<img src="https://user-images.githubusercontent.com/96192199/227504931-19955f78-45cd-4d1c-adac-e71e50957915.png" height="400" >
-</div>
-
-
-7. 源代码转译英文
-
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/229720562-fe6c3508-6142-4635-a83d-21eb3669baee.png" height="400" >
-</div>
-
-8. 互联网在线信息综合
-
-<div align="center">
-<img src="https://user-images.githubusercontent.com/96192199/233575247-fb00819e-6d1b-4bb7-bd54-1d7528f03dd9.png" width="800" >
-<img src="https://user-images.githubusercontent.com/96192199/233779501-5ce826f0-6cca-4d59-9e5f-b4eacb8cc15f.png" width="800" >
-
-</div>
-
-
-
-## Todo 与 版本规划:
-
-- version 3.3+ (todo): NewBing支持
 - version 3.2: 函数插件支持更多参数接口 (保存对话功能, 解读任意语言代码+同时询问任意的LLM组合)
 - version 3.1: 支持同时问询多个gpt模型!支持api2d,支持多个apikey负载均衡
 - version 3.0: 对chatglm和其他小型llm的支持

@@ -297,7 +208,6 @@ docker run --rm -it --net=host --gpus=all gpt-academic bash
 - version 2.0: 引入模块化函数插件
 - version 1.0: 基础功能
 
-chatgpt_academic开发者QQ群:734063350
 
 ## 参考与学习
 

@@ -21,6 +21,7 @@ def get_crazy_functions():
 from crazy_functions.总结word文档 import 总结word文档
 from crazy_functions.解析JupyterNotebook import 解析ipynb文件
 from crazy_functions.对话历史存档 import 对话历史存档
+from crazy_functions.批量Markdown翻译 import Markdown英译中
 function_plugins = {
 
 "解析整个Python项目": {

@@ -81,8 +82,14 @@ def get_crazy_functions():
 "Color": "stop", # 按钮颜色
 "Function": HotReload(读文章写摘要)
 },
+"Markdown/Readme英译中": {
+# HotReload 的意思是热更新,修改函数插件代码后,不需要重启程序,代码直接生效
+"Color": "stop",
+"Function": HotReload(Markdown英译中)
+},
 "批量生成函数注释": {
 "Color": "stop", # 按钮颜色
+"AsButton": False, # 加入下拉菜单中
 "Function": HotReload(批量生成函数注释)
 },
 "[多线程Demo] 解析此项目本身(源码自译解)": {

@@ -110,7 +117,6 @@ def get_crazy_functions():
 from crazy_functions.Latex全文翻译 import Latex中译英
 from crazy_functions.Latex全文翻译 import Latex英译中
 from crazy_functions.批量Markdown翻译 import Markdown中译英
-from crazy_functions.批量Markdown翻译 import Markdown英译中
 
 function_plugins.update({
 "批量翻译PDF文档(多线程)": {

@@ -175,12 +181,7 @@ def get_crazy_functions():
 "AsButton": False, # 加入下拉菜单中
 "Function": HotReload(Markdown中译英)
 },
-"[测试功能] 批量Markdown英译中(输入路径或上传压缩包)": {
-# HotReload 的意思是热更新,修改函数插件代码后,不需要重启程序,代码直接生效
-"Color": "stop",
-"AsButton": False, # 加入下拉菜单中
-"Function": HotReload(Markdown英译中)
-},
 
 })
 
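For readers new to the plugin registry touched above, here is a small illustrative sketch of how an entry such as the new "Markdown/Readme英译中" button is declared. It is condensed from the diff; the import location of HotReload and anything else not visible in the hunks (such as indentation and omitted fields) is an assumption rather than the project's exact code.

```python
# Sketch of a plugin entry as registered inside get_crazy_functions().
from toolbox import HotReload                               # assumed import location
from crazy_functions.批量Markdown翻译 import Markdown英译中

function_plugins = {
    "Markdown/Readme英译中": {
        # HotReload means hot update: edits to the plugin source take effect
        # without restarting the program.
        "Color": "stop",                        # button color
        # "AsButton": False,                    # optional; as used on other entries in this diff, moves the plugin into the dropdown menu
        "Function": HotReload(Markdown英译中),   # callable invoked when the user triggers the plugin
    },
}
```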

@@ -84,7 +84,33 @@ def 多文件翻译(file_manifest, project_folder, llm_kwargs, plugin_kwargs, ch
 yield from update_ui(chatbot=chatbot, history=history) # 刷新界面
 
 
+def get_files_from_everything(txt):
+    import glob, os
+
+    success = True
+    if txt.startswith('http'):
+        # 网络的远程文件
+        txt = txt.replace("https://github.com/", "https://raw.githubusercontent.com/")
+        txt = txt.replace("/blob/", "/")
+        import requests
+        from toolbox import get_conf
+        proxies, = get_conf('proxies')
+        r = requests.get(txt, proxies=proxies)
+        with open('./gpt_log/temp.md', 'wb+') as f: f.write(r.content)
+        project_folder = './gpt_log/'
+        file_manifest = ['./gpt_log/temp.md']
+    elif txt.endswith('.md'):
+        # 直接给定文件
+        file_manifest = [txt]
+        project_folder = os.path.dirname(txt)
+    elif os.path.exists(txt):
+        # 本地路径,递归搜索
+        project_folder = txt
+        file_manifest = [f for f in glob.glob(f'{project_folder}/**/*.md', recursive=True)]
+    else:
+        success = False
+
+    return success, file_manifest, project_folder
+
+
 @CatchException
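The new helper accepts three kinds of input: a GitHub file URL (rewritten to raw.githubusercontent.com and downloaded through the configured proxy), a single local .md file, or a local folder searched recursively for markdown. A small usage sketch follows; the input value is hypothetical, and the callers refactored below (Markdown英译中 / Markdown中译英) apply the same pattern of checking `success` and then the length of `file_manifest`.

```python
# Usage sketch for get_files_from_everything (the input path is hypothetical).
success, file_manifest, project_folder = get_files_from_everything("./docs")
if success and len(file_manifest) > 0:
    print(f"found {len(file_manifest)} markdown file(s) under {project_folder}")
```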
@@ -98,6 +124,7 @@ def Markdown英译中(txt, llm_kwargs, plugin_kwargs, chatbot, history, system_p
 # 尝试导入依赖,如果缺少依赖,则给出安装建议
 try:
 import tiktoken
+import glob, os
 except:
 report_execption(chatbot, history,
 a=f"解析项目: {txt}",

@@ -105,19 +132,21 @@ def Markdown英译中(txt, llm_kwargs, plugin_kwargs, chatbot, history, system_p
 yield from update_ui(chatbot=chatbot, history=history) # 刷新界面
 return
 history = [] # 清空历史,以免输入溢出
-import glob, os
-if os.path.exists(txt):
-project_folder = txt
-else:
+success, file_manifest, project_folder = get_files_from_everything(txt)
+if not success:
+# 什么都没有
 if txt == "": txt = '空空如也的输入栏'
 report_execption(chatbot, history, a = f"解析项目: {txt}", b = f"找不到本地项目或无权访问: {txt}")
 yield from update_ui(chatbot=chatbot, history=history) # 刷新界面
 return
-file_manifest = [f for f in glob.glob(f'{project_folder}/**/*.md', recursive=True)]
 if len(file_manifest) == 0:
 report_execption(chatbot, history, a = f"解析项目: {txt}", b = f"找不到任何.md文件: {txt}")
 yield from update_ui(chatbot=chatbot, history=history) # 刷新界面
 return
 
 yield from 多文件翻译(file_manifest, project_folder, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt, language='en->zh')

@@ -135,6 +164,7 @@ def Markdown中译英(txt, llm_kwargs, plugin_kwargs, chatbot, history, system_p
 # 尝试导入依赖,如果缺少依赖,则给出安装建议
 try:
 import tiktoken
+import glob, os
 except:
 report_execption(chatbot, history,
 a=f"解析项目: {txt}",

@@ -142,18 +172,13 @@ def Markdown中译英(txt, llm_kwargs, plugin_kwargs, chatbot, history, system_p
 yield from update_ui(chatbot=chatbot, history=history) # 刷新界面
 return
 history = [] # 清空历史,以免输入溢出
-import glob, os
-if os.path.exists(txt):
-project_folder = txt
-else:
+success, file_manifest, project_folder = get_files_from_everything(txt)
+if not success:
+# 什么都没有
 if txt == "": txt = '空空如也的输入栏'
 report_execption(chatbot, history, a = f"解析项目: {txt}", b = f"找不到本地项目或无权访问: {txt}")
 yield from update_ui(chatbot=chatbot, history=history) # 刷新界面
 return
-if txt.endswith('.md'):
-file_manifest = [txt]
-else:
-file_manifest = [f for f in glob.glob(f'{project_folder}/**/*.md', recursive=True)]
 if len(file_manifest) == 0:
 report_execption(chatbot, history, a = f"解析项目: {txt}", b = f"找不到任何.md文件: {txt}")
 yield from update_ui(chatbot=chatbot, history=history) # 刷新界面
 
@@ -88,14 +88,14 @@ class NewBingHandle(Process):
 if a not in self.local_history:
 self.local_history.append(a)
 prompt += a + '\n'
-if b not in self.local_history:
-self.local_history.append(b)
-prompt += b + '\n'
+# if b not in self.local_history:
+# self.local_history.append(b)
+# prompt += b + '\n'
 
 # 问题
 prompt += question
 self.local_history.append(question)
+print('question:', prompt)
 # 提交
 async for final, response in self.newbing_model.ask_stream(
 prompt=question,
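In effect, after this change only user-side turns that have not been sent before are folded into the prompt, while the bot's earlier replies are no longer replayed. A minimal sketch of that assembly logic is below; how the surrounding loop pairs user and bot turns is assumed from the variable names (`a` for the user turn, `b` for the bot reply) and is not shown verbatim in the diff.

```python
def build_prompt(local_history, history_pairs, question):
    # history_pairs: assumed list of (user_turn, bot_reply) tuples from earlier exchanges.
    prompt = ""
    for a, b in history_pairs:
        if a not in local_history:      # only user turns that were not sent before
            local_history.append(a)
            prompt += a + '\n'
        # bot replies (b) are intentionally no longer appended after this change
    prompt += question                  # the new question goes last
    local_history.append(question)
    return prompt
```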
@@ -108,7 +108,8 @@ class NewBingHandle(Process):
 else:
 print('-------- receive final ---------')
 self.child.send('[Finish]')
+# self.local_history.append(response)
 
 
 def run(self):
 """

@@ -244,7 +245,7 @@ def predict(inputs, llm_kwargs, plugin_kwargs, chatbot, history=[], system_promp
 for response in newbing_handle.stream_chat(query=inputs, history=history_feedin, system_prompt=system_prompt, max_length=llm_kwargs['max_length'], top_p=llm_kwargs['top_p'], temperature=llm_kwargs['temperature']):
 chatbot[-1] = (inputs, preprocess_newbing_out(response))
 yield from update_ui(chatbot=chatbot, history=history, msg="NewBing响应缓慢,尚未完成全部响应,请耐心完成后再提交新问题。")
-history.extend([inputs, preprocess_newbing_out(response)])
+if response == "[Local Message]: 等待NewBing响应中 ...": response = "[Local Message]: NewBing响应异常,请刷新界面重试 ..."
+history.extend([inputs, response])
 yield from update_ui(chatbot=chatbot, history=history, msg="完成全部响应,请提交新问题。")
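The last hunk introduces a simple fallback: if streaming ends while the placeholder wait message is still the latest response, it is swapped for an error message before being stored in history. A condensed sketch of that pattern follows; the strings are copied from the diff, and the helper's name and signature are hypothetical.

```python
# Hypothetical helper isolating the fallback introduced above.
WAIT_MSG  = "[Local Message]: 等待NewBing响应中 ..."
ERROR_MSG = "[Local Message]: NewBing响应异常,请刷新界面重试 ..."

def finalize_response(response: str) -> str:
    # If the stream finished without ever replacing the placeholder,
    # surface an error instead of recording the placeholder in history.
    return ERROR_MSG if response == WAIT_MSG else response
```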
File: version (2)

@@ -1,5 +1,5 @@
 {
 "version": 3.3,
 "show_feature": true,
-"new_feature": "支持NewBing !! <-> 保存对话功能 <-> 解读任意语言代码+同时询问任意的LLM组合 <-> 添加联网(Google)回答问题插件 <-> 修复ChatGLM上下文BUG <-> 添加支持清华ChatGLM和GPT-4 <-> 改进架构,支持与多个LLM模型同时对话 <-> 添加支持API2D(国内,可支持gpt4)"
+"new_feature": "支持NewBing <-> Markdown翻译功能支持直接输入Readme文件网址 <-> 保存对话功能 <-> 解读任意语言代码+同时询问任意的LLM组合 <-> 添加联网(Google)回答问题插件 <-> 修复ChatGLM上下文BUG <-> 添加支持清华ChatGLM和GPT-4 <-> 改进架构,支持与多个LLM模型同时对话 <-> 添加支持API2D(国内,可支持gpt4)"
 }
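The version file is plain JSON, so comparing the local copy against a newer one takes only a few lines. A minimal sketch is below, assuming the file sits at the repository root and that the caller supplies the remote copy's text; this is illustrative only, not the project's actual update-check code.

```python
import json

def check_new_version(local_path="./version", remote_text='{"version": 3.3, "show_feature": true, "new_feature": "..."}'):
    # Both sides share the same schema: {"version": float, "show_feature": bool, "new_feature": str}.
    with open(local_path, "r", encoding="utf-8") as f:
        local = json.load(f)
    remote = json.loads(remote_text)
    if remote["version"] > local["version"]:
        return True, remote.get("new_feature", "")
    return False, ""
```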