binary-husky
9b0b2cf260
auto hide tooltip when scrolling down
2025-01-28 23:32:40 +08:00
binary-husky
9f39a6571a
feat: customized font & font size
2025-01-28 02:52:56 +08:00
memset0
d07e736214
fix unpacking
2025-01-25 00:00:13 +08:00
memset0
a1f7ae5b55
feat: add support for R1 model and display CoT
2025-01-24 14:43:49 +08:00
binary-husky
1213ef19e5
Merge branch 'master' of github.com:binary-husky/chatgpt_academic
2025-01-22 01:50:08 +08:00
binary-husky
aaafe2a797
fix xelatex font problem in the all-capacity Docker image
2025-01-22 01:49:53 +08:00
binary-husky
2716606f0c
Update README.md
2025-01-16 23:40:24 +08:00
binary-husky
286f7303be
fix image display bug
2025-01-12 21:54:43 +08:00
binary-husky
7eeab9e376
fix code block display bug
2025-01-09 22:31:59 +08:00
binary-husky
4ca331fb28
prevent html rendering for input
2025-01-05 21:20:12 +08:00
binary-husky
9487829930
change max_chat_preserve = 10
2025-01-03 00:34:36 +08:00
binary-husky
a73074b89e
upgrade chat checkpoint
2025-01-03 00:31:03 +08:00
Southlandi
fd93622840
Fix Gemini conversation error (case where the number of stop words is 0) ( #2092 )
2024-12-28 23:22:10 +08:00
whyXVI
09a82a572d
Fix RuntimeError in predict_no_ui_long_connection() ( #2095 )
...
Bug fix: Fix RuntimeError in predict_no_ui_long_connection()
In the original code, calling predict_no_ui_long_connection() would trigger a RuntimeError("OpenAI拒绝了请求:" + error_msg) even when the server responded normally. The issue occurred due to incorrect handling of SSE protocol comment lines (lines starting with ":").
Modified the parsing logic in both `predict` and `predict_no_ui_long_connection` to handle these lines correctly, making the logic more intuitive and robust.
2024-12-28 23:21:14 +08:00
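For context on the fix above: SSE servers may emit comment lines beginning with ":" (for example keep-alive pings), which a client should silently skip rather than treat as an error. Below is a minimal sketch of that parsing rule, assuming a `requests` streaming response; the function and variable names are illustrative and not the project's actual code.

```python
# Minimal sketch of SSE stream parsing that skips comment lines (": ...").
# Assumes `response` is a requests.Response opened with stream=True.
# Illustrative only, not the project's implementation.
import json

def iter_sse_payloads(response):
    """Yield decoded JSON payloads from an SSE stream."""
    for raw in response.iter_lines():
        if not raw:
            continue                       # blank keep-alive separator
        line = raw.decode("utf-8")
        if line.startswith(":"):
            continue                       # SSE comment line, e.g. ": ping" -- not an error
        if line.startswith("data: "):
            data = line[len("data: "):]
            if data.strip() == "[DONE]":
                break                      # OpenAI-style end-of-stream marker
            yield json.loads(data)
```

Treating only well-formed "data:" lines as payloads, and ":"-prefixed lines as comments, avoids raising on harmless keep-alive traffic.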
G.RQ
c53ddf65aa
Fix the error thrown by the "Reset" button ( #2102 )
...
* fix Reset button bug
* fix version control bug
---------
Co-authored-by: binary-husky <qingxu.fu@outlook.com>
2024-12-28 23:19:25 +08:00
binary-husky
ac64a77c2d
allow disabling openai proxy in WHEN_TO_USE_PROXY
2024-12-28 07:14:54 +08:00
binary-husky
dae8a0affc
compat bug fix
2024-12-25 01:21:58 +08:00
binary-husky
97a81e9388
fix temperature issue of o1
2024-12-25 00:54:03 +08:00
binary-husky
1dd1d0ed6c
fix cookie overflow bug
2024-12-25 00:33:20 +08:00
binary-husky
060af0d2e6
Merge branch 'master' of github.com:binary-husky/chatgpt_academic
2024-12-22 23:33:44 +08:00
binary-husky
a848f714b6
fix welcome card bugs
2024-12-22 23:33:22 +08:00
binary-husky
924f8e30c7
Update issue stale.yml
2024-12-22 14:16:18 +08:00
binary-husky
f40347665b
github action change
2024-12-22 14:15:16 +08:00
binary-husky
734c40bbde
fix non-localhost javascript error
2024-12-22 14:01:22 +08:00
binary-husky
4ec87fbb54
history ng patch 1
2024-12-21 11:27:53 +08:00
binary-husky
17b5c22e61
Merge branch 'master' of github.com:binary-husky/chatgpt_academic
2024-12-19 22:46:14 +08:00
binary-husky
c6cd04a407
promote the rank of DASHSCOPE_API_KEY
2024-12-19 22:39:14 +08:00
YIQI JIANG
f60a12f8b4
Add o1 and o1-2024-12-17 model support ( #2090 )
...
* Add o1 and o1-2024-12-17 model support
* patch api key selection
---------
Co-authored-by: 蒋翌琪 <jiangyiqi99@jiangyiqideMacBook-Pro.local>
Co-authored-by: binary-husky <qingxu.fu@outlook.com>
2024-12-19 22:32:57 +08:00
binary-husky
8413fb15ba
optimize welcome page
version3.91
2024-12-18 23:35:25 +08:00
binary-husky
72b2ce9b62
ollama patch
2024-12-18 23:05:55 +08:00
binary-husky
f43ef909e2
roll version to 3.91
2024-12-18 22:56:41 +08:00
binary-husky
9651ad488f
Merge branch 'master' into frontier
2024-12-18 22:27:12 +08:00
binary-husky
81da7bb1a5
remove welcome card when layout overflows
2024-12-18 17:48:02 +08:00
binary-husky
4127162ee7
add tts test
2024-12-18 17:47:23 +08:00
binary-husky
98e5cb7b77
update readme
2024-12-09 23:57:10 +08:00
binary-husky
c88d8047dd
move cookie storage to local storage
2024-12-09 23:52:02 +08:00
binary-husky
e4bebea28d
update requirements
2024-12-09 23:40:23 +08:00
YE Ke 叶柯
294df6c2d5
Add ChatGLM4 local deployment support and refactor ChatGLM bridge's path configuration ( #2062 )
...
* ✨ feat(request_llms and config.py): ChatGLM4 Deployment
Add support for local deployment of ChatGLM4 model
* 🦄 refactor(bridge_chatglm3.py): ChatGLM3 model path
Added ChatGLM3 path customization (in config.py).
Removed unused quantization model options that had been commented out
---------
Co-authored-by: MarkDeia <17290550+MarkDeia@users.noreply.github.com>
2024-12-07 23:43:51 +08:00
Zhenhong Du
239894544e
Add support for grok-beta model from x.ai ( #2060 )
...
* Update config.py
add support for `grok-beta` model
* Update bridge_all.py
add support for `grok-beta` model
2024-12-07 23:41:53 +08:00
Menghuan
ed5fc84d4e
Add environment packaging for Windows and a one-click launch script ( #2068 )
...
* Add automatic packaging of environment dependencies under Windows
---------
Co-authored-by: binary-husky <qingxu.fu@outlook.com>
2024-12-07 23:41:02 +08:00
Menghuan
e3f84069ee
Improve Doc2X requests and add support for xelatex compilation ( #2058 )
...
* Clean up the format of the doc2x request functions
* Update the intermediate part
* Add a doc2x timeout setting and support for xelatex compilation
* Fix bugs and add detection of the xelatex installation
* Improve stability under weak network conditions
* Fix the issue where underscores in model names could not be displayed
* add xelatex logs
---------
Co-authored-by: binary-husky <qingxu.fu@outlook.com>
2024-12-07 23:23:59 +08:00
binary-husky
7bf094b6b6
remove
2024-12-07 22:43:03 +08:00
binary-husky
57d7dc33d3
sync common.js
2024-12-07 17:10:01 +08:00
binary-husky
94ccd77480
remove gen restore btn
2024-12-07 16:22:29 +08:00
binary-husky
48e53cba05
update gradio
2024-12-07 16:18:05 +08:00
binary-husky
e9a7f9439f
upgrade gradio
2024-12-07 15:59:30 +08:00
binary-husky
a88b119bf0
change urls
2024-12-05 22:13:59 +08:00
binary-husky
eee8115434
add a config note
2024-12-04 23:55:22 +08:00
binary-husky
4f6a272113
remove keyword extraction
2024-12-04 01:33:31 +08:00
binary-husky
cf3dd5ddb6
add fallback-on-failure option for media plugin
2024-12-04 01:06:12 +08:00