DrSmoothl
742e21a727
feat: Add LLM Provider support in plugin runtime
...
- Introduced LLM Provider declarations in plugin manifests, allowing plugins to specify their LLM capabilities.
- Implemented validation for LLM Provider declarations to prevent duplicates and conflicts.
- Enhanced the PluginRunner to handle LLM Provider invocation requests, enabling plugins to interact with LLM Providers seamlessly.
- Added a ClientRegistry to manage LLM Provider registrations and ensure no conflicts arise between different plugins.
- Created a PluginLLMClient to facilitate communication with LLM Providers through the plugin runtime.
- Developed tests to ensure proper registration and conflict handling of LLM Providers.
2026-04-27 16:49:44 +08:00
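The conflict handling described above (a ClientRegistry that rejects duplicate LLM Provider registrations across plugins) could be sketched as follows. All names here (`ClientRegistry`, `ProviderConflictError`, `register`, `resolve`) are illustrative assumptions, not the actual API from this commit:

```python
class ProviderConflictError(Exception):
    """Raised when two different plugins claim the same provider name."""


class ClientRegistry:
    """Minimal sketch of a registry that prevents LLM Provider name conflicts."""

    def __init__(self) -> None:
        self._providers: dict[str, str] = {}  # provider name -> owning plugin id

    def register(self, plugin_id: str, provider_name: str) -> None:
        # Re-registration by the same plugin is idempotent; a different
        # plugin claiming the same name is a conflict.
        owner = self._providers.get(provider_name)
        if owner is not None and owner != plugin_id:
            raise ProviderConflictError(
                f"provider {provider_name!r} already registered by {owner!r}"
            )
        self._providers[provider_name] = plugin_id

    def resolve(self, provider_name: str) -> str:
        """Return the plugin id that owns a provider name (KeyError if absent)."""
        return self._providers[provider_name]
```

A PluginLLMClient would then look up the owning plugin via `resolve()` before routing an invocation request through the runtime.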
SengokuCola
d32be4741a
feat: add in-order selection; fix: resolve unexpected tool issue in the timing gate
2026-04-27 10:53:13 +08:00
SengokuCola
9759018a0c
feat: support model caching and related configuration
2026-04-25 13:53:30 +08:00
SengokuCola
d9b3440169
feat: improve configuration for multimodal/non-multimodal replyers
2026-04-11 19:30:23 +08:00
SengokuCola
243b8deb43
feat: display more detailed tool information; revise the wait definition
2026-04-09 19:58:20 +08:00
SengokuCola
09bce14664
feat: archive failed requests and provide retry analysis
2026-04-07 15:16:06 +08:00
SengokuCola
50a51757a8
feat: add must-reply on mention, auto-retry for some oversized payloads, remove unused config options, and correctly parse at-messages
2026-04-07 01:31:58 +08:00
SengokuCola
6c720e0403
ref: refactor maisaka built-in tool logic and split files
2026-04-03 14:51:05 +08:00
SengokuCola
d713aa9576
feat: display real-time context usage; remove the old memory system
2026-04-01 13:18:17 +08:00
SengokuCola
503a257d66
remove: unused configuration
2026-04-01 13:06:01 +08:00
SengokuCola
01ef29aadb
feat: refactor maisaka message types; add interruption support
2026-03-30 00:45:41 +08:00
DrSmoothl
7a460a474d
feat: update multiple files to use SessionMessage in place of MaiMessage, and adjust related logic
2026-03-28 13:39:48 +08:00
DrSmoothl
777d4cb0d2
feat: Enhance OpenAI compatibility and introduce unified LLM service data models
...
- Refactored model fetching logic to support various authentication methods for OpenAI-compatible APIs.
- Introduced new data models for LLM service requests and responses to standardize interactions across layers.
- Added an adapter base class for unified request execution across different providers.
- Implemented utility functions for building OpenAI-compatible client configurations and request overrides.
2026-03-26 16:15:42 +08:00
SengokuCola
a5fc4d172d
feat: provide native VLM support
2026-03-24 20:57:57 +08:00
DrSmoothl
2a33fd1121
refactor(llm): enable hot-reload for model config and client runtime
...
make LLM task config resolution dynamic in LLMRequest
load model clients on demand from latest config
clear client instance cache on config reload
remove stale module-level model_config usage in llm_api
add hot-reload tests for LLM/config watcher flow
2026-03-04 21:56:50 +08:00
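The hot-reload scheme above (build clients on demand from the latest config, clear the instance cache on reload) can be sketched as a small cache. This is a minimal illustration under assumed names; `load_config` and `make_client` stand in for the project's real config loader and client factory:

```python
class ClientCache:
    """Sketch: build model clients lazily from the latest config; drop them on reload."""

    def __init__(self, load_config, make_client):
        self._load_config = load_config  # callable returning the current config mapping
        self._make_client = make_client  # hypothetical client factory
        self._clients = {}

    def get(self, model_name: str):
        # Build lazily so a reloaded config is picked up on the next use.
        if model_name not in self._clients:
            cfg = self._load_config()[model_name]
            self._clients[model_name] = self._make_client(cfg)
        return self._clients[model_name]

    def clear(self) -> None:
        # Invoked by the file watcher after the model config changes;
        # the next get() rebuilds each client from fresh config.
        self._clients.clear()
```

The key property is that no module holds a stale client reference: everything goes through `get()`, and `clear()` is the single invalidation point wired to the config watcher.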
DrSmoothl
5cccdf6715
feat(llm): add response-format conversion with JSON_SCHEMA output support
2026-03-04 21:11:10 +08:00
DrSmoothl
dc36542403
Add the file-watcher foundation module, refactor the model request module onto the new config hot-reload module, and add the watchfiles dependency
2026-02-14 21:17:24 +08:00
UnCLAS-Prommer
3a66bfeac1
Restore usability
2026-01-16 23:03:45 +08:00
墨梓柒
7bdd394bf0
Bring PFC back and fix a pile of mysterious PFC errors
2026-01-16 03:36:25 +08:00
UnCLAS-Prommer
77725ba9d8
Incrementally adapt to the new config
2026-01-15 23:51:19 +08:00
SengokuCola
0debe0efcf
log: improve model error logging
2026-01-12 19:05:30 +08:00
SengokuCola
f92136bffc
feat: model selection can now use a fully random strategy
...
Update model_config_template.toml
2025-12-27 17:34:26 +08:00
墨梓柒
e680a4d1f5
Ruff format
2025-12-13 17:14:09 +08:00
墨梓柒
12bc661790
feat: add model-level max-token configuration and update related logic for priority handling
2025-12-03 11:45:15 +08:00
墨梓柒
d97c6aa948
feat: add model-level temperature configuration and improve temperature priority handling
2025-12-03 10:45:20 +08:00
Ronifue
6470d27270
feat: warn uniformly about slow models within a task, with per-task slow-request thresholds set in model_config.toml
2025-11-30 01:23:29 +08:00
Ronifue
e6d1a6e87b
feat: include the original error in all LLM request exceptions
2025-11-29 20:22:25 +08:00
Ronifue
79e8962f6f
feat: allow model_info.extra_params to set a model's temperature individually
2025-11-29 18:15:46 +08:00
墨梓柒
3935ce817e
Ruff Fix & format
2025-11-29 14:38:42 +08:00
Ronifue
a58c54d378
feat: more detailed model API request error messages, plus advice for the common case of frequent API request timeouts
2025-11-27 18:48:51 +08:00
墨梓柒
44f427dc64
Ruff fix
2025-11-19 23:35:14 +08:00
SengokuCola
d306e40db0
Merge branch 'dev' of https://github.com/Mai-with-u/MaiBot into dev
2025-11-13 19:00:59 +08:00
SengokuCola
f2819be5e9
feat: lpmm can optionally use the memory agent; convert the memory agent to the standard tool format; update llm_utils for compatibility
2025-11-13 18:55:37 +08:00
墨梓柒
7839acd25d
Ruff fix
2025-11-13 13:24:55 +08:00
SengokuCola
f3f7b10fb6
Merge branch 'dev' of https://github.com/Mai-with-u/MaiBot into dev
2025-11-03 22:42:06 +08:00
SengokuCola
3e5058eb0f
fix: improve memory extraction; add a detailed prompt-debug option
2025-11-03 22:41:21 +08:00
exynos
b63057edec
fix(model_utils): HTTP 400 no longer aborts all attempts; keep switching models
2025-11-02 17:33:58 +08:00
墨梓柒
e9a5488b62
Ruff Fix
2025-09-28 00:02:18 +08:00
SengokuCola
20013a1a2c
log: adjust some log messages
2025-09-25 19:07:35 +08:00
UnCLAS-Prommer
fad8b82d8b
Fix load balancing
2025-09-23 20:44:48 +08:00
UnCLAS-Prommer
1260a11b78
fix typing of utils_model.py
2025-09-17 15:59:02 +08:00
google-labs-jules[bot]
01b06ed302
fix: remove unused exceptions in favor of RespNotOkException with the status code, and move 429 and 5xx handling from "hard failure" back to "retryable"
2025-09-12 18:50:10 +08:00
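The status-code policy in the commit above (429 and 5xx retryable, other 4xx hard failures) amounts to a small classifier. A minimal sketch, with the function name assumed for illustration:

```python
def is_retryable(status_code: int) -> bool:
    """Classify an HTTP status per the policy above: 429 (rate limit)
    and 5xx (server errors) are transient and worth retrying; other
    4xx codes indicate a bad request and fail immediately."""
    return status_code == 429 or 500 <= status_code < 600
```

Centralizing this check means a single RespNotOkException carrying the status code is enough for callers to decide between retrying and surfacing the error.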
SengokuCola
a4285673aa
feat: switch to a single planner that parses multiple actions
2025-09-11 14:25:02 +08:00
UnCLAS-Prommer
82e5a710c3
Make action's reply_message a data model; maintain typing and improve stability
2025-08-28 23:44:14 +08:00
google-labs-jules[bot]
6483955919
fix(llm): Add retry mechanism for empty API responses
2025-08-25 19:41:13 +08:00
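An empty-response retry loop of the kind this commit describes can be sketched in a few lines. This is an illustrative shape only; the function name and retry count are assumptions:

```python
def request_with_empty_retry(call, max_retries: int = 3) -> str:
    """Invoke a model call, retrying when the API returns an empty response."""
    for _ in range(max_retries):
        response = call()
        if response:  # non-empty content: success
            return response
    raise RuntimeError(f"empty response after {max_retries} attempts")
```

In practice a short backoff between attempts would be added so the retries do not hammer an already-struggling endpoint.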
UnCLAS-Prommer
a68c68cbe9
Slightly improve the empty-reply warning
2025-08-21 00:48:24 +08:00
墨梓柒
fab4656185
Improve async handling to avoid event-loop issues and strengthen error logging
2025-08-19 17:05:07 +08:00
SengokuCola
268b428e8f
feat: LLM statistics now record model response time
2025-08-11 21:51:59 +08:00
UnCLAS-Prommer
41e8966ae7
More events
2025-08-09 17:33:24 +08:00
UnCLAS-Prommer
d65f90ee49
Add a caching layer to improve performance
2025-08-09 11:40:29 +08:00