2 changes: 1 addition & 1 deletion README.md
@@ -684,7 +684,7 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing

<sup>1</sup> The accuracy metric is the End-to-End Evaluation Overall score of OmniDocBench (v1.5), tested on the latest `MinerU` version.
<sup>2</sup> Only Linux distributions released in 2019 or later are supported.
<sup>3</sup> MLX requires macOS 13.5 or later; macOS 14.0 or higher is recommended.
<sup>4</sup> On Windows, vLLM is supported via WSL2 (Windows Subsystem for Linux).
<sup>5</sup> Servers compatible with the OpenAI API, such as local or remote model services deployed via inference frameworks like `vLLM`, `SGLang`, or `LMDeploy`.
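To illustrate footnote 5, here is a minimal sketch of pointing the standard `openai` Python client at a locally deployed OpenAI-compatible endpoint (for example, one started with `vllm serve`). The base URL, API key, and model name below are placeholder assumptions, not values taken from this repository:

```python
# Minimal sketch: query an OpenAI-compatible server started locally, e.g.
#   vllm serve <model-name> --port 8000
# The base_url, api_key, and model name are placeholders (assumptions).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local vLLM/SGLang/LMDeploy endpoint
    api_key="EMPTY",                      # local servers typically accept any key
)

response = client.chat.completions.create(
    model="your-model-name",              # placeholder model identifier
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```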

4 changes: 2 additions & 2 deletions README_zh-CN.md
@@ -670,8 +670,8 @@ https://github.com/user-attachments/assets/4bea02c9-6d54-4cd6-97ed-dff14340982c
</table>

<sup>1</sup> The accuracy metric is the End-to-End Evaluation Overall score of OmniDocBench (v1.5), tested on the latest `MinerU` version.
<sup>2</sup> Only Linux distributions released in 2019 or later are supported.
<sup>3</sup> MLX requires macOS 13.5 or later; macOS 14.0 or higher is recommended.
<sup>4</sup> On Windows, vLLM is supported via WSL2 (Windows Subsystem for Linux).
<sup>5</sup> Servers compatible with the OpenAI API, such as local model servers or remote model services deployed via inference frameworks like `vLLM`, `SGLang`, or `LMDeploy`.

2 changes: 1 addition & 1 deletion mineru/version.py
@@ -1 +1 @@
__version__ = "2.6.2"
__version__ = "2.6.3"