**MindNLP** stands for **MindSpore + Natural Language Processing**, representing seamless compatibility with the HuggingFace ecosystem. MindNLP enables you to leverage the best of both worlds: the rich HuggingFace model ecosystem and MindSpore's powerful acceleration capabilities.
## Table of Contents

- [MindNLP](#-mindnlp)
- [Table of Contents](#table-of-contents)
- [Features ✨](#features-)
- [Installation](#installation)
### 1. 🤗 Full HuggingFace Compatibility
MindNLP provides seamless compatibility with the HuggingFace ecosystem, enabling you to run any Transformers/Diffusers models on MindSpore across all hardware platforms (GPU/Ascend/CPU) without code modifications.
#### Direct HuggingFace Library Usage
You can directly use native HuggingFace libraries (transformers, diffusers, etc.).
> **Note**: Due to differences in autograd and parallel execution mechanisms, any training or distributed execution code must utilize the interfaces provided by MindNLP.
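To build intuition for how "importing mindnlp first" can make native HuggingFace libraries run unmodified, here is a hypothetical, simplified sketch (not MindNLP's actual implementation) of module proxying in Python: a package registers its own module object in `sys.modules`, so later imports of that name resolve to the registered backend. The module name `demo_torch` below is purely illustrative.

```python
import sys
import types

# Build a stand-in backend module with one "operator".
backend = types.ModuleType("demo_torch")
backend.add = lambda a, b: a + b  # stand-in for a backend kernel

# Register it under a chosen name; any later `import demo_torch`
# now resolves to this object instead of a real package.
sys.modules["demo_torch"] = backend

import demo_torch

print(demo_torch.add(2, 3))  # prints 5
```

This is the general mechanism Python offers for redirecting an import to another implementation; the real proxy layer is considerably more involved.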
### 2. ⚡ High-Performance Features Powered by MindSpore
MindNLP leverages MindSpore's powerful capabilities to deliver exceptional performance and unique features:
#### PyTorch-Compatible API with MindSpore Acceleration
MindNLP provides `mindtorch` (accessible via `mindnlp.core`) for PyTorch-compatible interfaces, enabling seamless migration from PyTorch code while benefiting from MindSpore's acceleration on Ascend hardware:
```python
import mindnlp  # Automatically enables proxy for torch APIs
import torch
from torch import nn

# All torch.xx APIs are automatically mapped to mindnlp.core.xx (via mindtorch)
```
MindNLP extends MindSpore with several advanced features for better model development:
1. **Dispatch Mechanism**: Operators are automatically dispatched to the appropriate backend based on `Tensor.device`, enabling seamless multi-device execution.
2. **Meta Device Support**: Perform shape inference and memory planning without actual computations, significantly speeding up model development and debugging.

These features enable better support for model serialization and heterogeneous computing.
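The device-based dispatch idea above can be illustrated with a minimal, hypothetical sketch (not MindNLP's real dispatcher): each operator keeps a table of per-device kernel implementations and selects one from the input tensor's `device` attribute. All names here (`Tensor`, `_ADD_KERNELS`, `add`) are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Tensor:
    data: list
    device: str  # e.g. "cpu", "npu"

# One implementation table per operator, keyed by device type.
_ADD_KERNELS = {
    "cpu": lambda a, b: [x + y for x, y in zip(a, b)],
    "npu": lambda a, b: [x + y for x, y in zip(a, b)],  # would call an NPU kernel
}

def add(a: Tensor, b: Tensor) -> Tensor:
    if a.device != b.device:
        raise ValueError("tensors must be on the same device")
    kernel = _ADD_KERNELS[a.device]  # dispatch on Tensor.device
    return Tensor(kernel(a.data, b.data), a.device)

x = Tensor([1, 2], "cpu")
y = Tensor([3, 4], "cpu")
print(add(x, y).data)  # prints [4, 6]
```

A real dispatcher also handles mixed-device inputs, fallbacks, and kernel registration, but the lookup-by-device core is the same.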
#### Install from PyPI
You can install the official version of MindNLP, which is uploaded to PyPI.
```bash
pip install mindnlp
```
#### Daily build
You can download MindNLP daily wheel from [here](https://repo.mindspore.cn/mindspore-lab/mindnlp/newest/any/).
Since there are too many supported models to list here, please check [here](https://mindnlp.cqu.ai/supported_models).
<!-- ## Tutorials
## MindSpore NLP SIG
MindSpore NLP SIG (Natural Language Processing Special Interest Group) is the main development team of the MindNLP framework. It aims to collaborate with developers from both industry and academia who are interested in research, application development, and the practical implementation of natural language processing. Our goal is to create the best NLP framework based on the domestic framework MindSpore. Additionally, we regularly hold NLP technology sharing sessions and offline events. Interested developers can join our SIG group using the QR code below.