Update README.md
For the complete language and dialect list, see [languages.md](./languages.md).
| Device | Support |
|:-------------:|:----------------:|
| **CUDA** | ✅ Supported |
| **MPS (Apple)** | ✅ Supported |
| **Ascend NPU (Huawei)** | ✅ Supported |
| **CPU** | ✅ Supported |
To run Dolphin on Ascend NPU, install the corresponding `torch_npu` package and set the environment variable `ASCEND_RT_VISIBLE_DEVICES`. The tested configuration is `CANN==8.0.1`, `torch==2.2.0`, `torch_npu==2.2.0`; with this setup, the model has been verified to run inference correctly on the Ascend NPU.
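The setup above can be sketched as follows. This is a minimal example, not an official install script: it assumes CANN 8.0.1 is already installed on the host and that you want to expose only the first NPU (device index `0` is an arbitrary choice).

```shell
# Install the tested PyTorch and Ascend adapter versions
# (assumes CANN==8.0.1 is already installed on the system)
pip install torch==2.2.0 torch_npu==2.2.0

# Make only NPU 0 visible to the process; use a comma-separated
# list (e.g. "0,1") to expose multiple NPUs
export ASCEND_RT_VISIBLE_DEVICES=0
```

After this, PyTorch tensors and models can be moved to the Ascend device in the usual way once `torch_npu` has been imported.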
## Usage