# Installation
The package can be installed from PyPI.
## Quick Install

```sh
# CPU only / CoreML for Apple Silicon (simplest install, minimal native deps)
pip install onnx-asr[cpu,hub]

# With NVIDIA GPU support (requires installed CUDA/cuDNN/TensorRT)
pip install onnx-asr[gpu,hub]

# Using uv
uv pip install onnx-asr[cpu,hub]
```
## Requirements

### ONNX Runtime Packages
onnx-asr requires an ONNX Runtime package. There are several options depending on your OS and hardware:
| Package | Providers | Notes |
|---|---|---|
| `onnxruntime` | `CPUExecutionProvider`, `CoreMLExecutionProvider` | Default; works on all platforms |
| `onnxruntime-gpu` | `CPUExecutionProvider`, `CUDAExecutionProvider`, `TensorrtExecutionProvider` | For NVIDIA GPUs (requires NVIDIA libraries) |
| `onnxruntime-directml`, `onnxruntime-windowsml` (new) | `CPUExecutionProvider`, `DmlExecutionProvider` | DirectML; Windows only, but no additional dependencies |
| `onnxruntime-webgpu` (beta) | `CPUExecutionProvider`, `WebGpuExecutionProvider` | WebGPU; cross-platform, no additional dependencies |
Additional ONNX Runtime packages (not tested):

- `onnxruntime-trt-rtx` - NVIDIA TensorRT for RTX
- `onnxruntime-qnn` - Qualcomm Snapdragon
- `onnxruntime-openvino` - Intel OpenVINO
- `onnxruntime-rocm` - AMD GPUs (legacy)
- `onnxruntime-migraphx` - AMD GPUs (new)
- `onnxruntime-cann` - Huawei Ascend NPU
> **Note**
>
> Only `onnxruntime` and `onnxruntime-gpu` have predefined extras (`[cpu]` and `[gpu]`). Other packages can be installed manually.
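After installing one of the packages above, you can verify which execution providers your ONNX Runtime build actually exposes with `onnxruntime.get_available_providers()` (part of the standard `onnxruntime` API). The `pick_provider` helper and the preference order below are illustrative, not part of onnx-asr:

```python
def pick_provider(available, preferred):
    """Return the first provider from `preferred` that is available,
    falling back to CPUExecutionProvider (always present)."""
    for provider in preferred:
        if provider in available:
            return provider
    return "CPUExecutionProvider"


try:
    import onnxruntime

    available = onnxruntime.get_available_providers()
except ImportError:  # onnxruntime not installed yet
    available = ["CPUExecutionProvider"]

# Prefer GPU-backed providers when present (illustrative preference order)
print(pick_provider(available, ["CUDAExecutionProvider", "DmlExecutionProvider"]))
```

If the provider you expect (for example `CUDAExecutionProvider`) is missing from the list, the wrong ONNX Runtime package is installed or its native dependencies are not set up.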
### Supported Platforms
The supported platforms are primarily determined by available ONNX Runtime wheels.
| OS / CPU | Python 3.10 | Python 3.11 | Python 3.12 | Python 3.13 | Python 3.14 | Python 3.13t | Python 3.14t |
|---|---|---|---|---|---|---|---|
| Linux x86_64 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Linux Arm64 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Windows x86_64 | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ |
| Windows Arm64 | ❌ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ |
| macOS Arm64 (Apple Silicon) | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ |
| macOS x86_64 | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ |
### Minimum Dependency Versions
| Package | Minimum Version |
|---|---|
| numpy | 1.22.4 |
| onnxruntime | 1.18.1 (or any ONNX Runtime package) |
| huggingface-hub | 0.30.2 (optional, for model downloading) |
| typing-extensions | 4.6.0 (Python < 3.11 only) |
> **Note**
>
> Older versions of ONNX Runtime packages may work but are not tested. The minimum version listed is the one used in CI testing.
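A quick way to check your environment against the minimums above is to read installed versions with the standard-library `importlib.metadata`. The sketch below uses a crude numeric tuple comparison, not a full PEP 440 parser, which is sufficient for these plain `X.Y.Z` versions:

```python
from importlib.metadata import PackageNotFoundError, version

# Minimum versions from the table above
MINIMUMS = {"numpy": "1.22.4", "onnxruntime": "1.18.1", "huggingface-hub": "0.30.2"}


def as_tuple(v):
    """Crude version key: "1.22.4" -> (1, 22, 4). Not a full PEP 440 parser."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())


def meets_minimum(installed, minimum):
    return as_tuple(installed) >= as_tuple(minimum)


for package, minimum in MINIMUMS.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        print(f"{package}: not installed (need >= {minimum})")
        continue
    status = "OK" if meets_minimum(installed, minimum) else "too old"
    print(f"{package}: {installed} ({status}, need >= {minimum})")
```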
## Install from PyPI
- With CPU `onnxruntime` and `huggingface-hub`:

  ```sh
  pip install onnx-asr[cpu,hub]
  ```

- With `onnxruntime` for NVIDIA GPUs and `huggingface-hub`:

  ```sh
  pip install onnx-asr[gpu,hub]
  ```

  > **Warning**
  >
  > First, you need to install the required version of CUDA/cuDNN for `CUDAExecutionProvider` and the required TensorRT for `TensorrtExecutionProvider` (optional).

  You can also install the `onnxruntime` CUDA/cuDNN dependencies and TensorRT via pip:

  ```sh
  pip install onnxruntime-gpu[cuda,cudnn] tensorrt-cu12-libs
  ```

- With `onnxruntime` for WinML and `huggingface-hub`:

  ```sh
  pip install onnx-asr[hub] onnxruntime-windowsml
  ```

- Without `onnxruntime` and `huggingface-hub` (if you already have some version of `onnxruntime` installed and prefer to download the models yourself):

  ```sh
  pip install onnx-asr
  ```
## Install from source

To install the latest version of onnx-asr from source, use pip (or uv pip):

```sh
pip install git+https://github.com/istupakov/onnx-asr
```