Install ONNX Runtime for Radeon GPUs#
Overview#
Ensure that the following prerequisite installations are successful before proceeding to install ONNX Runtime for use with ROCm™ on Radeon™ GPUs.
Prerequisites#
Radeon Software for Linux (with ROCm) is installed.
MIGraphX is installed. This enables ONNX Runtime to build the correct MIGraphX Execution Provider (EP).
The half library is installed. See Verify if MIGraphX is installed with the half library.
If the prerequisite installations are successful, proceed to install ONNX Runtime.
NOTE Unless adding custom features, use the pre-built Python wheel files provided in the PIP installation method.
Verify MIGraphX installation#
Verify if MIGraphX is installed with the half library
```
$ dpkg -l | grep migraphx
$ dpkg -l | grep half
```
Expected result:
```
root@aus-navi3x-02:/workspace/AMDMIGraphX# dpkg -l | grep migraphx
ii  migraphx        2.9.0   amd64   AMD's graph optimizer
ii  migraphx-dev    2.9.0   amd64   AMD's graph optimizer
ii  migraphx-tests  2.9.0   amd64   AMD's graph opt
$ dpkg -l | grep half
ii  half  1.12.0.60000-91~20.04  amd64  HALF-PRECISION FLOATING POINT LIBRARY
```
Note
Versions may vary between ROCm builds and installed versions of MIGraphX, but the desired result is the same. The half library should come packaged with MIGraphX. If not, it can be installed with the following command:
```
sudo apt install half
```
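If you are scripting these prerequisite checks, the `dpkg -l` output shown above can be parsed with a small helper. This is a sketch, assuming the standard `ii <name> <version>` column layout from the expected result; the `installed_packages` helper and the sample string are illustrative, not part of any AMD tooling.

```python
def installed_packages(dpkg_output: str) -> dict:
    """Map package name -> version for lines dpkg marks as installed ('ii')."""
    packages = {}
    for line in dpkg_output.splitlines():
        fields = line.split()
        if len(fields) >= 3 and fields[0] == "ii":
            packages[fields[1]] = fields[2]
    return packages

# Illustrative sample mimicking the `dpkg -l | grep` output above.
sample = """\
ii  migraphx  2.9.0                  amd64  AMD's graph optimizer
ii  half      1.12.0.60000-91~20.04  amd64  HALF-PRECISION FLOATING POINT LIBRARY"""

pkgs = installed_packages(sample)
print("half" in pkgs and "migraphx" in pkgs)  # True when both are installed
```

The same function can be fed the real output of `dpkg -l | grep migraphx` to fail fast when a prerequisite is missing.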
Perform a simple inference with MIGraphX to verify the installation.
```
/opt/rocm-6.2.3/bin/migraphx-driver perf --test
```
Install ONNX Runtime#
Important!
Use the provided pre-built Python wheel files from the PIP installation method, unless adding custom features.
The wheel file contains the MIGraphX and ROCm Execution Providers (EP). Refer to Install MIGraphX for ONNX RT for more information.
Refer to ONNX Runtime Documentation for additional information on ONNX Runtime topics.
See ONNX Runtime Tutorials to try out real applications and tutorials on how to get started.
Option A: ONNX Runtime install via PIP installation method (Recommended)
AMD recommends the PIP install method to create an ONNX Runtime environment when working with ROCm for machine learning development.
Note
The latest version of the numpy Python module (v2.0) is incompatible with the ONNX Runtime wheels for this release. Downgrading to an older version is required, for example:

```
pip3 install numpy==1.26.4
```
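The version constraint above can also be checked programmatically before creating an inference session. A minimal sketch, comparing only the major version; the `is_numpy_compatible` helper is illustrative and not part of ONNX Runtime:

```python
def is_numpy_compatible(version: str) -> bool:
    """Return True if this numpy version works with the ROCm ONNX Runtime
    wheels for this release, which require numpy older than 2.0."""
    major = int(version.split(".")[0])
    return major < 2

print(is_numpy_compatible("1.26.4"))  # True
print(is_numpy_compatible("2.0.0"))   # False
```

In practice you would pass `numpy.__version__` to the helper and downgrade as shown above if it returns `False`.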
To install via PIP, enter the following commands to remove any previously installed ONNX Runtime package and install the ROCm-enabled wheel:

```
pip3 uninstall onnxruntime-rocm
pip3 install onnxruntime-rocm -f https://repo.radeon.com/rocm/manylinux/rocm-rel-6.2.3/
```
Option B: Build from source for your environment, followed by local wheel file installation (Advanced)
Use this method for advanced customization use cases. It requires installing the desired ROCm and MIGraphX versions, and creating a softlink, before starting the build.
NOTE The build typically takes ~45 minutes.
Prerequisites to build ONNX from source
Radeon Software for Linux (with ROCm) is installed
MIGraphX is installed
Softlink is created
To create a softlink for /opt/rocm, enter the following command:

```
ln -s /opt/rocm* /opt/rocm
```
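The effect of the `ln -s` command can be illustrated in isolation. The sketch below uses a temporary directory and a hypothetical `rocm-6.2.3` folder name to show that the unversioned path resolves to the versioned install directory:

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as opt:
    # Stand-in for a versioned install such as /opt/rocm-6.2.3.
    versioned = os.path.join(opt, "rocm-6.2.3")
    os.mkdir(versioned)

    # Equivalent of `ln -s /opt/rocm-6.2.3 /opt/rocm`.
    link = os.path.join(opt, "rocm")
    os.symlink(versioned, link)

    # The unversioned path now resolves to the versioned directory.
    print(os.path.realpath(link) == os.path.realpath(versioned))  # True
```

The build scripts expect the unversioned `/opt/rocm` path, which is why the softlink must exist before the build starts.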
To build from source,
Clone the ONNX Runtime repository into the root directory.
```
cd /
git clone https://github.com/microsoft/onnxruntime.git
```
Clone the AMDMIGraphX repository into the home folder.
```
cd ~
git clone https://github.com/ROCm/AMDMIGraphX.git
```
Create a docker image for MIGraphX.
Note
Refer to AMDMIGraphX Github for up-to-date ONNX Runtime and MIGraphX dependencies.
MIGraphX can still be built or installed from apt.
For MIGraphX package builds via RBuild, refer to these MIGraphX GitHub instructions to build within the Docker container environment.
For MIGraphX package builds via CMake, refer to these MIGraphX GitHub instructions to build within the Docker container environment.
Use the `groups` command to ensure that the user is part of the video, render, and docker groups in Linux to run the Docker container.

```
$ groups
tthemist sudo video render docker
```
Run the following for a simple MIGraphX apt install:
```
cd AMDMIGraphX
docker build -t migraphx .
docker run --device='/dev/kfd' --device='/dev/dri' -v=`pwd`:/code/AMDMIGraphX -v /onnxruntime:/onnxruntime -w /code/AMDMIGraphX --group-add video -it migraphx
apt install migraphx migraphx-dev half
```
Run rocm-smi to ensure that ROCm is installed and detects the supported GPU(s).
```
$ rocm-smi
```
Expected result:
```
======================================= ROCm System Management Interface =======================================
================================================= Concise Info =================================================
Device  [Model : Revision]  Temp    Power  Partitions      SCLK     MCLK   Fan     Perf  PwrCap  VRAM%  GPU%
        Name (20 chars)     (Edge)  (Avg)  (Mem, Compute)
================================================================================================================
0       [0x0e0d : 0x00]     32.0°C  73.0W  N/A, N/A        1526Mhz  96Mhz  31.76%  auto  241.0W  0%     50%
        0x7448
================================================================================================================
============================================= End of ROCm SMI Log ==============================================
```
Configure Git to treat all directories as safe to use and run the build script.
```
cd AMDMIGraphX
git config --global --add safe.directory "*"
tools/build_and_test_onnxrt.sh
```
This builds ONNX Runtime with ROCm and MIGraphX EP support added to the ONNX Runtime interface; several external repositories are checked out automatically before the build.
Install ONNX Runtime once MIGraphX is built.
```
$ pip3 install /onnxruntime/build/Linux/Release/dist/*.whl
```
Verify ONNX Runtime installation#
Verify that the install works correctly by performing a simple inference with MIGraphX.
```
python3 -c "import onnxruntime as ort; print(ort.get_available_providers())"
```
Expected result: The following EPs are displayed.

```
>>> import onnxruntime as ort
>>> ort.get_available_providers()
['MIGraphXExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider']
```
This indicates that the MIGraphXExecutionProvider and ROCMExecutionProvider are now available on the system, and the proper ONNX Runtime package has been installed.
Installation is complete and ONNX Runtime is available through the Python interface library, as well as scripts that invoke ONNX Runtime inference sessions.
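In scripts, it is common to prefer the GPU-backed EPs and fall back toward CPU when they are unavailable. A minimal sketch; the `pick_providers` helper and its priority order are assumptions for illustration, not an ONNX Runtime API. In real code, the resulting list would be passed to `ort.InferenceSession(model_path, providers=...)`.

```python
PREFERRED_ORDER = [
    "MIGraphXExecutionProvider",  # graph-optimized EP for Radeon GPUs
    "ROCMExecutionProvider",      # generic ROCm EP
    "CPUExecutionProvider",       # always-available fallback
]

def pick_providers(available: list) -> list:
    """Return the available providers sorted into preferred order, so that
    session creation tries MIGraphX first and falls back toward CPU."""
    chosen = [p for p in PREFERRED_ORDER if p in available]
    if not chosen:
        raise RuntimeError("no known execution provider available")
    return chosen

print(pick_providers(["CPUExecutionProvider", "MIGraphXExecutionProvider"]))
# ['MIGraphXExecutionProvider', 'CPUExecutionProvider']
```

On a correctly installed system, `pick_providers(ort.get_available_providers())` would place the MIGraphX EP first, matching the expected result shown above.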
For more information on the ONNX Runtime Python library, refer to Get started with ONNX Runtime in Python.