Install MIGraphX for Radeon GPUs#

MIGraphX is AMD’s graph inference engine for accelerating machine learning model inference. It can be used to accelerate workloads through two backend frameworks:

  • Torch-MIGraphX, which integrates MIGraphX with PyTorch

  • MIGraphX for ONNX Runtime backend, which integrates MIGraphX with ONNX Runtime

    ONNX Runtime can be driven by either the ROCm™ Execution Provider (EP) or the MIGraphX EP, as sketched below.
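
As a brief illustration of the second integration path, the following sketch runs an ONNX model through ONNX Runtime with the MIGraphX EP selected, falling back to the CPU provider. The model path is a placeholder; see Install and verify MIGraphX for ONNX Runtime below for installation instructions.

    # Sketch: select the MIGraphX Execution Provider in an ONNX Runtime session.
    # "model.onnx" is a placeholder path.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession(
        "model.onnx",
        providers=["MIGraphXExecutionProvider", "CPUExecutionProvider"],
    )

    # Build a random input that matches the model's first input.
    inp = session.get_inputs()[0]
    shape = [d if isinstance(d, int) else 1 for d in inp.shape]  # resolve dynamic dims
    data = np.random.rand(*shape).astype(np.float32)

    outputs = session.run(None, {inp.name: data})
    print(outputs[0].shape)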

Introduction to MIGraphX#

MIGraphX is a graph optimizer that accelerates inference for deep learning models. It provides C++ and Python APIs that can be integrated into frameworks such as Torch-MIGraphX and ONNX Runtime, as well as other user solutions. The following summarizes what happens under the hood during the optimization and runtime compilation process.

MIGraphX accelerates machine learning models by applying several graph-level transformations and optimizations, including:

  • Operator fusion

  • Arithmetic simplifications

  • Dead-code elimination

  • Common subexpression elimination (CSE)

  • Constant propagation

Once these optimizations are applied, MIGraphX emits code for the AMD GPU by calling MIOpen or rocBLAS, or by creating HIP kernels for a particular operator. MIGraphX can also target CPUs using the DNNL or ZenDNN libraries.
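
To make this flow concrete, here is a minimal sketch using the MIGraphX Python API: it parses an ONNX model, compiles it for the GPU target (which applies the optimizations above and lowers operators to library calls or HIP kernels), and runs inference on generated inputs. The model filename is a placeholder.

    # Minimal MIGraphX Python API sketch; "model.onnx" is a placeholder path.
    import migraphx

    # Parse the ONNX model into a MIGraphX program (graph IR).
    prog = migraphx.parse_onnx("model.onnx")

    # Compile for the GPU target; this is where the graph-level optimizations
    # run and operators are lowered to MIOpen/rocBLAS calls or HIP kernels.
    prog.compile(migraphx.get_target("gpu"))

    # Generate arguments that match the program's parameter shapes and run.
    params = {}
    for name, shape in prog.get_parameter_shapes().items():
        params[name] = migraphx.generate_argument(shape)

    results = prog.run(params)
    print(results[0])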

For more information on how to install MIGraphX, refer to the AMD MIGraphX GitHub repository.

Prerequisites#

MIGraphX requires a working ROCm installation. Ensure that ROCm and the AMD Radeon driver are installed on the system before proceeding.

Install MIGraphX#

Install MIGraphX on your system by running the following command:

$ sudo apt install migraphx
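
Optionally, you can sanity-check the installation from Python. This is a hedged sketch: it assumes the package's Python bindings are discoverable (for example, by adding /opt/rocm/lib to PYTHONPATH); the exact module location can vary by ROCm release.

    # Sanity check (assumes the MIGraphX Python bindings are on PYTHONPATH):
    # import the module and create a GPU target.
    import migraphx

    target = migraphx.get_target("gpu")
    print("MIGraphX GPU target created:", target)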

Once the installation is complete, proceed to install Torch-MIGraphX or MIGraphX for ONNX Runtime, as applicable.

Install Torch-MIGraphX#

Install Torch-MIGraphX using the Docker installation method or by building from source.

Using Docker provides portability and access to a prebuilt Docker container that has been rigorously tested within AMD. Docker also reduces compilation time and should work as expected without installation issues.

  1. Clone the torch_migraphx repository.

    git clone https://github.com/ROCmSoftwarePlatform/torch_migraphx.git
    
  2. Change directory to torch_migraphx.

    cd torch_migraphx/
    
  3. Build the image using the provided script.

    sudo ./build_image.sh
    
  4. Run the container.

    sudo docker run -it --network=host --device=/dev/kfd --device=/dev/dri --group-add=video --ipc=host --cap-add=SYS_PTRACE --security-opt seccomp=unconfined torch_migraphx
    

To build from source in a custom environment, refer to the torch_migraphx repository for build steps.

Next, verify the Torch-MIGraphX installation.

Verify Torch-MIGraphX installation#

Verify that the Torch-MIGraphX installation was successful.

  1. Verify that torch_migraphx can be imported as a Python module.

    python3 -c 'import torch_migraphx' 2> /dev/null && echo 'Success' || echo 'Failure'
    
  2. Run unit tests.

    pytest ./torch_migraphx/tests
    

Once both checks pass, the installation is complete and the system can run PyTorch inference through the Python interface library, as well as scripts that invoke PyTorch inference sessions.
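
As an end-to-end illustration, the sketch below compiles a small PyTorch model through Torch-MIGraphX and runs inference on the GPU. It assumes that importing torch_migraphx registers a "migraphx" backend for torch.compile, as described in the torch_migraphx repository; consult that repository for the current API.

    # Sketch: run a small PyTorch model through the MIGraphX torch.compile backend.
    # Assumes torch_migraphx registers the "migraphx" backend on import.
    import torch
    import torch_migraphx  # noqa: F401  (registers the backend)

    model = torch.nn.Sequential(
        torch.nn.Linear(64, 128),
        torch.nn.ReLU(),
        torch.nn.Linear(128, 10),
    ).eval().cuda()

    sample_input = torch.randn(8, 64).cuda()

    # Compile through the MIGraphX backend and run inference.
    mgx_model = torch.compile(model, backend="migraphx")
    with torch.no_grad():
        output = mgx_model(sample_input)

    print(output.shape)  # expected: torch.Size([8, 10])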

Install and verify MIGraphX for ONNX Runtime#

See Install ONNX Runtime for Radeon GPUs for instructions on installing and verifying MIGraphX for ONNX Runtime.