In addition, while Nvidia has a comprehensive software ecosystem in CUDA, it is geared primarily toward training on high-performance graphics processing units such as the H100. Inference workloads are expected to be less reliant on CUDA, giving AMD an opportunity to compete with its open-source ROCm stack and more cost-efficient solutions.