NVIDIA’s key toolkit packed with vulnerabilities, research finds


Nine newly discovered vulnerabilities in NVIDIA’s CUDA Toolkit could expose developers to security risks.

NVIDIA’s essential tool for generative AI, machine learning, and scientific computing has been vulnerable to attack, according to new findings by threat researchers at Palo Alto Networks’ Unit 42.

Created in 2006, NVIDIA’s CUDA is a parallel computing platform that uses graphics processing units (GPUs) to speed up applications.


Developers use this platform for tasks that need high-speed parallel processing. The CUDA Toolkit works on both Windows and Linux, and compiled GPU code is stored in CUDA binary (cubin) files.

Researchers have uncovered nine vulnerabilities affecting two CUDA Toolkit utilities, cuobjdump and nvdisasm. These tools help developers analyze CUDA binary files for programs running on NVIDIA GPU hardware.
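As a rough sketch of the workflow these tools support (file and kernel names here are illustrative, not taken from the research), a developer might compile a small kernel like the following into a cubin file and then inspect the resulting GPU machine code with cuobjdump or nvdisasm:

```cuda
// add.cu — a minimal, hypothetical CUDA kernel, shown only to illustrate
// the kind of program whose compiled cubin output gets inspected with
// cuobjdump and nvdisasm.
#include <cstdio>

__global__ void addOne(int *data, int n) {
    // Each thread handles one array element in parallel.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] += 1;
}

int main() {
    const int n = 256;
    int *d = nullptr;
    cudaMalloc(&d, n * sizeof(int));
    cudaMemset(d, 0, n * sizeof(int));
    addOne<<<1, n>>>(d, n);      // launch 256 parallel GPU threads
    cudaDeviceSynchronize();     // wait for the kernel to finish
    cudaFree(d);
    return 0;
}
```

Compiling with `nvcc -cubin add.cu -o add.cubin` produces the cubin file; running `cuobjdump -sass add.cubin` or `nvdisasm add.cubin` then disassembles the embedded GPU machine code. It is this parsing of cubin input that, per Unit 42, a maliciously crafted file could abuse in unpatched versions of the tools.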


Although cuobjdump and nvdisasm do not directly execute CUDA code, they are crucial for developers to inspect and optimize their GPU programs. Unit 42 researchers indicate that older versions of these tools could be exploited by processing malicious cubin files.

The discovered vulnerabilities are low severity, rated from 2.8 to 3.3 on the Common Vulnerability Scoring System (CVSS). Still, if exploited by threat actors, they could lead to limited denial of service, information disclosure, or, in some cases, code execution.

NVIDIA responded to the findings by releasing updates in February to patch the issues.

The discovered vulnerabilities are tracked as:

  • CVE-2024-53870
  • CVE-2024-53871
  • CVE-2024-53872
  • CVE-2024-53873
  • CVE-2024-53874
  • CVE-2024-53875
  • CVE-2024-53876
  • CVE-2024-53877
  • CVE-2024-53878
An example of cuobjdump output. Source: Unit 42

Accelerating quantum computing

NVIDIA currently has a 92% market share in datacenter GPUs, securing the leading position in AI training.

In November 2024, NVIDIA and Google announced that they were teaming up to speed up the design of next-generation quantum computing devices. According to the statements, Google’s Quantum AI unit will use NVIDIA’s Eos supercomputer to simulate the physics of its quantum processors.

Google will also use the chipmaker’s hybrid quantum-classical computing platform CUDA-Q to accelerate the development of new quantum components needed to break the next technological barrier.

Some tech companies, including Meta, Google, and Microsoft, are contributing to the development of Triton, software designed to make code run on a wide range of AI chips, positioning it as a competitor to CUDA.