Another Nvidia GPU shortage approaches as AI tools gain popularity

An Nvidia GPU shortage

After years of GPU shortages driven by scalpers and crypto miners, the market has finally returned to normal. Unfortunately, it seems Nvidia GPUs are about to face another shortage as everyone gets sucked into AI.

According to a report from WCCFTech, Nvidia GPUs could soon become hard to find as AI users and companies buy them all up. With the public getting into AI tools such as Stable Diffusion, Topaz AI and more, graphics cards in general are set to be purchased en masse.

Furthermore, one of the largest AI tools in history was trained on Nvidia graphics cards: OpenAI's ChatGPT was trained on around 10,000 Nvidia GPUs. The next-gen version of the AI program is reportedly being trained on 25,000 GPUs.

Other AI companies, such as Meta and Google DeepMind, are not relying on off-the-shelf Nvidia hardware to train their software. Instead, these companies use dedicated supercomputers to train their tools. So why does OpenAI rely on Nvidia?

Nvidia RTX cards are consumer products that are well suited to machine learning. The hardware's Tensor Cores, which power video game features such as DLSS, are also a powerful asset for AI training. As OpenAI doesn't currently have access to a dedicated supercomputer of its own, it has adopted Nvidia hardware instead.

Stocks of Nvidia GPUs are already limited in multiple regions around the world at the time of writing. For the most part, however, consumers are still able to get their hands on the cards. Following the collapse of GPU crypto mining, scalpers have largely given up driving up graphics card prices.

In the future, graphics cards from other manufacturers may also be used for AI training. However, with AMD Radeon and Intel Arc far behind Nvidia in AI performance, it may be years before they are affected by AI-driven shortages.

Will Nvidia do anything to counteract the oncoming shortage of GPUs?
