- Nvidia announced Blackwell, a new generation of AI chips, and the GB200, the first Blackwell chip, which ships later this year and delivers a large performance gain over previous generations.
- The GB200 combines two Blackwell GPUs and an Arm processor for 20 petaflops of AI performance, far beyond the 4 petaflops of the previous H100 chip.
- Cloud providers like Amazon, Google, and Microsoft will offer access to the GB200 so customers can train much bigger AI models, with Amazon building a cluster with 20,000 GB200 chips.
Nvidia announced a new generation of artificial intelligence chips and software for running AI models at its developer conference on Monday. The announcement comes as the chipmaker aims to solidify its position as the leading AI supplier.
Key Details About the New Chips
The new AI graphics processors are named Blackwell. The first Blackwell chip is the GB200 and will ship later this year.
The GB200 combines two B200 Blackwell GPUs with one Arm-based central processor. It offers 20 petaflops of AI performance, a huge jump from the 4 petaflops of the previous H100 chip.
The additional power will allow companies to train bigger and more advanced AI models, according to Nvidia. The chip includes a transformer engine tailored to transformer-based AI models like the one underlying ChatGPT.
Comparing Blackwell to Previous Chips
Nvidia CEO Jensen Huang said “Hopper is fantastic but we need bigger GPUs” when announcing Blackwell. The company updates its GPU architecture every two years.
Each new architecture brings a significant boost in performance. Many recent AI models were trained on Hopper, which was announced in 2022.
Blackwell and the GB200 represent the next generation beyond Hopper. The GB200 is much larger than the Hopper-based H100 chip.
Plans for the GB200
Amazon, Google, Microsoft and Oracle will offer access to the GB200 through their cloud services. Amazon said it would build a cluster with 20,000 GB200 chips.
Nvidia says the system can run a 27-trillion-parameter model, far larger than models like GPT-4. Bigger models could enable new AI capabilities.

Nvidia did not reveal pricing for the GB200 or the systems built around it. For comparison, Hopper-based systems cost $200,000 or more.
Conclusion
The new Blackwell architecture and GB200 chip mark Nvidia’s continued push to remain the leading AI platform. With Blackwell, the company aims to supply the computing power needed to train the next generation of enormous AI models.