
As detailed at its 39th Annual General Meeting of Shareholders, AI computing infrastructure is one of SoftBank Corp.'s (TOKYO: 9434) major growth initiatives, and a key part of its strategy to provide next-generation social infrastructure for Japan. On July 23, 2025, SoftBank announced a major new development related to this initiative.
On July 22, it deployed an NVIDIA DGX SuperPOD built with DGX B200 systems and more than 4,000 NVIDIA Blackwell graphics processing units (GPUs). This makes SoftBank's AI computing platform the world's largest NVIDIA DGX SuperPOD with DGX B200 systems*. The AI computing platform built and deployed by SoftBank now exceeds 10,000 GPUs in total, delivering a combined computing capability of 13.7 exaflops, a measure of supercomputer performance.
*According to SoftBank research as of July 23, SoftBank’s platform is the largest among currently operating AI computing platforms based on NVIDIA DGX SuperPOD built with NVIDIA Blackwell GPUs.
How will this state-of-the-art AI computing platform be used? SB Intuitions Corp., SoftBank's subsidiary dedicated to developing homegrown large language models (LLMs) specialized in the Japanese language, will be the first to use the platform. SB Intuitions built LLMs with approximately 460 billion parameters in fiscal year 2024, and it plans to offer a commercial model called “Sarashina mini” with 70 billion parameters within fiscal year 2025, which ends on March 31, 2026.
By fully leveraging the newly enhanced AI computing platform and continuously training multiple high-performance Sarashina mini models, SoftBank aims to accelerate the development of even larger and more advanced models. In addition to using the AI computing platform within its own group companies, SoftBank plans to provide the infrastructure as a service to companies and research institutions in Japan.
For more information, see this press release.
(Posted on July 24, 2025)
by SoftBank News Editors


