
TOPICS & NEWS
The “Stargate Project,” a large-scale AI infrastructure development initiative spearheaded by SoftBank Group and OpenAI, is focusing on Sakai City as the central location for its expansion in Japan. Specifically, SoftBank plans to repurpose a former liquid crystal display panel factory owned by Sharp in Sakai. The company has acquired a portion of this facility for approximately 100 billion yen with the goal of transforming it into a cutting-edge AI data center.
This facility will be the third major site for the project, following an existing base in Tokyo and another under construction in Hokkaido. It boasts an impressive power capacity of 150 megawatts, making it one of the largest in Japan. Operations are slated to begin in 2026, with plans to expand capacity to 250 megawatts in the future. Sakai’s favorable location and infrastructure conditions are expected to ensure the long-term stability of the data center’s operations.
SB OpenAI Japan to Drive Domestic AI Development and Adoption
At the heart of this project is “SB OpenAI Japan,” a joint venture established in February 2025 by SoftBank and OpenAI. This company aims to develop large language models (LLMs) specifically tailored for the Japanese language and provide “Cristal Intelligence,” a generative AI service for businesses.
The Sakai data center is planned to run GPU-powered AI agents built on foundation models provided by OpenAI. These agents will be specialized for various corporate functions, such as human resources and marketing, with the aim of delivering customized AI solutions tailored to specific business needs.
These efforts have the potential to significantly accelerate the digital transformation of Japanese companies.
Creating the Future Through Massive Investment and Industrial Fusion
SoftBank is planning a large-scale development that will require 100,000 GPUs for this AI infrastructure build-out, potentially amounting to a massive investment approaching one trillion yen based on simple calculations. The GPUs are expected to be supplied by U.S.-based NVIDIA and the Stargate Project itself.
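As a rough sense check of that “simple calculation,” dividing the reported total by the planned GPU count gives the implied per-GPU outlay; the unit cost below is simply what the two reported figures imply, not a quoted price.

```python
# Back-of-envelope check of the "approaching one trillion yen" figure.
gpu_count = 100_000            # planned GPUs for the Sakai build-out
total_investment_yen = 1e12    # "approaching one trillion yen"

implied_cost_per_gpu = total_investment_yen / gpu_count
print(f"Implied outlay per GPU (incl. surrounding systems): {implied_cost_per_gpu:,.0f} yen")
# -> 10,000,000 yen, i.e. roughly 10 million yen per installed GPU
```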
SoftBank President Miyakawa stated, “We aim to make Sakai a hub for the fusion of AI and existing industries, serving as an experimental ground for new business models and solutions to challenges.” This highlights the expectation that the facility will not just be a data center, but a key driver in the evolution of the AI industry both domestically and internationally.
Furthermore, this initiative is poised to be a crucial step in enhancing productivity across various industries and addressing labor shortages.
2025.04.30
In March 2025, U.S.-based NVIDIA held its annual developer conference, “GTC,” and announced its new software “Dynamo,” specifically designed for inference processing. This announcement comes against the backdrop of a significant shift in AI’s evolution, moving from a primary focus on “learning” to “inference.”
NVIDIA, a company that has historically excelled in technologies for training AI models, emphasized that its hardware and software are now essential for inference as well. CEO Jensen Huang stressed that accelerating inference processing is key to determining the quality of AI services.
Key Features of the New “Dynamo” Software
Dynamo will be available as open-source software and is designed to accelerate inference processing by efficiently coordinating multiple GPUs. When combined with the latest “Blackwell” GPU architecture, it can reportedly increase the processing speed of the “R1” AI model from the Chinese AI company DeepSeek by up to 30 times compared to previous methods.
A core feature is a technique called “disaggregated serving,” which significantly improves processing efficiency by splitting inference into two phases, “prefill” and “decode,” and assigning each phase to a different pool of GPUs.
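To make the split concrete, here is a minimal conceptual sketch in Python of prefill and decode running as separate workers; the class and function names are invented for illustration, and this is not the Dynamo API.

```python
# Conceptual sketch of disaggregated serving: prefill and decode handled by
# separate GPU pools. Illustration only; not the Dynamo API.
from dataclasses import dataclass, field

@dataclass
class Request:
    prompt_tokens: list
    kv_cache: dict = field(default_factory=dict)  # filled during prefill
    generated: list = field(default_factory=list)

def prefill_worker(req: Request) -> Request:
    """Runs once per request on the prefill GPU pool: processes the whole
    prompt in parallel and produces the KV cache for every prompt token."""
    req.kv_cache = {i: f"kv({tok})" for i, tok in enumerate(req.prompt_tokens)}
    return req

def decode_worker(req: Request, max_new_tokens: int = 3) -> Request:
    """Runs on the decode GPU pool: generates tokens one at a time,
    reusing the KV cache handed over from the prefill phase."""
    for step in range(max_new_tokens):
        next_token = f"token_{step}"              # placeholder for model output
        req.kv_cache[len(req.kv_cache)] = f"kv({next_token})"
        req.generated.append(next_token)
    return req

req = prefill_worker(Request(prompt_tokens=["What", "is", "Dynamo", "?"]))
req = decode_worker(req)   # in practice the KV cache is transferred between GPU pools
print(req.generated)
```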
Furthermore, by leveraging a “KV cache” to store and reuse the key-value states of previously processed tokens, Dynamo reduces computational load. The “KV Cache Manager” integrated into Dynamo enables efficient cache management so that the cache does not exceed GPU memory limits.
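A minimal sketch of the general idea behind such cache management follows: keep recently used entries in limited GPU memory and spill the rest to a cheaper tier rather than recomputing them. The class, capacities, and LRU eviction policy are illustrative assumptions, not a description of how Dynamo’s KV Cache Manager actually works.

```python
# Minimal sketch of a KV cache manager: keep hot entries in (limited) GPU
# memory and spill the rest to a cheaper tier. Illustrative only.
from collections import OrderedDict

class SimpleKVCacheManager:
    def __init__(self, gpu_capacity: int):
        self.gpu_capacity = gpu_capacity      # max entries kept in GPU memory
        self.gpu_cache = OrderedDict()        # ordered so we can evict LRU entries
        self.host_cache = {}                  # overflow tier (e.g. CPU memory)

    def put(self, key, value):
        self.gpu_cache[key] = value
        self.gpu_cache.move_to_end(key)
        while len(self.gpu_cache) > self.gpu_capacity:
            old_key, old_value = self.gpu_cache.popitem(last=False)
            self.host_cache[old_key] = old_value   # spill instead of recomputing

    def get(self, key):
        if key in self.gpu_cache:
            self.gpu_cache.move_to_end(key)
            return self.gpu_cache[key]
        return self.host_cache.get(key)  # slower tier, but avoids recomputation

mgr = SimpleKVCacheManager(gpu_capacity=2)
for i in range(4):
    mgr.put(f"token_{i}", f"kv_{i}")
print(list(mgr.gpu_cache), list(mgr.host_cache))  # two hot keys on GPU, two spilled
```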
The Trade-off Problem and Hardware Evolution
In his keynote speech, CEO Huang highlighted the trade-off in inference between “total tokens per second” (overall throughput) and “tokens per second per user” (per-user responsiveness). This illustrates the dilemma where faster response times for each user can limit the number of concurrent users, while supporting more users can lead to increased response delays.
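A simple worked example, using purely illustrative numbers, shows how a fixed aggregate token budget divides across concurrent users:

```python
# Illustrative numbers only: how a fixed total token budget divides between users.
total_tokens_per_second = 10_000   # assumed aggregate throughput of the system

for concurrent_users in (10, 100, 1_000):
    per_user = total_tokens_per_second / concurrent_users
    print(f"{concurrent_users:>5} users -> {per_user:,.0f} tokens/s per user")
# Fewer users means a snappier experience per user; more users means each one
# waits longer -- the trade-off Huang described, absent faster hardware.
```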
To address this, NVIDIA has adopted a strategy of overcoming this trade-off through hardware enhancements. The newly announced “Blackwell” architecture boasts up to 25 times the processing power of its predecessor, “Hopper,” enabling a balance between quality and scale.
Continued Strong Investment in AI-Related Data Centers
As the primary use case of AI shifts towards inference, the demand for computational processing is experiencing exponential growth. Following “Blackwell,” NVIDIA has unveiled development plans for even higher-performance GPUs, such as “Rubin” and “Feynman,” with Dynamo evolving as the corresponding software foundation.
To support such high-density and high-performance AI processing, distributed and large-scale computing environments are essential. Consequently, with the expansion of AI agents and generative AI, investment in data centers as the underlying infrastructure is expected to remain robust in the future.
2025.04.22
The importance of data centers is growing due to the need to process vast amounts of data resulting from the spread of digital devices, the advancement of self-driving vehicles, and the development and use of generative AI.
While approximately 80% of Japan’s major data center demand is concentrated in Tokyo and Osaka, it has been revealed that Asia Pacific Land (APL) Group, a U.S. real estate investment and development firm, plans to build large-scale data centers in Itoshima and Kitakyushu Cities, Fukuoka Prefecture.
Construction of Kyushu’s Largest Data Center in Itoshima City, with an Investment Exceeding 300 Billion Yen, Scheduled to Begin This Spring
Construction of one of Kyushu’s largest data centers is scheduled to begin in the Taku and Tomi districts of Itoshima City in the spring of 2025. This data center will have a total power receiving capacity of 300,000 kilowatts, and the investment amount will exceed 300 billion yen.
The site lies southeast of the Maebaru Interchange on the Nishi-Kyushu Expressway, where six data centers are planned on a 122,000-square-meter site. Construction will begin with site preparation in the spring of 2025, and data center operations will commence in stages from 2029.
Construction of a 120,000 Kilowatt Data Center in Kitakyushu City, Aiming to Start by the Fall of 2027
In addition, APL Group acquired a 62,822 square meter city-owned site in the Kitakyushu Science and Research Park (Wakamatsu Ward, Kitakyushu City) in November 2023, and plans to invest 125 billion yen to build a data center with a total power receiving capacity of 120,000 kilowatts. The aim is to start construction by the fall of 2027.
This will be the second large-scale data center to be established in Kitakyushu City since 2007.
APL cited Kitakyushu’s geographical proximity to Asia, its closeness to submarine cable landing points, and the future potential for renewable energy use as reasons for selecting the city. It also expects to capture demand from domestic and East Asian companies.
Potential for Increased Attention as a Candidate for Decentralized Data Center Locations
Building data centers in Kyushu helps decentralize such facilities as a risk hedge against disasters, including a Nankai Trough earthquake, while also taking advantage of the region’s proximity to submarine cable landing stations connecting to Asia.
Kitakyushu City has proposed a “Backup Capital Concept” under which it would serve as a backup hub for the companies, data centers, and government agencies concentrated in Tokyo. The construction of a large-scale data center in the Kitakyushu Science and Research Park is likely to give momentum to this concept. Kitakyushu’s low disaster risk may also draw further attention to the city as a candidate for decentralized data center locations, and expectations for its development are high.
2025.03.25
Prime Minister Shigeru Ishiba announced at the Digital Administrative Reform Conference in February the establishment of a public-private council to integrate the development of data centers and power plants, anticipating increased demand due to the spread of artificial intelligence (AI). This initiative aims to decentralize electricity and communication infrastructure, which are currently concentrated in urban areas.
The newly established public-private council will serve as a platform for discussing specific measures, with potential participation from Tokyo Electric Power Company Group, NTT, SoftBank Group, and others.
This concept of integrating data centers and power plants is known as “Watt-Bit Collaboration.” It envisions establishing data centers near power plants, such as nuclear, wind, and solar, to promote industrial clusters.
The idea takes advantage of the fact that laying fiber-optic cable is far more cost-effective than building new power transmission lines: rather than carrying electricity over long distances to where data centers already stand, digital information is carried over optical fiber to data centers sited near generation, contributing to the development of a new power transmission and distribution network.
Data centers are currently concentrated in Tokyo and Osaka, with the Kanto and Kansai regions accounting for approximately 90% of the total site area as of 2023, according to the Ministry of Internal Affairs and Communications. Decentralizing electricity and communication infrastructure is essential from a national resilience perspective, including disaster response.
While this initiative aims to balance a smooth transition to decarbonization with the revitalization of regional economies, there are concerns regarding electricity challenges.
AI Power Capacity in Domestic Data Centers Expected to Increase Approximately 3.2 Times by 2028
IDC Japan Corporation released its estimated results of the power capacity required for AI servers installed in domestic data centers at the end of February. The total power capacity required by AI servers in domestic data centers is expected to increase from 67 megawatts at the end of 2024 to 212 megawatts by the end of 2028, an approximately 3.2-fold increase in four years. This is equivalent to about 5 to 8 hyper-scale data centers built in the Tokyo metropolitan area and Kansai region.
This power capacity refers to the power required by servers and does not include the power required by network equipment or cooling systems.
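The roughly 3.2-fold multiple follows directly from the two estimates above; as a quick arithmetic check:

```python
# Quick check of the multiple implied by IDC Japan's two estimates.
mw_end_2024 = 67    # required AI-server power capacity, end of 2024 (MW)
mw_end_2028 = 212   # forecast for end of 2028 (MW)
print(f"Growth multiple over four years: {mw_end_2028 / mw_end_2024:.1f}x")  # -> 3.2x
```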
IDC Japan explained that the current estimate significantly revises the previous estimate (approximately 80 to 90 megawatts in 2027) made in January 2024. This revision is due to a substantial upward adjustment in the forecast for AI server shipment value in the domestic market.
The background includes the rapid expansion of AI server installations by hyper-scalers, as well as the acceleration of AI server procurement by domestic service providers and research institutions through government subsidy programs.
In particular, the scale of AI infrastructure investment by hyper-scalers is significant, with hyper-scale data centers accounting for the majority of the estimated power capacity.
AI servers are known for their high power consumption and heat generation per unit. Therefore, data centers that install a large number of AI servers require liquid cooling systems instead of conventional air conditioning systems.
Some experts believe that there are still many points to consider regarding the introduction of liquid cooling systems. Finding concrete solutions to these electricity challenges will be key to realizing the integrated AI infrastructure development.
2025.03.18
The rush to build data centers continues into 2025 as AI demand surges. Overseas companies are actively entering the Japanese market, with firms from the United States, Europe, Australia, New Zealand, China, Singapore, and more making their way in recent years. We’d like to share the news, announced toward the end of 2024, that two major Vietnamese IT companies will be joining them.
Vietnam’s Largest IT Company, FPT, to Invest $200 Million in AI Data Center in Japan
Vietnam’s largest IT company, FPT, has announced plans to open an AI data center in Japan in 2025.
In an interview, FPT Chairman Truong Gia Binh stated that the company will invest $200 million (approximately 31 billion yen) in the first phase, along with multiple partner companies. While he did not disclose the partner companies, it has been revealed that SBI Holdings has agreed to invest in FPT.
According to Mr. Binh, the data center in Japan is expected to provide services to sectors such as automotive, manufacturing, retail, and healthcare.
Furthermore, FPT has revealed that it is in discussions with Sumitomo Corporation and NEC regarding AI collaboration, and is also considering potential joint ventures with Japanese telecommunications carriers.
CMC to Invest $500 Million in Data Center Expansion, Up to $100 Million in Japan
CMC, Vietnam’s second-largest IT company, plans to invest $500 million (approximately 75 billion yen) in infrastructure development, including data centers, over the next five years through 2028. The company aims to increase the capacity of its data centers in Vietnam tenfold and plans to spend up to $100 million in Japan.
CMC has data centers in three locations in Vietnam, including the capital city of Hanoi. The company plans to expand its power capacity, a common measure of data center scale, from roughly 10 megawatts at present to as much as 100 megawatts by 2028. It also plans to establish smaller-scale data centers in Japan and other countries.
Regarding the development of data centers in Japan, the company is considering whether to own its facilities or use rentals. If it owns the facilities, the investment is expected to be around $100 million, while rentals would cost around $50 million.
Japan’s Data Center Market Driving Demand in the APAC Region
According to a report by research firm Knight Frank, the Tokyo data center market stands at 2,575 MW of capacity and is said to be driving demand in the APAC region, further increasing Japan’s importance. More foreign companies are expected to enter the Japanese data center market in the future, and we look forward to the activities of domestic data center operators as well.
2025.02.26
In December 2024, the press conference between then U.S. President-elect Trump and Masayoshi Son, Chairman and CEO of telecommunications giant SoftBank Group, became a hot topic. Now, in 2025, SoftBank Group has announced another new project, which we introduce here.
The Massive $500 Billion Investment “Stargate Project”
On January 22, 2025, SoftBank Group announced the “Stargate Project,” a joint project with OpenAI to build a large-scale AI infrastructure. The project aims to establish leadership in the AI field in the United States and contribute to the global economy.
In the Stargate Project, a huge investment of $500 billion will be made over the next four years, of which $100 billion will be invested immediately. This investment is expected to not only develop data centers and AI infrastructure, but also create hundreds of thousands of jobs in the United States, revitalize American industry, and even strengthen the national security of the United States and its allies.
In addition to SoftBank Group and OpenAI, Oracle and MGX, an artificial intelligence (AI) investment company based in Abu Dhabi, United Arab Emirates (UAE), are named as initial investors in the project. SoftBank Group will be responsible for financial management, OpenAI will be responsible for operations, and Masayoshi Son will serve as chairman. Key technology partners include Arm, Microsoft, NVIDIA, Oracle, and OpenAI, and these companies will work closely together to build and operate the computing system.
The construction of the AI infrastructure has already begun in Texas, and contracts for the construction of campuses are being signed at candidate sites across the United States. The Stargate Project is based on the cooperative relationship that OpenAI and NVIDIA have built since 2016, and the recent partnership between OpenAI and Oracle. It also leverages the existing partnership between OpenAI and Microsoft, aiming to expand the use of Azure while training leading models and providing high-quality products and services.
Investing in the AI and Semiconductor Industries
While Masayoshi Son, Chairman and CEO of SoftBank Group, has been making his presence felt with a string of AI-related investment announcements such as AI data centers, the outline of the AI and semiconductor industry support measures to be included in the comprehensive economic package the government plans to compile within the month has also come to light in Japan. The government plans to provide more than 10 trillion yen in support by fiscal 2030, of which about 6 trillion yen will be allocated to subsidies for research and development of next-generation semiconductors, and more than 4 trillion yen to financial support such as government investment and debt guarantees.
In addition, SoftBank Group and OpenAI established the joint venture “SB OpenAI Japan” on February 3, 2025. The company will develop and sell “Cristal Intelligence,” a generative AI service for businesses that supports operational efficiency. SoftBank Group will pay OpenAI approximately 450 billion yen annually in development and operating costs.
Given these circumstances, investment in data centers and the AI and semiconductor industries is expected to progress in Japan as well. Together with the moves by SoftBank Group and OpenAI, it will be necessary to watch whether these investments contribute to the future economic growth of Japan and the world.
2025.02.14