

For a sustainable future for data centers and digital infrastructure
Data center investment in Japan is booming, and demand is expected to keep growing. At the same time, warnings are being raised about data centers' enormous power consumption. Moreover, most data centers are concentrated in Tokyo and Osaka, making it difficult to realize the national policy of an "optimal arrangement of data centers across Japan." We at Digital Infrastructure Lab support projects to develop and utilize data centers and other digital infrastructure* for stakeholders such as landowners, data center operators, developers, investors, energy operators, data-center-related equipment manufacturers, and national and local governments.
* Not limited to data centers; this also covers e-commerce logistics facilities, digital-related laboratories, base stations and communication networks, and renewable energy facilities.
To realize "ideal" digital infrastructure investment
2025.04.30
Sakai Takes Center Stage as the Starting Point for Japan’s Stargate Project
The “Stargate Project,” a large-scale AI infrastructure development initiative spearheaded by SoftBank Group and OpenAI, is focusing on Sakai City as the central location for its expansion in Japan. Specifically, SoftBank plans to repurpose a former liquid crystal display panel factory owned by Sharp in Sakai. The company has acquired a portion of this facility for approximately 100 billion yen with the goal of transforming it into a cutting-edge AI data center.
This facility will be the third major site for the project, following an existing base in Tokyo and another under construction in Hokkaido. It boasts an impressive power capacity of 150 megawatts, making it one of the largest in Japan. Operations are slated to begin in 2026, with plans to expand capacity to 250 megawatts in the future. Sakai’s favorable location and infrastructure conditions are expected to ensure the long-term stability of the data center’s operations.
SB OpenAI Japan to Drive Domestic AI Development and Adoption
At the heart of this project is “SB OpenAI Japan,” a joint venture established in February 2025 by SoftBank and OpenAI. This company aims to develop large language models (LLMs) specifically tailored for the Japanese language and provide “Crystal Intelligence,” a generative AI service for businesses.
The Sakai data center is planned to host the operation of AI agents powered by GPUs, utilizing the foundational models provided by OpenAI. These agents will be specialized for various corporate functions, such as human resources and marketing, with the aim of delivering customized AI solutions that meet specific business needs.
These efforts have the potential to significantly accelerate the digital transformation of Japanese companies.
Creating the Future Through Massive Investment and Industrial Fusion
For this AI infrastructure build-out, SoftBank is planning a large-scale development requiring 100,000 GPUs, which by a simple calculation could amount to an investment approaching one trillion yen. The GPUs are expected to be supplied by U.S.-based NVIDIA and through the Stargate Project itself.
SoftBank President Miyakawa stated, “We aim to make Sakai a hub for the fusion of AI and existing industries, serving as an experimental ground for new business models and solutions to challenges.” This highlights the expectation that the facility will not just be a data center, but a key driver in the evolution of the AI industry both domestically and internationally.
Furthermore, this initiative is poised to be a crucial step in enhancing productivity across various industries and addressing labor shortages.
2025.04.22
The Evolution of AI and the Shift to the Inference Phase
In March 2025, U.S.-based NVIDIA held its annual developer conference, “GTC,” and announced its new software “Dynamo,” specifically designed for inference processing. This announcement comes against the backdrop of a significant shift in AI’s evolution, moving from a primary focus on “learning” to “inference.”
NVIDIA, a company that has historically excelled in technologies for training AI models, emphasized that its hardware and software are now essential for inference as well. CEO Jensen Huang stressed that accelerating inference processing is key to determining the quality of AI services.
Key Features of the New “Dynamo” Software
Dynamo will be available as open-source software and is designed to accelerate inference processing by efficiently coordinating multiple GPUs. When combined with the latest “Blackwell” GPU architecture, it can reportedly increase the processing speed of the “R1” AI model from the Chinese AI company DeepSeek by up to 30 times compared to previous methods.
A core feature is a technique called “fine-grained serving,” which significantly improves processing efficiency by separating the inference process into two phases: “prefill” and “decode,” and assigning them to different GPUs.
Furthermore, by leveraging a technology called “KV cache” to store and reuse past token information, Dynamo reduces computational load. The “KV Cache Manager” integrated into Dynamo enables efficient cache management to avoid exceeding GPU memory limits.
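The two-phase split and KV-cache reuse described above can be sketched in a few lines of Python. This is a purely illustrative toy, not Dynamo's actual API: the `prefill`/`decode` functions and the string stand-ins for key/value tensors are assumptions made for the sake of the example.

```python
# Illustrative sketch of separating inference into "prefill" and "decode"
# phases with a reusable KV cache. A toy model, not Dynamo's actual API;
# the string values stand in for real key/value tensors.

class KVCache:
    """Stores per-token key/value state so past tokens are not recomputed."""
    def __init__(self):
        self.entries = []

    def append(self, token_state):
        self.entries.append(token_state)

    def __len__(self):
        return len(self.entries)

def prefill(prompt_tokens):
    """Prefill phase: process the whole prompt once, building the KV cache.
    In a disaggregated setup this would run on a dedicated GPU pool."""
    cache = KVCache()
    for tok in prompt_tokens:
        cache.append(f"kv({tok})")   # stand-in for real key/value tensors
    return cache

def decode(cache, steps):
    """Decode phase: generate tokens one at a time, reusing the cache
    instead of reprocessing the prompt; runs on a different GPU pool."""
    output = []
    for i in range(steps):
        new_tok = f"tok{i}"          # stand-in for sampling from the model
        cache.append(f"kv({new_tok})")
        output.append(new_tok)
    return output

prompt = ["The", "data", "center"]
cache = prefill(prompt)              # compute-heavy, parallel over the prompt
generated = decode(cache, steps=4)   # latency-sensitive, strictly sequential
print(len(cache), generated)
```

Because the two phases have very different compute profiles, assigning them to different GPU pools lets each pool be utilized efficiently, which is the point of the technique.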
The Trade-off Problem and Hardware Evolution
In his keynote speech, CEO Huang highlighted the trade-off between “total tokens per second (throughput)” and “tokens per user (latency)” in inference. This illustrates the dilemma where faster response times can limit the number of concurrent users, while supporting more users can lead to increased response delays.
To address this, NVIDIA has adopted a strategy of overcoming this trade-off through hardware enhancements. The newly announced “Blackwell” architecture boasts up to 25 times the processing power of its predecessor, “Hopper,” enabling a balance between quality and scale.
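The trade-off described above can be made concrete with a toy model in which a cluster's aggregate token budget is shared evenly among concurrent users. The 100,000 tokens-per-second budget and the 500-token answer length are hypothetical numbers chosen only for illustration.

```python
# Toy model of the inference trade-off: a cluster has a fixed aggregate
# token budget, shared evenly among concurrent users. The 100,000 tok/s
# budget and 500-token answer length are hypothetical illustration values.

TOTAL_TOKENS_PER_SEC = 100_000   # assumed aggregate throughput of the cluster

def tokens_per_user(concurrent_users):
    """Per-user generation speed when the budget is shared evenly."""
    return TOTAL_TOKENS_PER_SEC / concurrent_users

def seconds_for_answer(concurrent_users, answer_tokens=500):
    """Time a single user waits for a complete answer."""
    return answer_tokens / tokens_per_user(concurrent_users)

for users in (10, 100, 1000):
    print(users, tokens_per_user(users), seconds_for_answer(users))
```

In this simplified picture, growing from 10 to 1,000 users stretches a 500-token answer from a fraction of a second to several seconds, which is exactly the dilemma that faster hardware is meant to relax.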
Continued Strong Investment in AI-Related Data Centers
As the primary use case of AI shifts towards inference, the demand for computational processing is experiencing exponential growth. Following “Blackwell,” NVIDIA has unveiled development plans for even higher-performance GPUs, such as “Rubin” and “Feynman,” with Dynamo evolving as the corresponding software foundation.
To support such high-density and high-performance AI processing, distributed and large-scale computing environments are essential. Consequently, with the expansion of AI agents and generative AI, investment in data centers as the underlying infrastructure is expected to remain robust in the future.
2025.03.25
U.S. Real Estate Investment and Development Firm APL Group Plans to Build Large-Scale Data Centers in Itoshima and Kitakyushu Cities
The importance of data centers is growing due to the need to process vast amounts of data resulting from the spread of digital devices, the development of self-driving vehicles, and the development and utilization of generative AI.
While approximately 80% of Japan’s major data center demand is concentrated in Tokyo and Osaka, it has been revealed that Asia Pacific Land (APL) Group, a U.S. real estate investment and development firm, plans to build large-scale data centers in Itoshima and Kitakyushu Cities, Fukuoka Prefecture.
Construction of Kyushu’s Largest Data Center in Itoshima City, with an Investment Exceeding 300 Billion Yen, Scheduled to Begin This Spring
Construction of one of Kyushu’s largest data centers is scheduled to begin in the Taku and Tomi districts of Itoshima City in the spring of 2025. This data center will have a total power receiving capacity of 300,000 kilowatts, and the investment amount will exceed 300 billion yen.
The location is in the southeastern part of the Maebaru Interchange on the Nishi-Kyushu Expressway.
The plan is to construct six data centers on a 122,000 square meter site.
Construction will begin with site preparation in the spring of 2025, and data center operations will gradually commence from 2029.
Construction of a 120,000 Kilowatt Data Center in Kitakyushu City, Aiming to Start by the Fall of 2027
In addition, APL Group acquired a 62,822 square meter city-owned site in the Kitakyushu Science and Research Park (Wakamatsu Ward, Kitakyushu City) in November 2023, and plans to invest 125 billion yen to build a data center with a total power receiving capacity of 120,000 kilowatts. The aim is to start construction by the fall of 2027.
This will be the second large-scale data center to be established in Kitakyushu City since 2007.
APL cited the proximity to submarine cable landing points and the future potential for renewable energy utilization as reasons for selecting Kitakyushu, taking into account its geographical proximity to Asia. They also expect to capture demand from domestic and East Asian companies.
Potential for Increased Attention as a Candidate for Decentralized Data Center Locations
The construction of data centers in Kyushu is aimed at decentralizing data centers as a risk hedge against various disasters, including the Nankai Trough earthquake, and also takes advantage of the proximity to submarine cable landing stations to Asia.
Kitakyushu City has proposed a “Backup Capital Concept” to serve as a hub for companies, data centers, and government agencies concentrated in Tokyo. The construction of a large-scale data center in the Kitakyushu Science and Research Park is likely to give momentum to the city’s concept. There is also the possibility that Kitakyushu, with its low disaster risk, will attract more attention as a candidate for decentralized data center locations, and expectations are high for its development.
2025.03.18
Ishiba Cabinet Announces Establishment of Council for Integrated AI Infrastructure Development, Aiming to Balance Decarbonization and Regional Revitalization While Addressing Electricity Concerns
Prime Minister Shigeru Ishiba announced at the Digital Administrative Reform Conference in February the establishment of a public-private council to integrate the development of data centers and power plants, anticipating increased demand due to the spread of artificial intelligence (AI). This initiative aims to decentralize electricity and communication infrastructure, which are currently concentrated in urban areas.
The newly established public-private council will serve as a platform for discussing specific measures, with potential participation from Tokyo Electric Power Company Group, NTT, SoftBank Group, and others.
This concept of integrating data centers and power plants is known as “Watt-Bit Collaboration.” It envisions establishing data centers near power plants, such as nuclear, wind, and solar, to promote industrial clusters.
Focusing on the cost-effectiveness of fiber optic cables compared to power transmission lines, the plan aims to efficiently transmit digital information through optical cables, contributing to the development of a new power transmission and distribution network.
Data centers are currently concentrated in Tokyo and Osaka, with the Kanto and Kansai regions accounting for approximately 90% of the total site area as of 2023, according to the Ministry of Internal Affairs and Communications. Decentralizing electricity and communication infrastructure is essential from a national resilience perspective, including disaster response.
While this initiative aims to balance a smooth transition to decarbonization with the revitalization of regional economies, there are concerns regarding electricity challenges.
AI Power Capacity in Domestic Data Centers Expected to Increase Approximately 3.2 Times by 2028
IDC Japan Corporation released its estimated results of the power capacity required for AI servers installed in domestic data centers at the end of February. The total power capacity required by AI servers in domestic data centers is expected to increase from 67 megawatts at the end of 2024 to 212 megawatts by the end of 2028, an approximately 3.2-fold increase in four years. This is equivalent to about 5 to 8 hyper-scale data centers built in the Tokyo metropolitan area and Kansai region.
This power capacity refers to the power required by servers and does not include the power required by network equipment or cooling systems.
IDC Japan explained that the current estimate significantly revises the previous estimate (approximately 80 to 90 megawatts in 2027) made in January 2024. This revision is due to a substantial upward adjustment in the forecast for AI server shipment value in the domestic market.
The background includes the rapid expansion of AI server installations by hyper-scalers, as well as the acceleration of AI server procurement by domestic service providers and research institutions through government subsidy programs.
In particular, the scale of AI infrastructure investment by hyper-scalers is significant, with hyper-scale data centers accounting for the majority of the estimated power capacity.
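As a sanity check on the figures above, the growth from 67 megawatts at the end of 2024 to 212 megawatts at the end of 2028 can be converted into an implied compound annual growth rate:

```python
# Back-of-the-envelope check of the IDC Japan estimate: 67 MW at the end
# of 2024 growing to 212 MW by the end of 2028.

start_mw, end_mw, years = 67, 212, 4

growth_multiple = end_mw / start_mw                # roughly the cited 3.2x
annual_rate = growth_multiple ** (1 / years) - 1   # implied compound annual growth

print(round(growth_multiple, 2), f"{annual_rate:.1%}")
```

The multiple works out to about 3.16x, or roughly 33% growth per year sustained over four years.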
AI servers are known for their high power consumption and heat generation per unit. Therefore, data centers that install a large number of AI servers require liquid cooling systems instead of conventional air conditioning systems.
Some experts believe that there are still many points to consider regarding the introduction of liquid cooling systems. Finding concrete solutions to these electricity challenges will be key to realizing the integrated AI infrastructure development.
2023.07.05
Google warns Irish government over data center development moratorium
Irish government restricts data center development
Ireland’s Commission for Regulation of Utilities (CRU) has decided to limit the impact on the grid by imposing a de facto moratorium on new data center development in the Dublin metropolitan area.
Ireland’s national transmission operator EirGrid said in response that it would only consider new applications for grid connection on a case-by-case basis. The restrictions could reportedly last until 2028.
Martin Shanahan, CEO of Ireland’s Industrial Development Authority (IDA), recently said that new data centers “are unlikely to occur in Dublin and the East Coast at this time.”
Google has asked Irish regulators not to impose a moratorium on data center development in the country.
In a filing to the Commission for Regulation of Utilities (CRU), the search and cloud company said a moratorium on data center development must “absolutely” be avoided.
Google said such a ban would send a “wrong signal” about Ireland’s digital economy ambitions and would affect the country’s infrastructure, according to a Freedom of Information request first reported by The Irish Times. It added that a ban would make further investment “impossible”.
In the filing, Google also called for more transparency about where the Irish grid has existing power capacity, and for EirGrid to be clearer and more open about its projections of data center power usage growth.
Growing Demand for Cloud Computing, Google’s Proposal
Google, which launched its first data center in Ireland in 2012, has proposed a new pricing structure for data center operators who reserve more capacity than they ultimately need, or who grow into that capacity too slowly.
“Transmission tariffs can be designed so that consumers who are not seeing demand increase toward their maximum reserved capacity are charged more than consumers who demonstrate an increase each year,” the company says.
EirGrid and politicians have previously suggested moving data center development to the west of Ireland, away from Dublin’s constrained areas and closer to renewable energy sources, but Google points out that this is not a viable solution: the demand for cloud computing is in Dublin, and it could not serve that demand from elsewhere.
A separate filing by AWS says Ireland has missed opportunities in the past to address supply issues.
“Over the past decade, we have had opportunities to do reinforcement work, prepare the grid for growth and investment, and prepare the grid to integrate more intermittent resources,” the filing said.
Both the Social Democrats and the People Before Profit parties have been calling for a nationwide moratorium on future data center projects for the past 12 months. The PBP bill called for an absolute ban on data centers, liquefied natural gas plants, and new fossil fuel infrastructure.
In Dublin last month, South Dublin County Council (SDCC) voted to block future data center construction in the county as part of a new development plan.
What is the background behind the Irish government’s moratorium on data center development?
Behind it lie the Irish government’s emissions and renewable energy targets.
According to EirGrid, data center energy usage is projected to increase by 9 TWh by 2030, reaching between 23% and 31% of Ireland’s grid supply in that year. This comes at a time when the government wants to reduce emissions by 60-80% by increasing the share of renewable energy, while also decarbonising heating and transportation by moving them to electricity, further increasing demand on the grid.
According to The Irish Times, EirGrid has agreed to connect an additional 1.8 GW of data centers to the grid, against current peak demand of around 5 GW, with a further 2 GW of applications pending.
The Government Statement on the Role of Data Centres in Ireland’s Enterprise Strategy, published in 2018, emphasized the positive role of data centers in the country’s economic performance. It is now reportedly being reviewed to align with sectoral emissions caps and renewable energy targets, concerns about continued security of supply, and the demand flexibility measures now needed, and further tightening of regulations is also being considered.
Will it work or will it backfire?
The Irish government has imposed a moratorium on data center development even though demand is high worldwide, and appears set to maintain it despite Google’s warning. Will this decision work, or will it backfire? We will keep an eye on how the situation develops.
2023.03.26
Data center facility inspection robots to be fully deployed from April 2023 (NTT DATA)
NTT DATA Corporation has been working to remotely operate and automate equipment inspection work using robots at the “NTT Shinagawa TWINS DATA Building” data center it operates (hereinafter “Shinagawa Data Center”), and announced that it has confirmed the inspection work previously performed by personnel can be reduced by about 50%.
From April 2023, NTT DATA will proceed with the introduction of robots to data centers nationwide.
Background of robot introduction
NTT DATA explained that the building management industry, including data centers, faces a serious manpower shortage, that facility management work in particular lacks skilled workers, and that labor savings and more efficient ways of working are needed.
Among facility management operations, the company believes that inspection work is highly effective in reducing manpower and that remote/automated operations are feasible through the use of digital technology, and has been conducting verification for practical application at its Shinagawa Data Center.
Overview of Robot Introduction and Changing Checking Tasks
In this initiative, a robot automatically patrols a predetermined inspection route, photographing meters, lamps, and facility exteriors and acquiring environmental data such as odors with sensors. This replaces the work of reading meters, checking lamps, and checking for visual abnormalities or unusual odors that had previously been performed by humans.
With this method, a single camera or sensor can inspect multiple locations, and no modification of the equipment currently in operation is needed, making remote/automated operation cheaper and simpler to achieve than alternatives such as installing IoT cameras and sensors at each inspection point or converting to smart meters.
The robot used in this project is a next-generation avatar robot “ugo Pro” modified for facility inspection work in collaboration with ugo Corporation, a manufacturer of business DX robots.
In order to capture detailed meter readings, the robot is equipped with a 4K camera with higher image quality than the standard model, and multiple devices such as an odor sensor, microphone, and thermo camera can be mounted on the ugo itself to expand its applications depending on the inspection items.
The robot can be operated using only a PC, and its travel route can be set with no code, making it easy for on-site personnel to use the robot. The robot can switch between automatic traveling and remote control, and can be used not only for automatic inspection work, but also for multiple applications, such as work support from a distance.
These features not only allow the robot to handle a variety of inspection items, but also to expand its applications to include remote work support and construction attendance.
By using robots and sensors to remotely/automatically perform inspection work, not only can work hours be reduced, but also the threshold values for determining abnormalities, which used to rely on human senses, can be quantified to enable detection of abnormalities without relying on skilled workers.
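The quantification of abnormality thresholds mentioned above can be sketched as a simple rule: each sensor reading is compared against an explicit numeric range instead of relying on a skilled worker's judgment. The reading types and limit values below are hypothetical examples, not NTT DATA's actual configuration.

```python
# Sketch of quantified abnormality thresholds: each sensor reading is
# checked against an explicit numeric range rather than a skilled worker's
# judgment. The reading types and limits here are hypothetical examples.

THRESHOLDS = {
    "temperature_c": (10.0, 35.0),   # acceptable (low, high) per reading type
    "odor_index":    (0.0, 120.0),
    "meter_kw":      (0.0, 450.0),
}

def is_normal(kind, value):
    """Return True when the reading falls inside its configured range."""
    low, high = THRESHOLDS[kind]
    return low <= value <= high

# One patrol's readings; the temperature is deliberately out of range.
readings = {"temperature_c": 41.2, "odor_index": 80.0, "meter_kw": 300.0}
anomalies = [k for k, v in readings.items() if not is_normal(k, v)]
print(anomalies)
```

Once thresholds are explicit numbers like these, any operator (or an automated alerting system) can reproduce the same judgment a veteran inspector would make.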
In addition, by making it possible to remotely perform tasks that could only be performed onsite, including work support and construction attendance, it is expected to support flexible work styles and secure new workers.
About the future
In the future, NTT DATA aims to expand the scope of automation to include recording and reporting work that currently requires personnel to perform, and to reduce the time required for inspection work by up to 80% by promoting linkage with meter reading systems and abnormality detection AI.
NTT Data will also work to enhance facility management operations, such as advanced abnormality detection and predictive maintenance of facilities, utilizing data acquired by robots and sensors.
Starting in April 2023, the initiative will be rolled out sequentially to 15 data centers nationwide.
Furthermore, based on the knowledge gained from these efforts, the company aims to offer the service commercially as a remote/automated service for facility inspection operations by the end of FY2023.
For the commercial service, ugo will use the new “ugo mini” robot, developed by drawing on the knowledge gained through the joint verification with NTT DATA, and NTT DATA will provide one-stop support for remote/automated facility management operations, from introductory consulting through system construction and operation, to solve customer problems.
The day of full-scale deployment of robots for facility inspection operations at data centers is eagerly awaited to help resolve the serious labor shortage.
2023.03.11
Start of construction announced for the “zero-emission data center” planned in Ishikari City, Hokkaido (KCCS)
On November 24, 2022, Kyocera Communication Systems Co., Ltd. (KCCS) announced that it will begin construction of a zero-emission data center in Ishikari City, Hokkaido in December 2022, with the data center scheduled to open in the fall of 2024.
In 2019, KCCS announced plans for a zero-emission data center in Ishikari, Hokkaido, which will operate on 100% renewable energy.
Subsequently, due to a change in the originally planned baseload power supply plan, the power supply configuration and data center design were revised, and now the company has announced the start of construction and opening schedule.
The data center to be constructed will be located in the Ishikari Bay New Port area of Ishikari City, Hokkaido, with a site area of approximately 15,000 square meters, total floor space of approximately 5,300 square meters (at the time of opening), and 400 racks (at the time of opening).
Toward Achieving Carbon Neutrality by 2050
In Japan, local production and local consumption of renewable energy is an important theme for achieving carbon neutrality (net-zero greenhouse gas emissions) by 2050, as is the decentralization of data centers promoted under the government’s “Digital Garden City Nation” concept. To date, the introduction of “effectively renewable” energy, which offsets environmental impact to net zero through the purchase of environmental values such as non-fossil certificates, has been progressing.
To this end, expansion of “direct use of renewable energy” is also needed to further increase the amount of renewable energy introduced.
However, it is not easy to achieve “direct use of renewable energy” in large-scale demand facilities such as data centers, as securing stable renewable energy power and economic efficiency is a challenge.
Ishikari City has been selected as a “Decarbonization Leading Region (1st round)” by the Ministry of the Environment in a publicly solicited project to achieve carbon neutrality by 2050.
In addition, KCCS has formulated the “Redesigning the Region through Local Production of Renewable Energy and Decarbonization,” a measure aimed at zero carbon, and is aiming for a decarbonized industrial cluster by supplying renewable energy to the data center cluster and surrounding facilities in the Ishikari Bay New Port area.
The zero-emission data center will utilize the abundant renewable energy sources in the region, and a new solar power plant owned by KCCS will be built in the vicinity of the data center to directly utilize those renewable energy sources.
In addition, in order to operate the data center while simultaneously ensuring the “reliability,” “environmental friendliness,” and “economic efficiency” of multiple renewable energy sources, KCCS will build its own power supply and demand control mechanism utilizing storage batteries and AI technology.
Through this “data center business operated on 100% renewable energy” in Ishikari City, KCCS aims to demonstrate the feasibility of local production for local consumption of renewable energy, and to contribute to regional revitalization through decentralized data storage in Japan and the creation of jobs for data center technicians and energy-related engineers.
Expectations are high for the opening of a “zero-emission data center” to achieve carbon neutrality by 2050.
2023.03.01
Commitment to be water positive by 2030 (AWS)
What is Water Positive
Water positive means returning more water than you consume. With freshwater shortages becoming an issue around the world, companies are making various efforts to secure water.
There are two main ways to do it: either reduce consumption or increase supply.
Consumption can be reduced through measures such as water conservation and recycling, while supply can be increased by, for example, investing in regions and businesses facing high water stress, such as water scarcity and water pollution.
AWS Committed to Reducing Water Use in Data Centers
Amazon Web Services (AWS) is the latest hyperscaler to commit to making its business water positive.
At the AWS re:Invent event held in Las Vegas, the company announced a policy to achieve water positive by 2030, returning more water to the community than it uses directly in its operations.
AWS CEO Adam Selipsky said:
“Water scarcity is a major problem around the world, and with today’s water positive announcement, we are committing to do our part to help solve this rapidly growing problem.
To ensure universal access to water, we need to develop new ways to conserve and reuse this precious resource. While we are proud of what we have achieved so far, we also believe that more can be done.
We are committed to leading water stewardship in our cloud business and giving back more water than we use in the communities in which we operate. We believe this is the right thing to do for the environment and our customers. ”
The company’s efforts toward this goal include analyzing water usage in real time, using IoT technology to identify and fix leaks, using recycled water or rainwater for cooling, reusing water multiple times through on-site water treatment systems, adopting waterless cooling where possible, and funding various water replenishment activities.
In 2021, AWS said it achieved a global Water Use Efficiency (WUE) index of 0.25L water per kWh. In Ireland and Sweden, AWS says it doesn’t use water to cool its data centers 95% of the year.
According to a US Department of Energy report, the average evaporative cooling data center WUE is 1.8L per kWh.
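To put the two WUE figures in perspective, here is a back-of-the-envelope comparison for a hypothetical 10 MW data center running at full load year-round; the facility size and utilization are assumptions made purely for the arithmetic.

```python
# Back-of-the-envelope comparison of the two WUE figures cited above,
# applied to a hypothetical 10 MW data center at full load year-round.
# Facility size and utilization are assumptions made for the arithmetic.

IT_LOAD_KW = 10_000        # hypothetical facility
HOURS_PER_YEAR = 8_760

def annual_water_liters(wue_l_per_kwh):
    """Annual cooling-water use implied by a given WUE (liters per kWh)."""
    return IT_LOAD_KW * HOURS_PER_YEAR * wue_l_per_kwh

aws_2021 = annual_water_liters(0.25)   # AWS's reported 2021 global WUE
doe_avg = annual_water_liters(1.8)     # DOE average for evaporative cooling

# Roughly 21.9 vs 157.7 million liters per year for the same facility.
print(aws_2021 / 1e6, doe_avg / 1e6)
```

Under these assumptions, the gap between the two WUE figures amounts to well over 100 million liters of water per year for a single mid-sized facility.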
In the UK, AWS is working with The Rivers Trust and Action for the River Kennet to create two wetlands on tributaries of the River Thames.
“England’s rivers are national treasures, and we are delighted to partner with AWS and work with our member trusts here to protect the Thames and its tributaries,” said Mark Lloyd, CEO of The Rivers Trust.
“AWS’ commitment to be water positive by 2030 drives the actions needed to help restore rivers and water resources impacted by climate change.
We look forward to expanding our relationship with AWS and using this partnership to demonstrate similar avenues for other companies to jointly support water management activities that improve the resilience of rivers. ”
Data centers use a lot of water for cooling, but it’s not clear how much water the industry actually uses.
Researchers estimate that on average in the US, 1 MWh of data center energy consumption requires 7.1 cubic meters of water, but this can vary widely by region and facility.
Efforts of Google, Microsoft and Meta
Google, Meta and Microsoft have committed to being water positive by 2030, but many of their facilities now use millions of gallons of water per day.
Colocation and peering service provider CyrusOne, which owns and operates over 40 carrier-neutral data centers in North America, Europe and South America, claims several of its facilities are water positive.
Morningstar Sustainalytics, a leading ESG research, ratings, and data provider that has helped investors around the world develop and implement responsible investment strategies, previously released a report showing that Microsoft is leading the market in water conservation efforts.
European data center operators pledged to the European Commission earlier this year to reduce water usage to no more than 400 ml per kWh of computing power by 2040.
Due to the effects of global warming and population growth, water shortages are becoming a problem around the world. We will continue to keep an eye on the water positive initiatives of major companies.
2025.04.30
Sakai Takes Center Stage as the Starting Point for Japan’s Stargate Project
The “Stargate Project,” a large-scale AI infrastructure development initiative spearheaded by SoftBank Group and OpenAI, is focusing on Sakai City as the central location for its expansion in Japan. Specifically, SoftBank plans to repurpose a former liquid crystal display panel factory owned by Sharp in Sakai. The company has acquired a portion of this facility for approximately 100 billion yen with the goal of transforming it into a cutting-edge AI data center.
This facility will be the third major site for the project, following an existing base in Tokyo and another under construction in Hokkaido. It boasts an impressive power capacity of 150 megawatts, making it one of the largest in Japan. Operations are slated to begin in 2026, with plans to expand capacity to 250 megawatts in the future. Sakai’s favorable location and infrastructure conditions are expected to ensure the long-term stability of the data center’s operations.
SB OpenAI Japan to Drive Domestic AI Development and Adoption
At the heart of this project is “SB OpenAI Japan,” a joint venture established in February 2025 by SoftBank and OpenAI. This company aims to develop large language models (LLMs) specifically tailored for the Japanese language and provide “Crystal Intelligence,” a generative AI service for businesses.
The Sakai data center is planned to host the operation of AI agents powered by GPUs, utilizing the foundational models provided by OpenAI. These agents will be specialized for various corporate functions, such as human resources and marketing, with the aim of delivering customized AI solutions that meet specific business needs.
These efforts have the potential to significantly accelerate the digital transformation of Japanese companies.
Creating the Future Through Massive Investment and Industrial Fusion
SoftBank is planning a large-scale build-out that will require 100,000 GPUs for this AI infrastructure, an investment that, by a simple calculation, could approach one trillion yen. The GPUs are expected to be supplied by U.S.-based NVIDIA and through the Stargate Project itself.
SoftBank President Miyakawa stated, “We aim to make Sakai a hub for the fusion of AI and existing industries, serving as an experimental ground for new business models and solutions to challenges.” This highlights the expectation that the facility will not just be a data center, but a key driver in the evolution of the AI industry both domestically and internationally.
Furthermore, this initiative is poised to be a crucial step in enhancing productivity across various industries and addressing labor shortages.
2025.04.22
The Evolution of AI and the Shift to the Inference Phase
In March 2025, U.S.-based NVIDIA held its annual developer conference, “GTC,” and announced its new software “Dynamo,” specifically designed for inference processing. This announcement comes against the backdrop of a significant shift in AI’s evolution, moving from a primary focus on “learning” to “inference.”
NVIDIA, a company that has historically excelled in technologies for training AI models, emphasized that its hardware and software are now essential for inference as well. CEO Jensen Huang stressed that accelerating inference processing is key to determining the quality of AI services.
Key Features of the New “Dynamo” Software
Dynamo will be available as open-source software and is designed to accelerate inference processing by efficiently coordinating multiple GPUs. When combined with the latest “Blackwell” GPU architecture, it can reportedly increase the processing speed of the “R1” AI model from the Chinese AI company DeepSeek by up to 30 times compared to previous methods.
A core feature is a technique called “disaggregated serving,” which significantly improves processing efficiency by separating the inference process into two phases, “prefill” and “decode,” and assigning them to different GPUs.
Furthermore, by leveraging a technology called “KV cache” to store and reuse past token information, Dynamo reduces computational load. The “KV Cache Manager” integrated into Dynamo enables efficient cache management to avoid exceeding GPU memory limits.
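The two ideas above can be illustrated with a toy sketch. Dynamo's actual implementation is far more sophisticated; the class and function names below are invented for illustration and are not NVIDIA's API. The sketch separates a "prefill" step (processing the prompt once) from a "decode" step (generating tokens), and reuses prefill results through a small capacity-limited cache in the spirit of the KV Cache Manager.

```python
from collections import OrderedDict

class KVCacheManager:
    """Toy KV cache with a token-capacity limit and LRU eviction.
    Loosely modeled on the role of Dynamo's KV Cache Manager;
    names and sizes here are illustrative only."""

    def __init__(self, capacity_tokens):
        self.capacity = capacity_tokens
        self.cache = OrderedDict()  # prompt -> list of per-token "KV" entries

    def get(self, prompt):
        if prompt in self.cache:
            self.cache.move_to_end(prompt)  # mark as recently used
            return self.cache[prompt]
        return None

    def put(self, prompt, kv):
        self.cache[prompt] = kv
        self.cache.move_to_end(prompt)
        # Evict least-recently-used entries once the token budget is exceeded
        while sum(len(v) for v in self.cache.values()) > self.capacity:
            self.cache.popitem(last=False)

def prefill(prompt):
    """'Prefill' phase: process the whole prompt once, producing KV state.
    In a disaggregated deployment this runs on a dedicated GPU pool."""
    return [f"kv({tok})" for tok in prompt.split()]

def decode(kv, n_tokens):
    """'Decode' phase: generate tokens one by one, reusing the KV state.
    This runs on a separate GPU pool from prefill."""
    return [f"tok{i}" for i in range(n_tokens)]

def serve(prompt, cache, n_tokens=4):
    kv = cache.get(prompt)       # reuse cached prefill work if possible
    if kv is None:
        kv = prefill(prompt)     # cache miss: pay the prefill cost once
        cache.put(prompt, kv)
    return decode(kv, n_tokens)

cache = KVCacheManager(capacity_tokens=8)
serve("hello world", cache)  # cache miss: prefill + decode
serve("hello world", cache)  # cache hit: decode only
```

The point of the separation is that prefill is compute-bound while decode is memory-bound, so routing the two phases to different GPU pools lets each pool be sized and scheduled for its own bottleneck.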
The Trade-off Problem and Hardware Evolution
In his keynote speech, CEO Huang highlighted the trade-off between “total tokens per second (throughput)” and “tokens per user (latency)” in inference. This illustrates the dilemma where faster response times can limit the number of concurrent users, while supporting more users can lead to increased response delays.
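This trade-off can be made concrete with a little arithmetic. Assume, purely for illustration, that a cluster's aggregate decode throughput is a fixed budget shared evenly among concurrent users (real hardware behaves less simply, and the figure below is invented):

```python
# Illustrative assumption: the cluster's aggregate decode throughput
# is fixed at 10,000 tokens/s regardless of batch size.
TOTAL_TOKENS_PER_SEC = 10_000

def per_user_rate(concurrent_users):
    """Tokens per second each user sees when the budget is shared evenly."""
    return TOTAL_TOKENS_PER_SEC / concurrent_users

for users in (10, 100, 1000):
    print(f"{users:>5} users -> {per_user_rate(users):8.1f} tokens/s per user")
```

With 10 users each sees 1,000 tokens/s but the cluster serves few people; with 1,000 users each sees only 10 tokens/s. Raising the total budget through faster hardware is the only way to improve both axes at once, which is the logic behind the hardware strategy described below.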
To address this, NVIDIA has adopted a strategy of overcoming this trade-off through hardware enhancements. The newly announced “Blackwell” architecture boasts up to 25 times the processing power of its predecessor, “Hopper,” enabling a balance between quality and scale.
Continued Strong Investment in AI-Related Data Centers
As the primary use case of AI shifts towards inference, the demand for computational processing is experiencing exponential growth. Following “Blackwell,” NVIDIA has unveiled development plans for even higher-performance GPUs, such as “Rubin” and “Feynman,” with Dynamo evolving as the corresponding software foundation.
To support such high-density and high-performance AI processing, distributed and large-scale computing environments are essential. Consequently, with the expansion of AI agents and generative AI, investment in data centers as the underlying infrastructure is expected to remain robust in the future.
2025.03.25
U.S. Real Estate Investment and Development Firm APL Group Plans to Build Large-Scale Data Centers in Itoshima and Kitakyushu Cities
The importance of data centers is growing with the need to process vast amounts of data generated by the spread of digital devices, the advancement of self-driving vehicles, and the development and use of generative AI.
While approximately 80% of Japan’s major data center demand is concentrated in Tokyo and Osaka, it has been revealed that Asia Pacific Land (APL) Group, a U.S. real estate investment and development firm, plans to build large-scale data centers in Itoshima and Kitakyushu Cities, Fukuoka Prefecture.
Construction of Kyushu’s Largest Data Center in Itoshima City, with an Investment Exceeding 300 Billion Yen, Scheduled to Begin This Spring
Construction of one of Kyushu’s largest data centers is scheduled to begin in the Taku and Tomi districts of Itoshima City in the spring of 2025. This data center will have a total power receiving capacity of 300,000 kilowatts, and the investment amount will exceed 300 billion yen.
The location is in the southeastern part of the Maebaru Interchange on the Nishi-Kyushu Expressway.
The plan is to construct six data centers on a 122,000 square meter site.
Construction will begin with site preparation in the spring of 2025, and data center operations will gradually commence from 2029.
Construction of a 120,000 Kilowatt Data Center in Kitakyushu City, Aiming to Start by the Fall of 2027
In addition, APL Group acquired a 62,822 square meter city-owned site in the Kitakyushu Science and Research Park (Wakamatsu Ward, Kitakyushu City) in November 2023, and plans to invest 125 billion yen to build a data center with a total power receiving capacity of 120,000 kilowatts. The aim is to start construction by the fall of 2027.
This will be the second large-scale data center to be established in Kitakyushu City since 2007.
APL cited the proximity to submarine cable landing points and the future potential for renewable energy utilization as reasons for selecting Kitakyushu, taking into account its geographical proximity to Asia. They also expect to capture demand from domestic and East Asian companies.
Potential for Increased Attention as a Candidate for Decentralized Data Center Locations
The construction of data centers in Kyushu is aimed at decentralizing data centers as a risk hedge against various disasters, including the Nankai Trough earthquake, and also takes advantage of the proximity to submarine cable landing stations to Asia.
Kitakyushu City has proposed a “Backup Capital Concept” to serve as a hub for companies, data centers, and government agencies concentrated in Tokyo. The construction of a large-scale data center in the Kitakyushu Science and Research Park is likely to give momentum to the city’s concept. There is also the possibility that Kitakyushu, with its low disaster risk, will attract more attention as a candidate for decentralized data center locations, and expectations are high for its development.
2025.03.18
Ishiba Cabinet Announces Establishment of Council for Integrated AI Infrastructure Development, Aiming to Balance Decarbonization and Regional Revitalization While Addressing Electricity Concerns
Prime Minister Shigeru Ishiba announced at the Digital Administrative Reform Conference in February that a public-private council would be established to integrate the development of data centers and power plants, in anticipation of increased electricity demand driven by the spread of artificial intelligence (AI). This initiative aims to decentralize electricity and communication infrastructure, which are currently concentrated in urban areas.
The newly established public-private council will serve as a platform for discussing specific measures, with potential participation from Tokyo Electric Power Company Group, NTT, SoftBank Group, and others.
This concept of integrating data centers and power plants is known as “Watt-Bit Collaboration.” It envisions establishing data centers near power plants, such as nuclear, wind, and solar, to promote industrial clusters.
Focusing on the cost-effectiveness of fiber optic cables compared to power transmission lines, the plan aims to efficiently transmit digital information through optical cables, contributing to the development of a new power transmission and distribution network.
Data centers are currently concentrated in Tokyo and Osaka, with the Kanto and Kansai regions accounting for approximately 90% of the total site area as of 2023, according to the Ministry of Internal Affairs and Communications. Decentralizing electricity and communication infrastructure is essential from a national resilience perspective, including disaster response.
While this initiative aims to balance a smooth transition to decarbonization with the revitalization of regional economies, there are concerns regarding electricity challenges.
AI Power Capacity in Domestic Data Centers Expected to Increase Approximately 3.2 Times by 2028
IDC Japan Corporation released its estimated results of the power capacity required for AI servers installed in domestic data centers at the end of February. The total power capacity required by AI servers in domestic data centers is expected to increase from 67 megawatts at the end of 2024 to 212 megawatts by the end of 2028, an approximately 3.2-fold increase in four years. This is equivalent to about 5 to 8 hyper-scale data centers built in the Tokyo metropolitan area and Kansai region.
This power capacity refers to the power required by servers and does not include the power required by network equipment or cooling systems.
IDC Japan explained that the current estimate significantly revises the previous estimate (approximately 80 to 90 megawatts in 2027) made in January 2024. This revision is due to a substantial upward adjustment in the forecast for AI server shipment value in the domestic market.
The background includes the rapid expansion of AI server installations by hyper-scalers, as well as the acceleration of AI server procurement by domestic service providers and research institutions through government subsidy programs.
In particular, the scale of AI infrastructure investment by hyper-scalers is significant, with hyper-scale data centers accounting for the majority of the estimated power capacity.
AI servers are known for their high power consumption and heat generation per unit. Therefore, data centers that install a large number of AI servers require liquid cooling systems instead of conventional air conditioning systems.
Some experts believe that there are still many points to consider regarding the introduction of liquid cooling systems. Finding concrete solutions to these electricity challenges will be key to realizing the integrated AI infrastructure development.