TOPICS & NEWS
During a recent trade mission from Ireland to South Korea, a contract was signed between the Irish development company Lumcloon Energy and a construction subsidiary of SK Group.
According to the Irish Times, the two companies are planning a “fuel cell-powered data center” in Ireland that will not be connected to the electricity grid and will use gas fuel cells.
What is a fuel cell?
A fuel cell is a device that generates electricity through a chemical reaction between a fuel (usually hydrogen) and an oxidizing agent such as oxygen. In particular, solid oxide fuel cells (SOFCs) operate at high temperatures and are highly efficient, but typically use hydrocarbon fuels such as natural gas.
The chemical energy of the fuel is converted directly into electrical energy, and as more environmentally friendly fuel sources mature, a transition to hydrogen is expected in the future.
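As a textbook illustration (not specific to the planned Irish facility), the electrode reactions of a solid oxide fuel cell running on hydrogen can be written as follows; natural gas is typically reformed into hydrogen and carbon monoxide before it reaches the anode.

```latex
% Illustrative SOFC electrode reactions for hydrogen fuel (textbook chemistry, not from the article)
\begin{align*}
\text{Cathode:} &\quad \tfrac{1}{2}\mathrm{O_2} + 2e^- \rightarrow \mathrm{O^{2-}} \\
\text{Anode:}   &\quad \mathrm{H_2} + \mathrm{O^{2-}} \rightarrow \mathrm{H_2O} + 2e^- \\
\text{Overall:} &\quad \mathrm{H_2} + \tfrac{1}{2}\mathrm{O_2} \rightarrow \mathrm{H_2O}\ (+\ \text{electricity and heat})
\end{align*}
```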
SK Ecoplant advancing fuel cell development
SK Ecoplant, formerly known as SK E&C, is a construction subsidiary of the South Korean conglomerate SK Group, which owns companies such as SK Telecom and SK Hynix. In collaboration with Bloom Energy, based in San Jose, the company is currently advancing the development of fuel cell and hydrogen power generation facilities.
They are not only providing crucial technology but also taking on a comprehensive role in the construction of data centers.
The Potential of Fuel Cells in Data Centers
The demand for data centers is increasing. The energy consumption, strain on power grids, and carbon dioxide emissions from data centers have been the subject of much political debate in recent years.
While details regarding the development schedule and operational capacity are yet to be disclosed, this initiative represents a significant step forward for future green energy solutions in Ireland.
Major global tech giants such as Microsoft and Amazon are also exploring fuel cell applications to power their data centers, signaling an industry-wide shift towards more sustainable backup and primary power sources.
The success of the plan could set a benchmark for the future development of fuel cells in data centers across Europe, Japan, and other regions.
2023.11.15
In July of this year, Microsoft announced a new policy. Until now, Microsoft’s data centers for generative AI services had been located primarily in the United States and Europe, but in order to enhance service quality for Japanese customers and strengthen data management, the company has decided to handle the generative AI workloads of Japanese corporate customers entirely within data centers in Japan.
The background to this policy is that concerns had been raised regarding the management of sensitive and important information when data centers are located overseas. In a bid to address these concerns, Microsoft decided to conduct all such data transactions within Japan.
Expanding the existing data centers in eastern Japan to serve as hubs for generative AI will enable highly sensitive information to be processed solely within the country. Moreover, in line with the expansion of its generative AI operations, Microsoft plans to consider expanding its data centers in western Japan in the future.
Microsoft’s actions are impacting the entire industry, as seen in NEC’s initiation of new services utilizing data centers within Japan.
Recognition from the Headquarters for the Promotion of a Digital Society
Microsoft’s new policy has been recognized by the LDP’s Headquarters for the Promotion of a Digital Society.
At a meeting of the working team on the use of AI, former Digital Minister Hirai, who was briefed on the new policy by Microsoft, expressed his view that “research and development is progressing rapidly in countries around the world, and Japan will have more promising options in terms of providing an environment.”
Furthermore, Masaaki Taira, a member of the House of Representatives who chairs the working team, said, “If data centers are located overseas, security issues will arise, so the establishment of a base in Japan is an important proposal that will resolve one of our concerns.”
The expansion of domestic data centers will resolve the biggest concerns regarding the use of generative AI by the government and various ministries. The announcement of this new policy is likely to raise expectations for the future development of generative AI.
2023.10.28
In this issue, we explore NVIDIA, an American semiconductor manufacturer that has been in the news since its market capitalization briefly reached the $1 trillion mark at the end of May this year.
About GPUs manufactured by NVIDIA
NVIDIA manufactures GPUs (graphics processing units), chips originally designed for rendering video, images, and animation, which have long been used in gaming PCs to display graphics smoothly. In recent years, GPUs have come into the limelight as the workhorses of advanced arithmetic processing in autonomous driving technology and cryptocurrency mining.
Now, demand for these GPUs is growing. The catalysts are “data centers” and “generative AI,” exemplified by ChatGPT.
GPUs, generative AI, and data centers
Until now, it was common for data centers to be equipped with only CPUs (Central Processing Units), but with the spread of AI, data centers are increasingly being equipped with GPUs in addition to CPUs. However, only about 10-20% of data centers are equipped with GPUs.
However, the situation will change with the spread of generative AI.
Generative AI, such as image generation and natural language generation, relies on a process called “inference,” in which a model created through training is run to produce an output. Because inference is performed every time the service is used, its cumulative computational load can be enormous, so GPUs suited to massive amounts of computation are needed for inference as well.
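As a rough illustration of why GPUs matter here, the following sketch (our illustration, assuming PyTorch and an NVIDIA GPU are available; the matrix sizes are arbitrary) times the kind of large matrix multiplication that dominates inference, first on a CPU and then on a GPU.

```python
# Minimal sketch: the core of neural-network inference is large matrix multiplication,
# which GPUs execute in a massively parallel fashion.
# Illustrative only; assumes PyTorch is installed and a CUDA-capable GPU is present.
import time
import torch

x = torch.randn(4096, 4096)
w = torch.randn(4096, 4096)

# Time the multiplication on the CPU
t0 = time.perf_counter()
y_cpu = x @ w
cpu_time = time.perf_counter() - t0

# Time the same multiplication on the GPU, if one is available
if torch.cuda.is_available():
    xg, wg = x.cuda(), w.cuda()
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    y_gpu = xg @ wg
    torch.cuda.synchronize()
    gpu_time = time.perf_counter() - t0
    print(f"CPU: {cpu_time:.3f}s  GPU: {gpu_time:.3f}s")
else:
    print(f"CPU: {cpu_time:.3f}s  (no GPU found)")
```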
Generative AI is expected to become a primary workload in most of the world’s data centers, and within another decade the majority of them are likely to be equipped with GPUs.
In NVIDIA’s most recent quarterly results (May-July), sales in the data center division more than doubled in just three months, even though shipments are not keeping up with demand due to a severe supply shortage. Analysts expect the division’s revenues to exceed $60 billion in the next fiscal year (ending January 31, 2025), more than four times last fiscal year’s (ending January 31, 2023).
Why does NVIDIA have such a strong lead?
Background to NVIDIA’s near monopoly on the GPU market
NVIDIA positioned itself to advance AI very early on. In 2006, it announced CUDA, a parallel computing platform and programming model that lets developers write general-purpose applications for GPUs. CUDA became an important component of subsequent AI projects.
CUDA eventually grew to include 250 software libraries used by AI developers, and this breadth effectively made NVIDIA the go-to platform for AI developers.
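To give a flavor of what building on CUDA looks like in practice, here is a minimal sketch using CuPy, one of the many libraries layered on top of CUDA (our illustration; the library choice and array sizes are not taken from the article, and an NVIDIA GPU is assumed).

```python
# Minimal sketch of GPU computation via the CUDA ecosystem from Python, using CuPy,
# a NumPy-compatible array library that runs on NVIDIA GPUs.
# Illustrative only; requires an NVIDIA GPU and `pip install cupy`.
import cupy as cp

a = cp.random.rand(10_000_000)   # array allocated directly in GPU memory
b = cp.random.rand(10_000_000)

c = a * b + 2.0                  # element-wise math runs as CUDA kernels on the GPU
total = float(c.sum())           # the reduction also executes on the GPU

print(total)
```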
CUDA protects NVIDIA as a competitive “moat” that rivals have so far been unable to overcome. In a July conference call hosted by Bernstein Research, former NVIDIA Vice President Michael Douglas noted that software is the key to NVIDIA’s ability to pull away from its competitors. He predicted that much of the performance improvement of NVIDIA’s systems over the next few years “will be software-driven, not hardware-driven.”
The key to NVIDIA’s near monopoly has been software development.
For the time being, NVIDIA remains strong in the market.
The market for data center GPUs is expected to remain almost exclusively dominated by NVIDIA.
Nevertheless, competition is likely to intensify. In addition to competition with semiconductor manufacturers such as Intel and AMD (Advanced Micro Devices) that already handle GPUs, giant IT companies such as Google, Amazon, and Meta are also beginning to develop their own AI semiconductors.
Along with the further evolution of generative AI and NVIDIA’s developments, we will also be keeping a close eye on other companies dealing with GPUs.
2023.10.26
Demand for AI is growing rapidly due to advances in digital technology and the widespread use of smartphones, and this in turn is driving demand for data centers to house the servers of businesses and other organizations.
Why does the growing demand for AI require data centers in the first place?
Expanding use of AI and data centers
The digital transformation of society, known as “DX,” is progressing. Businesses are increasingly using AI, with ChatGPT, which answers questions in natural language, in the limelight, and the number of services requiring fast information processing, such as smartphone shopping sites, social networking services (SNS), and game applications, keeps growing.
As the scale of AI use expands and demands become more diverse every day, companies need stable and secure data and system operations.
Data centers are specialized facilities where IT equipment, network equipment, and servers are housed securely. Even when companies want to manage their IT equipment in-house, they may lack sufficient space and security measures. To solve these problems, more and more companies are turning to data centers, which excel at operating and managing IT equipment and can meet the requirements of a wide range of companies.
Massive adoption of AI has led to increased investment in data centers
Since November 2022, the mass adoption of AI, symbolized by the huge success of ChatGPT, has generated California gold rush-like interest. The investment market is once again following the Silicon Valley zeitgeist, rewarding pioneering companies like NVIDIA, Google, and Microsoft that earned a first-mover advantage through extensive R&D efforts. Investors are naturally looking for the next group of beneficiaries while avoiding high-risk or loss-making companies.
NVIDIA’s Q1 2024 earnings call was a turning point for the AI supply chain, as it highlighted the recent phenomenal growth in demand for NVIDIA’s hardware and led analysts to revise their full-year revenue estimates upward by about 40%. (Source: Refinitiv)
The company mentioned “data centers” more than 56 times when explaining its better-than-expected revenue outlook during its May 24, 2023 investor earnings call. It was clear that the company’s advanced GPUs are entirely dependent on a high-performance, secure, and stable data center environment.
Issues with AI-enabled data centers
AI-enabled data centers also have their problems. Machine learning and AI applications require a great deal of power for HPC and GPU servers. These servers draw so much power that a conventional rack often cannot accommodate a full complement of them, and the heat generated by that power consumption must be removed by proper cooling. AI-enabled data centers are therefore required to further improve their power-saving and air conditioning technologies.
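As a back-of-the-envelope illustration of the rack power problem (all figures below are assumptions for illustration, not values from the article):

```python
# Rough illustration of rack power budgeting for GPU servers.
# All figures are assumed example values, not taken from the article.
SERVER_POWER_KW = 6.0    # assumed draw of one multi-GPU AI server
RACK_BUDGET_KW = 15.0    # assumed power/cooling budget of a conventional rack
RACK_SLOTS = 8           # assumed number of servers that physically fit per rack

servers_supported = int(RACK_BUDGET_KW // SERVER_POWER_KW)
print(f"Power budget supports only {servers_supported} of {RACK_SLOTS} slots per rack")

# The heat to be removed equals the electrical power consumed, so a fully loaded
# rack would dissipate far more than the assumed budget allows.
full_rack_heat_kw = SERVER_POWER_KW * RACK_SLOTS
print(f"A fully populated rack would dissipate about {full_rack_heat_kw:.0f} kW of heat")
```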
Summary
In this issue, we have discussed the growing use of AI and the demand for data centers.
Data centers will continue to play an important role in providing consumers and businesses with new AI tools as a critical infrastructure for the digital economy. They will also be required to provide services that meet the technological trends and business needs of companies.
We hope that data center operators will ride the wave of DX by carefully assessing and discussing the issues involved.
2023.10.18
The global market for data center liquid cooling is projected to grow from US$2.6 billion to US$7.8 billion over the forecast period (2023-2028), a compound annual growth rate (CAGR) of 24.84%.
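As a quick sanity check of those figures, the CAGR implied by growth from US$2.6 billion to US$7.8 billion over five years can be computed as follows (our illustration; the small gap from the reported 24.84% comes from rounding of the endpoint values):

```python
# CAGR = (end_value / start_value) ** (1 / years) - 1
start, end, years = 2.6, 7.8, 5           # US$ billions, 2023 -> 2028
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")        # ~24.6%, consistent with the reported 24.84%
                                          # once rounding of the endpoint figures is allowed for
```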
According to the latest study, among the various regions of the world, Asia Pacific is expected to be the fastest growing region in the data center liquid cooling market during the forecast period. There is a growing awareness of the value of sustainable practices and green data centers in Asia Pacific.
What is a liquid cooling system?
In order to reduce greenhouse gas emissions from data centers, green data centers are being built using renewable energy and the development of energy efficient solutions is accelerating.
According to current estimates, data centers consume about 3% of the world’s total electricity, and a large share of the energy used in data centers goes to cooling.
Liquid cooling systems use water or other liquids to directly cool servers and other equipment in data centers. These systems are more efficient than air-based cooling systems, but require specialized equipment and maintenance. Cooling is essential for data centers to ensure that equipment operates at optimal temperatures and avoids overheating that can lead to system failure and data loss.
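A common way to express how much of a facility’s energy goes to cooling and other overhead is Power Usage Effectiveness (PUE), total facility energy divided by IT equipment energy. The sketch below uses made-up example figures purely for illustration.

```python
# Power Usage Effectiveness (PUE): total facility energy divided by IT equipment energy.
# A PUE of 1.0 would mean no overhead at all; the figures below are illustrative only.
it_energy_kwh = 1_000_000        # energy consumed by servers, storage, and network gear
cooling_energy_kwh = 350_000     # energy consumed by cooling (assumed example value)
other_overhead_kwh = 100_000     # lighting, power distribution losses, etc. (assumed)

total_energy_kwh = it_energy_kwh + cooling_energy_kwh + other_overhead_kwh
pue = total_energy_kwh / it_energy_kwh
print(f"PUE = {pue:.2f}")        # lower is better; efficient liquid-cooled sites aim closer to 1.1
```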
Background to the expansion of the liquid cooling market
Advances in technology have made liquid cooling easier to maintain, easier to expand, and more affordable. As a result, water usage has been cut by more than 15% in data centers built in hot and humid climates and by as much as 80% in cooler regions. The heat removed by liquid cooling can be recycled to warm buildings and water, and the use of advanced engineered coolants can effectively reduce the carbon dioxide emissions associated with air conditioning.
2023.09.27
The edge data center market is expected to expand significantly at an estimated compound annual growth rate (CAGR) of 22% from 2023 to 2031.
According to Research And Markets, this surge is attributed to the growing need for low-latency data processing and storage solutions. Edge data centers are in close proximity to end users, which speeds data processing and reduces network congestion.
The main drivers are population growth, rising smartphone usage, and the push for smart cities in Asia Pacific, especially in China, Japan, and India, all of which are fueling demand for edge data centers. In addition, the growing adoption of cloud services and the expansion of e-commerce are also contributing to the rapid market growth in the region.
In India, digital technology company Varanium Cloud announced in February that it will launch its second edge data center under its Hydra brand in Kudal, Sindhudurg, Maharashtra.
According to the company, the edge data center will be housed in a shipping container to enhance accessibility and enable effective data sharing and communication even in the most remote areas of the country. Such containerized centers are also small and portable, allowing them to be transported easily to any location in the country.
What is an Edge Data Center?
Edge data centers are data centers located in close proximity to end users and devices in order to reduce network latency and enable low-latency processing.
The advantage of an edge data center is that it does not require the development of a wide-area communications network and provides strong security.
When connecting to an edge data center via the Internet or other means, the use of edge servers is effective. Placing content as close to the site as possible stabilizes the communication path and avoids the delays caused by roundabout routes.
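As a rough illustration of why physical proximity matters, the sketch below estimates round-trip propagation delay over optical fiber (the distances are arbitrary examples, and real-world latency also includes routing, queuing, and processing time).

```python
# Rough estimate of round-trip propagation delay over optical fiber.
# Light travels at roughly 2/3 of c inside fiber; the distances below are arbitrary examples.
C_FIBER_KM_PER_MS = 300_000 / 1000 * (2 / 3)   # about 200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / C_FIBER_KM_PER_MS

for label, km in [("distant cloud data center", 1000), ("nearby edge data center", 20)]:
    print(f"{label} ({km} km away): ~{round_trip_ms(km):.2f} ms of propagation delay alone")
```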
Status of Edge Data Center Utilization
In general, edge data centers are favored by telecommunications companies because of the low latency they offer end users, but they have also become essential in many other areas.
In the healthcare industry, electronic medical records have been increasingly adopted and vast amounts of patient data have been accumulated. Edge data centers are being utilized to store this vast amount of data and manage the information at a high level of security.
By utilizing edge data centers, appropriate medical care can be provided without communication delays.
In the financial industry, edge data centers enable smooth transactions without delays. More profit can be gained by processing large amounts of data and ensuring secure communications.
IoT devices, hardware programmed for specific applications that transmit data, are also heavy users of edge data centers, especially in time-critical scenarios. In manufacturing, edge equipment supports predictive maintenance and can help improve the efficiency of inventory management.
Edge data centers contribute to decentralizing communications
The number of people teleworking has increased due to COVID-19, and online communication, such as web conferencing, has become more common.
The ability to work remotely has increased communication traffic around the world. The growing number of companies offering subscription services has also contributed to the increase in the volume of personal communication.
Against this backdrop, large data centers continue to expand as communications become concentrated in them. However, concentrating communications in a small number of destinations can have a significant impact on social infrastructure in the event of a disaster.
Edge data centers are attracting attention partly because they contribute to decentralizing communications.
Toward the Expansion of the Edge Data Center Market
Edge computing is necessary for building a high-speed, low-latency communication environment. It is important to design networks that successfully combine cloud computing and edge computing.
Companies and service providers working on IoT will need to consider building a new infrastructure environment based on a good understanding of the characteristics of edge data centers.
2023.09.15