News & Topics
The global market for data center liquid cooling is projected to grow from US$2.6 billion to US$7.8 billion over the forecast period (2023-2028), a compound annual growth rate (CAGR) of 24.84%.
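As a quick sanity check, a rough sketch using only the figures quoted above shows that compounding US$2.6 billion at 24.84% for five years lands close to the projected 2028 figure:

```python
# Sanity check on the projection above, using only figures from the article.
start_value = 2.6   # US$ billion in 2023
cagr = 0.2484       # 24.84% compound annual growth rate
years = 5           # 2023 -> 2028

projected = start_value * (1 + cagr) ** years
print(f"Projected 2028 market size: US${projected:.1f} billion")  # ≈ US$7.9 billion
```

The small gap to the quoted US$7.8 billion is consistent with rounding in the published CAGR.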
According to the latest study, Asia Pacific is expected to be the fastest-growing region in the data center liquid cooling market during the forecast period, helped by growing awareness of the value of sustainable practices and green data centers in the region.
What is a liquid cooling system?
In order to reduce greenhouse gas emissions from data centers, green data centers are being built using renewable energy and the development of energy efficient solutions is accelerating.
According to current estimates, data centers consume about 3% of the world’s total electricity, and a substantial share of the energy used in data centers goes to cooling.
Liquid cooling systems use water or other liquids to directly cool servers and other equipment in data centers. These systems are more efficient than air-based cooling systems, but require specialized equipment and maintenance. Cooling is essential for data centers to ensure that equipment operates at optimal temperatures and avoids overheating that can lead to system failure and data loss.
Background to the expansion of the liquid cooling market
Advances in technology have made liquid cooling easier to maintain, easier to expand, and more affordable. As a result, cooling energy use has reportedly been reduced by more than 15% in data centers built in hot and humid climates, and by as much as 80% in cooler regions. The heat removed by liquid cooling can be recycled to warm buildings and water, and advanced engineered coolants can effectively reduce the carbon dioxide emissions attributable to air conditioning.
2023.09.27
The edge data center market is expected to expand significantly at an estimated compound annual growth rate (CAGR) of 22% from 2023 to 2031.
According to Research And Markets, this surge is attributed to the growing need for low-latency data processing and storage solutions. Edge data centers are in close proximity to end users, which speeds data processing and reduces network congestion.
The main drivers are population growth, rising smartphone usage, and the push for smart cities in Asia Pacific, especially in China, Japan, and India, all of which are fueling demand for edge data centers. The growing adoption of cloud services and the expansion of e-commerce are also contributing to the rapid market growth in the region.
In India, digital technology company Varanium Cloud announced in February that it will launch its second edge data center under its Hydra brand in Kudal, Sindhudurg, Maharashtra.
According to the company, the edge data center will be housed in a shipping container to enhance accessibility and enable effective data sharing and communication in even the most remote areas of the country. The containers are also small and portable, so they can easily be transported to any location.
What is the Edge Data Center?
Edge data centers are data centers located in close proximity to the end users they serve, reducing network latency and enabling low-latency processing.
The advantage of an edge data center is that it does not require the development of a wide-area communications network and provides strong security.
When connecting to an edge data center via the Internet or other means, edge servers are effective. Placing content as close to the user as possible stabilizes the communication path and avoids the delays introduced by roundabout routes.
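The physical intuition behind this proximity can be sketched with back-of-the-envelope arithmetic: even ignoring routing and processing, signal propagation alone puts a floor on round-trip time that grows with distance. The 200,000 km/s figure for light in fiber and the sample distances below are illustrative assumptions, not figures from the article:

```python
# Illustrative lower bound on round-trip time from propagation delay alone.
# Real-world latency adds routing, queuing, and server processing on top.
SPEED_IN_FIBER_KM_S = 200_000  # light travels through fiber at roughly 2/3 c

def min_rtt_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over a given distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

print(f"Edge data center at 50 km:    {min_rtt_ms(50):.2f} ms")    # 0.50 ms
print(f"Distant data center, 5000 km: {min_rtt_ms(5000):.2f} ms")  # 50.00 ms
```

Two orders of magnitude separate the two cases before any other overhead is counted, which is why placing content near the user matters for time-critical workloads.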
Status of Edge Data Center Utilization
In general, edge data centers are favored by telecommunications companies for the low latency they offer end users, but they have also become essential in many other fields.
In the healthcare industry, electronic medical records have been increasingly adopted and vast amounts of patient data have been accumulated. Edge data centers are being utilized to store this vast amount of data and manage the information at a high level of security.
By utilizing edge data centers, appropriate medical care can be provided without communication delays.
In the financial industry, edge data centers enable smooth transactions without delays. More profit can be gained by processing large amounts of data and ensuring secure communications.
IoT devices, hardware programmed for specific applications that transmit data, are also heavy users, especially in time-critical scenarios. In manufacturing, edge equipment supports predictive maintenance and can help improve the efficiency of inventory management.
Edge Data Centers Contribute to Decentralized Communications
The number of people teleworking has increased due to COVID-19, and online communication such as web conferencing has become commonplace.
Remote work has increased communication traffic around the world, and the growing number of companies offering subscription services has further expanded personal communications.
Against this backdrop, traffic has concentrated in ever-larger data centers. However, concentrating communications in a few destinations can have a significant impact on social infrastructure in the event of a disaster.
Edge data centers are attracting attention partly because they contribute to the decentralization of communications.
Toward the Expansion of the Edge Data Center Market
Edge computing is necessary for building a high-speed, low-latency communication environment. It is important to design a network that successfully combines cloud computing and edge computing.
Companies and service providers working on IoT will need to consider building a new infrastructure environment based on a good understanding of the characteristics of edge data centers.
2023.09.15
The Digital Agency has announced a policy to ease the selection requirements for providers of the government cloud shared by the central government and local governments.
The government cloud is a common infrastructure of cloud services used by the central government and local governments. The government has set a goal of making systems for 20 mission-critical tasks handled by municipalities, such as taxes and national pensions, available on the government cloud by the end of fiscal 2025.
The current rules, which require a single company to meet approximately 330 requirements, will be revised to allow participation by a coalition of companies. This may make it easier for domestic companies to enter a government cloud market that currently relies on foreign providers.
Until now, few businesses have been able to satisfy the wide range of selection requirements on their own. In the 2022 public offering, only four providers, all backed by US companies, met technical requirements such as security and business continuity: Amazon Web Services (AWS), Google, Microsoft Japan, and Oracle Japan.
Domestic companies are unable to meet the requirements due to the scale of their business and the nature of their services. Particular hurdles include building a system to provide operational support from system development, using multiple data centers, and providing a development environment where artificial intelligence (AI) can perform machine learning.
Only giant IT (information technology) companies, known as “hyperscalers” such as AWS and Google, are able to achieve this on their own.
The agency is expected to announce the new selection requirements for government cloud providers and start public offerings as early as August. While most of the current items will be retained, selected businesses will be allowed to provide services jointly with other companies as long as they remain responsible for core technologies such as data management and authentication.
The selection of a government cloud provider is expected to be decided in late October.
Background of Easing Selection Requirements
Behind this easing of selection requirements is the recommendation that “the selection criteria for cloud service providers that store data and provide government cloud services should be reviewed.” It is quite likely that companies such as Sakura Internet and Internet Initiative Japan will seize this new opportunity to enter the domestic government cloud market.
Despite these amendments, however, the Digital Agency stated that the right of municipalities to select providers will be maintained. In other words, the changes that the amendments will bring to the actual selection process may only be limited.
Many cloud service providers are requesting joint participation in the government cloud by multiple companies. The results of the selection process after the requirements are eased are likely to attract much anticipation and attention.
2023.08.31
In this issue, we will first look at the current status and future intentions of the data center business of Japanese companies.
NTT
The company plans to spend more than 1.5 trillion yen over the next five years to expand its data centers. NTT plans its largest expansion in India, where potential demand is expected from the growth of major overseas IT companies and population growth; it aims to increase its locations there from the current 12 to about 24 by fiscal 2025. NTT also wants to increase its North American locations from 14 to 23.
Softbank
SoftBank Corp., in collaboration with U.S. semiconductor giant NVIDIA, aims to build a platform for generative artificial intelligence and 5G/6G applications and introduce it to new AI data centers in Japan.
The platform is based on NVIDIA chip technology. To reduce costs and improve energy efficiency, SoftBank plans to build data centers that can host generative AI and wireless applications on a multi-tenant shared server platform.
Kansai Electric Power
Kansai Electric Power Company (KEPCO), in partnership with U.S. data center operator CyrusOne, has begun developing data centers in Japan with the ambitious goal of reaching 900 MW of operating capacity. The CyrusOne KEP joint venture is a hyperscale platform company focused on developing new data centers tailored to demand; it aims to enhance resiliency, efficiency, and smart development in the industry by linking data center infrastructure with the broader power grid.
In this way, the data center business of Japanese companies appears to be energized and growing steadily.
So what is behind this?
Background of the activation of the data center industry
Behind this is the progress of digitization, such as generative AI (Artificial Intelligence). If we become a data-driven society in which decisions are made based on data, data will accumulate at an accelerated pace.
Akira Shimada, president of NTT, which is focusing on the data center business, said, “We want to develop semiconductors that use light (instead of electrons) over the next 30 years, investing 100 billion yen per year in research and development. As a start, we plan to begin manufacturing related components that use light after 2023. In addition to incorporating them into telecommunications equipment and servers, we also aim to apply them to more general electronic devices,” suggesting that the data center business is closely linked to semiconductor development.
Semiconductors that use light will consume far less electricity, which is in line with the times in terms of sustainability.
Junichi Miyagawa, President and CEO of SoftBank, said, “We are entering an era of coexistence with AI, with rapidly increasing demand for data processing and electricity. We aim to provide next-generation social infrastructure to support a digitalized society in Japan.”
Today, the development of generative AI (Artificial Intelligence) is remarkable. We may be at a turning point, with companies upgrading the services they deploy as part of their growth strategies.
2023.08.10
As the use of the cloud accelerates, the so-called “return to on-prem” movement, in which in-house systems that were once migrated to the cloud are brought back on-premises, is said to be becoming apparent.
Last year, the Rakuten Group decided to return to on-premises. The group is expanding its private cloud “One Cloud” and consolidating the IT infrastructure used by the various businesses of its group companies. In principle, many systems currently running on public clouds will be shifted to One Cloud. In addition to improving cost efficiency by consolidating the group’s IT infrastructure into a private cloud, the company plans to accumulate infrastructure know-how for stable operation and enhanced security.
The private cloud will also serve as the basis for corporate IT services the group plans to launch, including eKYC identity verification, website access analysis, and electronic payment functions. These technologies were developed for use in the group’s own businesses, and preparations are underway to sell them externally as pay-as-you-go cloud services.
With the rise of cloud-first policies, opportunities to introduce on-premises servers are said to be steadily decreasing for many companies. Yet the server market remains strong. At first glance this seems contradictory; what is behind it?
Background of the “return to on-premises”
The server market grew favorably in 2022, up 10-20% year on year.
Even in the early 2000s, when server virtualization began to spread, it was said that server consolidation would hurt server sales. In reality, the opposite happened: virtualization made servers easier to procure, the introduction of new systems became more active, and demand for servers grew.
Currently, with the tailwind of DX, IT investment is becoming active, and system utilization that was not possible before is spreading. Given the rapid increase in server resources required by the cloud, the expansion of the server market is rather natural.
On the other hand, one reason behind the recent return to on-premises servers is that misconceptions about the cloud have been dispelled through actual use. In retrospect, the cloud attracted great expectations for providing resources at extremely low cost and for reducing workload by outsourcing operations.
In reality, however, there are many cases in which companies face unexpectedly high bills because they used the cloud without understanding its cost characteristics.
In terms of operation, the burden of looking after hardware disappears, but operating the system itself remains. Cloud management requires different knowledge than on-premises operation, and since many companies now run systems both on-premises and in the cloud, dual management is inevitable. This is no small burden for busy IT departments.
There are also security issues. Legacy systems that handle highly confidential data cannot be run on public clouds, so they cannot be retired from on-premises. As a result, IT operations become more complex, increasing the operational management load.
As understanding of these realities has spread, there is a swing back toward a style of coexistence with the cloud: systems once migrated to the cloud, starting with those judged “not suitable,” are being brought back on-premises.
The return to on-premises is still in progress. However, companies that have experienced the cloud know its advantages. Is conventional on-premises IT infrastructure really what such companies should aim for?
There are two approaches to the current “return to on-premises”
There are currently two approaches. One is to leave server-related data in the cloud and bring back only the data that is key to DX. The other is to bring the whole system back on-premises. The problem is the latter.
Clearly, the answer is not a cost-driven three-tier configuration (groups of servers and shared storage connected by a network fabric) designed with single points of failure (SPOF). This does not mean the three-tier model is wrong for every proposal; but even with appropriate countermeasures in place, the larger the scale, the more complex it becomes. No one wants to return to a situation where every review triggers long lead times for discussions among the server, storage, and network teams, and where hardware generation gaps drive up replacement costs.
What companies should aim for is a cloud-like virtualization platform. From that point of view, HCI (Hyper-Converged Infrastructure), which realizes cloud-like system infrastructure, is attracting attention, and its reputation has improved greatly. HCI implements pre-verified server, storage, and network functions in software, packages them in a single box, and ships with virtualization middleware, greatly simplifying the configuration of the IT infrastructure. Even without specialized knowledge, resources can be expanded simply by adding nodes, achieving scalability close to that of the cloud.
HCI, which can maintain a simple configuration, is likely to continue to attract attention in the future, in contrast to the 3-tier configuration, which increases in complexity as it grows in scale.
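The scale-out model described above amounts to simple arithmetic: usable capacity grows roughly linearly with node count, minus a replication overhead kept for resilience. The node specs and replication factor in this sketch are hypothetical, chosen purely for illustration:

```python
# Illustrative sketch of HCI capacity planning (hypothetical node specs):
# each added node contributes compute and storage, so capacity scales
# roughly linearly with node count.
from dataclasses import dataclass

@dataclass
class HciNode:
    cores: int = 32           # assumed per-node CPU cores
    ram_gb: int = 512         # assumed per-node memory
    storage_tb: float = 20.0  # assumed per-node raw storage

def cluster_capacity(nodes: int, replication_factor: int = 2) -> dict:
    """Aggregate usable capacity; raw storage is divided by the replication
    factor because HCI typically mirrors data across nodes for resilience."""
    n = HciNode()
    return {
        "cores": nodes * n.cores,
        "ram_gb": nodes * n.ram_gb,
        "usable_storage_tb": nodes * n.storage_tb / replication_factor,
    }

print(cluster_capacity(4))  # start small...
print(cluster_capacity(6))  # ...scale out by simply adding two nodes
```

Contrast this with a three-tier design, where growing the cluster can mean separately re-sizing server, storage, and network layers.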
2023.07.25
Irish government restricts data center development
Ireland’s Commission for Regulation of Utilities (CRU) has decided to limit the strain on the power grid by imposing a de facto moratorium on new data center development in the Dublin metropolitan area.
Ireland’s national transmission operator EirGrid said in response that it would only consider new applications for grid connection on a case-by-case basis. The restrictions could reportedly last until 2028.
Martin Shanahan, CEO of Ireland’s Industrial Development Authority (IDA), recently said that new data centers “are unlikely to occur in Dublin and the East Coast at this time.”
Google has asked Irish regulators not to impose a moratorium on data center development in the country.
In a filing with the Commission for Regulation of Utilities (CRU), the search and cloud company said a moratorium on data center development must “absolutely” be avoided.
Google said such a ban would send a “wrong signal” about Ireland’s digital economy ambitions and would make further investment in the country’s infrastructure “impossible,” according to a Freedom of Information request first reported by The Irish Times.
In the filing, Google called for more transparency about where the Irish network has existing power capacity, and for EirGrid to be clearer and more open about its projections of data center power usage growth.
Growing Demand for Cloud Computing, Google’s Proposal
Google, which launched its first data center in Ireland in 2012, has proposed a new pricing structure for data center operators who reserve more capacity than they ultimately need or who grow into that capacity too slowly.
“Transmission tariffs can be designed so that consumers whose demand is not growing toward their maximum reserved capacity are charged more than consumers who demonstrate an increase each year,” the company says.
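One way to read that proposal is as a utilisation-sensitive tariff. The sketch below is a hypothetical interpretation, not Google's actual design; the rates, the 80% utilisation threshold, and the 1.5x surcharge are all invented for illustration:

```python
# Hypothetical sketch of the tariff idea described above: operators whose
# actual demand is not growing toward their reserved grid capacity pay a
# surcharge. All rates and thresholds here are invented for illustration.

def transmission_tariff(reserved_mw: float, used_mw: float,
                        growth_rate: float,
                        base_rate: float = 100.0,
                        surcharge: float = 1.5) -> float:
    """Annual charge per reserved MW; the surcharge applies when utilisation
    is low and demand is not demonstrably increasing year over year."""
    utilisation = used_mw / reserved_mw
    underused = utilisation < 0.8 and growth_rate <= 0.0
    rate = base_rate * (surcharge if underused else 1.0)
    return reserved_mw * rate

# An operator growing into its reservation vs. one sitting on idle capacity:
print(transmission_tariff(10, 6, growth_rate=0.15))  # growing: base rate
print(transmission_tariff(10, 4, growth_rate=0.0))   # idle: surcharged
```

The design goal such a scheme targets is freeing up scarce grid headroom: hoarding unused capacity becomes more expensive than actually using it.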
EirGrid and politicians have previously suggested moving data center development to the west of Ireland (away from Dublin’s constrained areas and closer to renewable energy sources), but Google points out that this is not a viable solution.
“The demand for cloud computing is in Dublin, and we cannot serve it from elsewhere,” the company argues.
A separate filing by AWS says Ireland has missed opportunities in the past to address supply issues.
“Over the past decade, there were opportunities to do reinforcement work, prepare the grid for growth and investment, and prepare it for greater integration of intermittent resources,” the filing said.
Both the Social Democrats and the People Before Profit parties have been calling for a nationwide moratorium on future data center projects for the past 12 months. The PBP bill proposed an outright ban on data centers, liquefied natural gas plants, and new fossil fuel infrastructure.
In Dublin last month, South Dublin County Council (SDCC) voted to block future data center construction in the county as part of a new development plan.
What is the background behind the Irish government’s moratorium on data center development?
Background to the Irish Government’s Data Center Moratorium
Behind this are the Irish government’s emissions and renewable energy targets.
According to EirGrid, data center energy usage is projected to increase by 9 TWh by 2030, reaching 23% to 31% of Ireland’s grid supply in that year. This comes at a time when the government wants to cut emissions by 60-80% by increasing the share of renewable energy. At the same time, the government wants to decarbonize heating and transportation by electrifying them, further increasing demand on the grid.
According to The Irish Times, EirGrid has agreed to connect an additional 1.8 GW of data centers to the grid, against current peak demand of around 5 GW, with a further 2 GW of applications pending.
The Government Statement on the Role of Data Centres in Ireland’s Enterprise Strategy, published in 2018, emphasized the positive role of data centers in the country’s economic performance. However, it is reported that the statement will now be reviewed to align with sectoral emissions caps and renewable energy targets, concerns about continued security of supply, and the demand flexibility measures now needed, and that further tightening of regulations will be considered.
Will it work or will it backfire?
The Irish government has imposed a moratorium on data center development even as demand soars worldwide, and the moratorium appears set to continue despite Google’s warnings. Will this decision pay off or backfire? We will keep an eye on developments.
2023.07.05