
Did You Know? GIGABYTE’s Servers Are Helping to Launch Europe’s Space Ambitions

by GIGABYTE
As climate change and global warming make environmental protection an increasingly pressing issue, data centers, which traditionally consume huge amounts of electricity, are being designed to better meet the goals of energy efficiency and carbon reduction. When a national aerospace center in Europe wanted to expand its data center, it required servers that could maintain normal operation at an ambient temperature of 40°C without the need for air conditioning systems. GIGABYTE’s server team provided a Hyper-Converged Infrastructure (HCI) solution, combining compute, storage and networking into a single system and equipped with liquid cooling technology. The solution successfully overcame the harsh high-temperature environment, enabling the data center to efficiently process an enormous amount of space-related research data within a limited space.

On the 50th Anniversary of the Moon Landing, Space Research is Booming

Fifty years ago, the American astronaut Neil Armstrong climbed down from the lunar lander and became the first human being to set foot on the Moon, uttering the famous remark “That's one small step for a man, one giant leap for mankind” and setting a new milestone in mankind’s exploration of space.


People have always been curious not only about the Moon but also about other regions of outer space, and scientists continually seek new insights into the formation and evolution of our solar system. The exploration of Mars is one area of high interest, even though at our current level of technological development it will be far more difficult for humans to set foot on this planet than it was to reach the Moon. In June 2019, Japan, France and Germany began working together to jointly develop the Martian Moons eXploration (MMX) program, which aims to launch an AI-enabled robotic space probe in 2024 that will go into orbit around Mars and carry a rover, in order to understand more about the evolution of Earth’s red neighbor and the origin of its moons through robotic observation and sample collection.

Giving the Aerospace Center a Helping Hand for Large, Complex Computing Workloads
Scientists at this national aerospace center in Europe have been studying the Earth and the Solar System for decades, have designed robotic probes for Mars missions, and have cooperated with other international organizations to develop space shuttles and rockets. These highly precise technologies are often developed using large and complex sets of data, such as images returned from outer space, or trajectories and altitudes from spacecraft flight recorders, which must be rapidly analyzed using the computational capabilities of powerful servers to support further research and development.

In response to ever more research programs, ever-growing volumes of data requiring storage, and increasingly complex computing needs, the aerospace center launched a new project to expand its data center.《Glossary: What is a Data Center?》
Overcoming The Project's Challenges
● An ambient data center temperature of 40°C, with no air conditioning equipment
● Finding the most suitable cooling method for the servers without changing the data center’s existing mechanical and electrical infrastructure
● Ensuring that the temperature of the liquid used for heat exchange does not exceed 60°C
With more than 20 years of experience in the server industry, GIGABYTE teamed up with a local systems integrator to take on this project, using our H261-Z60 Hyper-Converged Infrastructure (HCI) servers to overcome the project’s challenges. This server can provide up to twice the maximum compute power of competing products on the market, allowing the space used within the data center to be reduced by 50% and meeting the aerospace center’s requirements for high-speed compute and storage capabilities in a limited space. To provide the best cooling solution for the data center, the servers were also equipped with a CoolIT liquid cooling system.
The Future Trend of Data Centers: Energy Conservation & Improved Space Utilization Efficiency
As data centers gain higher compute performance, they consume more and more energy for cooling. In recent years, however, Europe has frequently experienced heat waves, and the region’s average annual temperature keeps rising. For the expansion of this data center, it was therefore decided to adopt more energy-efficient liquid cooling technology instead of using air conditioning equipment to dissipate heat.
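
To put “more energy efficient” into concrete terms, data center cooling overhead is commonly expressed as Power Usage Effectiveness (PUE): total facility power divided by IT equipment power, where a value closer to 1.0 is better. The short sketch below uses purely illustrative numbers, not figures from this project, to show how replacing air conditioning with liquid cooling pushes PUE down.

```python
# Minimal sketch of the PUE calculation. All power figures are assumptions
# chosen for illustration; they are not measurements from this data center.

def pue(it_power_kw: float, cooling_power_kw: float, other_overhead_kw: float = 0.0) -> float:
    """PUE = total facility power / IT equipment power (lower is better)."""
    total_power_kw = it_power_kw + cooling_power_kw + other_overhead_kw
    return total_power_kw / it_power_kw

it_load_kw = 100.0                                        # assumed IT load
air_conditioned = pue(it_load_kw, cooling_power_kw=45.0)  # assumed air-conditioning overhead
liquid_cooled = pue(it_load_kw, cooling_power_kw=10.0)    # assumed pump/heat-exchanger overhead

print(f"Air-conditioned PUE: {air_conditioned:.2f}")      # 1.45
print(f"Liquid-cooled PUE:   {liquid_cooled:.2f}")        # 1.10
```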

Currently, there are two main liquid cooling technologies commonly available on the market: Immersion Cooling and Direct Liquid Cooling. Since only limited space was available, and the aerospace center did not wish to modify the data center’s existing mechanical and electrical infrastructure, Direct Liquid Cooling was chosen as the cooling solution. At the same time, however, there was a concern that if the temperature of the liquid discharged from the heat-dissipation pipes was too high, it could cause excessive wear and tear on the system’s components over the long term. It was therefore requested that the temperature of the fluid used for direct liquid cooling should not exceed 60°C.
An Immersion Cooling system uses a non-conductive dielectric fluid to assist the server with heat dissipation.
A Direct Liquid Cooling system uses liquid in heat-dissipation pipes to remove the heat from the server.
GIGABYTE’s Senior Product Marketing Manager Andie Yen talked about her experience when she first began working on the project: “I brought all the technical information and data that the customer gave me back to the R&D department for communication and discussion”. Recalling that day, she says she listened to the R&D team explain the fundamentals for more than two hours before she was finally able to calculate the thermal output of all the new servers required for the data center. After that, she continued working with the R&D team to repeatedly study, re-plan and re-verify various server configurations, in order to present the customer with the most suitable customized design for their data center.

Direct Liquid Cooling technology involves a dense configuration of liquid cooling pipes installed within the server chassis, assisting in heat dissipation via liquid circulation. You can think of it as working much like the water heater in your house, which turns cold water into hot water as it flows through the pipeline; here, the heat comes from the server itself. Cold liquid is pumped into the server chassis via these cooling pipes, which run a circular route inside the server, allowing the liquid to absorb the heat emitted by the CPU, GPU and memory and become hot in the process. The hot liquid then flows back out of the server to a regulator at the top of the server rack, where it is cooled back down and re-used. Through the joint efforts of GIGABYTE’s technical team, the temperature of the hot liquid exiting the server was ultimately kept at 58°C, successfully meeting the aerospace center’s requirement of staying under 60°C.
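
The physics behind keeping the return liquid under 60°C is a simple heat balance: the coolant’s temperature rise equals the heat it absorbs divided by its mass flow rate times its specific heat (Q = ṁ · c_p · ΔT). The sketch below uses assumed heat loads, flow rates and temperatures, not figures from this project, to show why tuning the flow rate is one way to hold the outlet temperature below the 60°C limit.

```python
# Minimal sketch of the coolant heat balance Q = m_dot * c_p * (T_out - T_in).
# All heat loads, flow rates and temperatures are illustrative assumptions,
# not measurements from the aerospace center's data center.

WATER_CP_J_PER_KG_K = 4186.0     # specific heat of a water-based coolant
COOLANT_DENSITY_KG_PER_L = 1.0   # assumed roughly water-like density

def outlet_temperature_c(heat_load_w: float, flow_l_per_min: float, inlet_c: float) -> float:
    """Estimate the coolant outlet temperature for a given heat load."""
    mass_flow_kg_s = flow_l_per_min * COOLANT_DENSITY_KG_PER_L / 60.0
    delta_t = heat_load_w / (mass_flow_kg_s * WATER_CP_J_PER_KG_K)
    return inlet_c + delta_t

# Assume ~2.8 kW of CPU/GPU/memory heat per node and coolant entering at 40°C,
# the room's ambient temperature. A higher flow rate keeps the outlet cooler.
for flow in (1.5, 3.0):   # liters per minute
    t_out = outlet_temperature_c(heat_load_w=2800.0, flow_l_per_min=flow, inlet_c=40.0)
    print(f"{flow:.1f} L/min -> outlet {t_out:.1f}°C")
# 1.5 L/min -> outlet 66.8°C (over the limit); 3.0 L/min -> outlet 53.4°C (within spec)
```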
Efficiency Reaches New Heights Now That More Scientists Can Work Simultaneously
The GIGABYTE server product team strived to optimize the data center’s new server configuration in order to improve the customer’s working efficiency. Each R281-Z94, the server that handles user connections and data management, was equipped with two brand-new RTX5000 graphics cards, each of which can support up to twice as many simultaneous user connections as a typical graphics card. As a result, each management server can now allow up to 64 aerospace scientists to connect to the computing cluster at the same time to perform calculations, saving a large amount of waiting time.
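
As a back-of-the-envelope check on those numbers, the sketch below assumes 32 sessions per RTX5000 card (inferred from the article’s two-cards, 64-users figure, not from a published specification) and a baseline of 16 sessions per typical card.

```python
# Hypothetical capacity arithmetic for one R281-Z94 management server.
# Per-card session counts are assumptions inferred from the article's
# "two cards, up to 64 users, twice a typical card" figures.
cards_per_server = 2
sessions_per_rtx5000 = 32        # assumed: twice a typical card
sessions_per_typical_card = 16   # assumed baseline

print("Sessions per server (RTX5000):", cards_per_server * sessions_per_rtx5000)       # 64
print("Sessions per server (typical):", cards_per_server * sessions_per_typical_card)  # 32
```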
GIGABYTE is Committed to Helping the Aerospace Center Save Energy
Even more importantly, in order to help the customer reduce operating expenditure, GIGABYTE’s server product team optimized the servers’ power supplies, achieving a power demand reduction of 15% compared with competing products. The lower energy consumption allows for substantial electricity bill savings, bringing even more benefits to the customer. As Andie Yen emphasizes: “We often tell our customers that you don’t just want to look at how much money our proposal is saving you on the books, but how much long term expenditure such as electricity and water fees will also be saved later by using GIGABYTE’s server products”.
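
To illustrate how a 15% reduction in power demand compounds into operating savings, the sketch below multiplies an assumed fleet power draw by an assumed electricity tariff and facility overhead; none of these inputs are figures from this project.

```python
# Illustrative estimate of annual electricity savings from a 15% lower power
# demand. Fleet power, tariff and facility overhead (PUE) are assumptions
# chosen for illustration, not data from the aerospace center.

HOURS_PER_YEAR = 24 * 365

def annual_cost_eur(it_power_kw: float, facility_pue: float, eur_per_kwh: float) -> float:
    """Yearly electricity cost for a given IT load, facility overhead and tariff."""
    return it_power_kw * facility_pue * HOURS_PER_YEAR * eur_per_kwh

baseline_kw = 100.0                  # assumed total server power draw
optimized_kw = baseline_kw * 0.85    # 15% reduction in power demand
tariff = 0.20                        # assumed EUR per kWh
facility_pue = 1.2                   # assumed cooling/power-delivery overhead

saving = (annual_cost_eur(baseline_kw, facility_pue, tariff)
          - annual_cost_eur(optimized_kw, facility_pue, tariff))
print(f"Estimated annual saving: EUR {saving:,.0f}")   # roughly EUR 31,500
```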

As more and more countries promote energy-conservation and carbon-reduction policies, data centers, which traditionally consume huge amounts of electricity, are bound to be designed to be more environmentally friendly and energy efficient in the future. In this project, GIGABYTE’s server team not only fulfilled the customer’s requirement for maximum computing capability in a limited space, but also resolved their main concerns about equipment temperature tolerance and energy conservation by delivering excellent results for heat dissipation and power consumption, winning more opportunities for future cooperation while also doing our best to protect the environment.

《Additional Information: GIGABYTE Can Provide Many Different Solutions for Your Data Center》

