Lowell Observatory Looks for Habitable Exoplanets with GIGABYTE Servers

Arizona’s Lowell Observatory is studying the Sun with GIGABYTE’s G482-Z50 GPU Server in an effort to filter out “stellar noise” when looking for habitable planets outside of our Solar System. The server’s AMD EPYC™ processors, parallel computing capabilities, excellent scalability, and industry-leading stability are all features that qualify it for this astronomical task, making the discovery of a true “Twin Earth” achievable within our lifetime.
Lowell Observatory in Flagstaff, Arizona, seeks to answer one of humankind’s oldest existential questions: are we alone in the universe?

To discover the truth, they search for potentially habitable extrasolar planets, also known as exoplanets—planets outside of our Solar System that may support life.

First, a bit of background. Established in 1894, Lowell Observatory is among the oldest observatories in the United States, and a National Historic Landmark. Over the years, it has participated in numerous scientific breakthroughs, the most noteworthy of which was the discovery of Pluto in 1930. TIME Magazine has called it one of “The World's 100 Most Important Places”.

Lowell Observatory and Yale University have teamed up to spearhead the “100 Earths Project”. It is a mission to look for exoplanets with characteristics similar to Earth, known as “Earth analogs” or “Twin Earths”. Two criteria must be met: first, the exoplanet needs to be located in the “circumstellar habitable zone” (CHZ), which means it is at a suitable distance from its parent star for the planetary surface to support liquid water, a prerequisite for life. Second, the parent star needs to be similar to our Sun in age, size, and temperature—a so-called “solar analog”. It is hypothesized that intelligent extraterrestrial life may be possible on such an exoplanet.

The search is far from easy. The glow of an exoplanet is so faint compared to its star that it is akin to looking for a firefly buzzing around a lighthouse—from light-years away. One method of discovery is transit photometry, which looks for the minuscule dimming of the star as it is eclipsed by an orbiting exoplanet. These are called “transit events”, and they reveal the planet’s volume. Another method is radial velocity, also known as Doppler spectroscopy. A satellite, even a relatively small one like Earth, causes the position and velocity of its parent star to shift (or “wobble”) ever so slightly as the two celestial bodies orbit their common center of mass. By measuring the Doppler shifts in a star’s spectrum, astronomers can detect the presence of exoplanets and calculate their mass. Since density equals mass divided by volume, scientists can deduce whether an exoplanet is an inhospitable ball of gas or a piece of rock floating through space, much like our precious blue planet.
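To illustrate that last step, here is a minimal Python sketch of how a mass (from radial velocity) and a radius (from transit photometry) combine into a bulk density. The numbers are textbook values for Earth and Jupiter, used purely as stand-ins—they are not data from the “100 Earths Project”.

```python
# Illustrative sketch: combining transit and radial-velocity measurements
# into an exoplanet's bulk density. Constants are textbook values, not
# actual project data.
import math

EARTH_MASS_KG = 5.972e24
EARTH_RADIUS_M = 6.371e6

def bulk_density(mass_kg: float, radius_m: float) -> float:
    """Density = mass / volume, with volume taken from the transit radius."""
    volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
    return mass_kg / volume_m3

# A rocky Earth analog comes out around 5,500 kg per cubic meter...
rocky = bulk_density(EARTH_MASS_KG, EARTH_RADIUS_M)

# ...while a Jupiter-like ball of gas is closer to 1,300 kg per cubic meter.
gas_giant = bulk_density(1.898e27, 6.9911e7)

print(round(rocky), round(gas_giant))
```

The wide gap between the two densities is what lets astronomers tell a rocky world from a gas giant using only these two indirect measurements.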

The biggest hurdle lies in the fact that a planet’s influence on its star is vanishingly small. Our own Earth causes the Sun to “wobble” with a radial velocity of just 10 centimeters per second over the course of a year. An extremely powerful optical spectrometer is needed to detect such a minute change. To this end, the Yale Exoplanet Lab has built a state-of-the-art high-resolution “EXtreme PREcision Spectrometer” (EXPRES), to be used in conjunction with the 4.3-meter Lowell Discovery Telescope (LDT).
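To put that precision requirement in perspective, a back-of-the-envelope calculation with the non-relativistic Doppler formula shows how tiny a wavelength shift a 10-centimeter-per-second wobble produces. The 550 nm reference wavelength below is an illustrative choice, not a figure from the source.

```python
# Back-of-the-envelope: the wavelength shift caused by a star wobbling
# at ~10 cm/s, using the non-relativistic Doppler formula.
C = 299_792_458.0  # speed of light, m/s

def doppler_shift(rest_wavelength_m: float, radial_velocity_ms: float) -> float:
    """Delta-lambda = lambda * v / c, valid for v much smaller than c."""
    return rest_wavelength_m * radial_velocity_ms / C

# A green line near 550 nm, shifted by a 0.1 m/s wobble:
shift_m = doppler_shift(550e-9, 0.1)
print(f"{shift_m:.3e} m")  # on the order of 1e-16 m, far below a picometer
```

A shift this small is orders of magnitude finer than the width of the spectral line itself, which is why an instrument of EXPRES’s caliber is required.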

Dr. Joe Llama, an astronomer and astrophysicist at Lowell, has anticipated another issue. When it comes down to it, a star is a sphere of hydrogen undergoing thermonuclear fusion. There are bound to be irregularities and fluctuations. These are known as “stellar noise”, and they could mask the already tiny signal of exoplanets. He devised a plan to study this superfluous noise, so he can prevent it from interfering with the EXPRES’s findings.
“Downloading” Data from the Sun to GIGABYTE’s GPU Server
Not far from the LDT, Dr. Llama set up its smaller sibling: the 70-millimeter Lowell Observatory Solar Telescope (LOST). He hooked it up to the EXPRES and GIGABYTE’s G482-Z50, a 4U G-Series GPU Server that supports up to ten GPUs, and got to work. In the search for life outside of our Solar System, Dr. Llama began by staring at the Sun.

Learn More:
More information about GIGABYTE's GPU Server
The 70-millimeter Lowell Observatory Solar Telescope is installed on the roof of an auxiliary building near its larger sibling, the 4.3-meter Lowell Discovery Telescope. An 80-meter optical fiber cable transmits the sunlight from the Solar Telescope to the EXPRES, which ultimately converts the light into useful Doppler data with the aid of the G482-Z50. (Images provided by Lowell Observatory)
While the LDT combs the night sky in search of exoplanets, the Solar Telescope studies the Sun in the daytime. “Everyone is lining up to use the EXPRES at night, but only I have exclusive access to the sun, all day,” Dr. Llama quips.

His plan is to create a spectrum of our yellow star as it passes through its eleven-year solar cycle. Dr. Llama believes there is rhyme and reason to the stellar noise, and that it will reveal the “common signature” of the stars in the sky. Omitting that universal noise will help astronomers pinpoint not only the location of a true “Twin Earth”, but also smaller objects, like the moons of exoplanets.

In layman’s terms, the process works like this: sunlight captured by the Solar Telescope is sent to the EXPRES via an 80-meter optical fiber cable. The EXPRES splits the light into its constituent color components, which are precisely captured by a powerful 10K x 10K charge-coupled device (CCD). Then, a reduction code is used to break the image down to a spectrum—a one-dimensional 10K image that reflects wavelength intensity. The spectrum is analyzed with a special computer program written by the “100 Earths Project” team and converted into Doppler data. Scientists hope these data points will shed light on the common signature of stars.
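The reduction step can be pictured with a toy sketch: collapse a 2D CCD frame into a 1D spectrum by summing counts along the spatial axis. This is only a minimal illustration of the idea—the real EXPRES pipeline (echelle order tracing, flat-fielding, wavelength calibration) is far more involved, and the function and frame below are hypothetical.

```python
# Toy illustration of a "reduction" step: collapse a 2D CCD frame into a
# 1D spectrum by summing counts along the spatial axis. The real EXPRES
# pipeline is far more sophisticated.
import numpy as np

def reduce_to_spectrum(ccd_frame: np.ndarray) -> np.ndarray:
    """Sum each wavelength column of the frame to get a 1D spectrum."""
    return ccd_frame.sum(axis=0)

# A tiny 4x6 "frame": rows are the spatial axis, columns are wavelength bins.
frame = np.ones((4, 6))
frame[:, 2] = 0.2  # a dark column, mimicking an absorption line

spectrum = reduce_to_spectrum(frame)
print(spectrum)  # the dip at index 2 marks the absorption feature
```

The dips in the resulting 1D spectrum correspond to the dark absorption bands visible in the Solar Telescope images, whose positions are what the Doppler analysis tracks.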

So long as the day is clear and sunny—which it is about three hundred days a year in Flagstaff—the EXPRES can churn out fresh findings every few minutes. The workload is nothing short of breathtaking: about 50GB of raw data is being “downloaded” from the Sun on a daily basis. At the same time, there are 10TB of accumulated data waiting to be analyzed. Both of these tasks must be carried out simultaneously.

Before getting in touch with GIGABYTE, the team at Lowell tried analyzing the data with a network-attached storage (NAS) server equipped with hard disk drives (HDDs), plus desktop computers. While serviceable, this setup was less than ideal, because a storage server is not designed for data-intensive computing. By Dr. Llama’s estimate, there would need to be thirty-six hours in a day for the NAS server to keep up with the constant influx of new readings.
Glossary: What is NAS?

What Dr. Llama really wanted was a server with top-of-the-line processing power, excellent parallel computing capabilities, faster read and write speeds, convenient scalability, and excellent stability. The very success of the “100 Earths Project” may depend on it. After all, there are other exoplanet hunters out there. Everyone wants to be the first to discover a true “exo-Earth”.

What is Parallel Computing?
What is Scalability?
Space Science Gets Ten Times Faster with GIGABYTE G482-Z50
It is fortunate, then, that the performance of the G482-Z50 is out of this world, pun very much intended. Exemplary of a new class of servers designed for data-intensive scientific research, the G482-Z50 is decked out with all the tools it needs to get the job done, including powerful processors, a scalable design, and smart safety functions to ensure system stability. It is an impressive creation to behold, even for a man who has a front row seat to the wonders of the universe.

Dr. Llama and his team set up the new server in Lowell Observatory’s data center. Now, every time the EXPRES sends back blown-up images of the Sun, the G482-Z50 does two things simultaneously: one, it runs a reduction code to extract useful spectra from the data; two, it uses a separate program to decode the spectra and record the changes in our Sun’s radial velocity, which may be the building blocks of the stars’ common signature.
Glossary: What is a Data Center?
A spectrum of the Sun from the Lowell Observatory Solar Telescope, color-coded to approximate the color of light seen by human eyes. The vertical dark bands are absorption features of molecules in the atmosphere of the Sun. Dr. Llama uses the position of these bands to measure the solar radial velocity. (Images provided by Lowell Observatory)
With the G482-Z50 doing most of the heavy lifting, the same computing tasks are being carried out in a quarter, or even a tenth, of the time. The search for exoplanets has received a shot in the arm. The research team could not be happier.

“The processing power provided by the G482-Z50 is not only great for the Solar Telescope, but for all the scientists at the Observatory,” says Dr. Llama. “Needless to say, we are very excited.”

Why is the new server just what the doctor ordered? Dr. Llama says there are three reasons why the G482-Z50 is proving itself instrumental in helping Lowell Observatory unlock the secrets of the galaxy:
1. Top-notch processing power suitable for parallel computing
2. Industry-leading scalability
3. Designed to ensure system stability
One: Top-notch processing power suitable for parallel computing
When it comes down to it, Dr. Llama’s work requires an intense amount of data processing and parallel computing. A hundred new data points are being gleaned from the Sun on a daily basis. This is in addition to the tens of thousands of data points already stored on the old NAS server. The G482-Z50 runs a reduction code to turn new images of the Sun into spectra. At the same time, it is executing a second program to convert the spectra into useful Doppler data. This is an astronomical task, in every sense of the word.

The G482-Z50 is able to shine due in part to its dual AMD EPYC™ 7002 processors, which can house up to 64 cores and 128 threads in a single CPU. What’s more, the G-Series GPU Servers can support an extremely dense configuration of GPU accelerators. The G482-Z50’s 4U chassis can fit up to ten GPGPU cards. The CPUs are connected to the GPUs via PCIe switches to minimize latency. The G482-Z50 also supports the latest PCIe Gen 4.0, which has a maximum bandwidth of 64GB/s and is twice as fast as PCIe Gen 3.0. These attributes make the G482-Z50 ideal for parallel processing, high performance computing (HPC), data analytics, cloud computing, and many other applications.

What is HPC?
What is PCIe?

In the case of Lowell Observatory, the G482-Z50 was outfitted with a pair of AMD EPYC™ 7502 processors, which contain 32 cores and 64 threads in each CPU. The maximum single-core frequency is 3.35 GHz. This arrangement is well suited to the program used to analyze the spectra, as it is a parallelized, scalable piece of code benefiting from recent advancements in artificial intelligence and machine learning. Pair that up with a server specializing in performing multiple calculations simultaneously, and it should be no surprise the boost to research progress has been tremendous.
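The parallel pattern described above—many independent spectra, one analysis routine, fanned out across CPU cores—can be sketched as follows. The analysis function here is a hypothetical stand-in for the project’s code; it merely locates the deepest absorption dip in each spectrum.

```python
# Sketch of the parallel pattern: independent spectra analyzed across
# CPU cores. analyze_spectrum is a stand-in, not the project's program.
from concurrent.futures import ProcessPoolExecutor

def analyze_spectrum(spectrum: list[float]) -> float:
    """Stand-in analysis: return the index of the deepest absorption dip."""
    return float(spectrum.index(min(spectrum)))

def analyze_batch(spectra: list[list[float]]) -> list[float]:
    # Each spectrum is independent, so throughput scales with core count.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(analyze_spectrum, spectra))

if __name__ == "__main__":
    batch = [[1.0, 0.2, 1.0], [1.0, 1.0, 0.3]]
    print(analyze_batch(batch))  # [1.0, 2.0]
```

Because each spectrum is processed independently, this kind of workload is exactly what a many-core CPU pair like the dual EPYC™ 7502 setup excels at.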

Learn More:
Glossary: What is Artificial Intelligence?
Glossary: What is Machine Learning?
The EPYC Rise of AMD’s New Server Processor
GIGABYTE Smashes 11 World Records with AMD EPYC™ Processor Systems
The G482-Z50 is a 4U G-Series GPU Server that packs a wallop of computing power and excels at parallel computing. With its excellent scalability and stability, it is the ideal solution to help Lowell Observatory find the common signature of stars.
Two: Industry-leading scalability
One thing that might keep the research team up at night (besides searching for exoplanets) is the question of scalability. As the EXPRES churns out more and more readings, it is imperative to consider whether there is enough storage space for all the accrued information, not to mention whether the G482-Z50’s processing power can keep up with the ever-increasing challenge.

Dr. Llama’s solution to the storage problem is to move the drives from the original NAS server to the G482-Z50, which has ample capacity. The NAS can still be used to store data if the drives fill up. This effectively splits the computing and storage tasks between the two servers. The G482-Z50 can focus on processing the raw data while the valuable findings are transferred to the NAS server for storage. The G482-Z50 is able to work more quickly thanks to its fast read and write speeds. This setup also gives the research team the option of adding more storage servers, if necessary.

As for processing power, the aforementioned ultra-dense configuration of up to ten PCIe GPGPU cards means more accelerators can be added at a moment's notice. This ensures the G482-Z50 can maintain top-notch performance even as data starts to pile up. Since the spectrum analysis program is scalable and can make use of as many cores as are available, Dr. Llama has taken the precaution of installing sixteen sticks of 64GB RAM inside the G482-Z50, for a total of 1TB RAM.
Three: Designed to ensure system stability
In the race to discover an “exo-Earth”, not a minute of computation time can be wasted. The research team has the G482-Z50 crunching numbers twenty-four seven. It goes without saying that system stability is very important. A malfunction will not only cause a delay in the research, it may lead to the loss of precious data.

Since system failure often stems from suboptimal heat dissipation, the G482-Z50 comes equipped with dynamic fan speed control as standard, as is the case with the majority of GIGABYTE’s air-cooled servers. The baseboard management controller (BMC) monitors the temperatures of key components and automatically adjusts the fan speed to keep everything cool while delivering superb power usage effectiveness (PUE).

Learn More:
What is BMC?
What is PUE?
GIGABYTE Tech Guide: How to Pick a Cooling Solution for Your Servers?
It’s (almost) always sunny in Flagstaff, Arizona. Around three hundred days a year, weather conditions allow the Lowell Observatory Solar Telescope to “download” up to 50GB of raw data from the Sun. About 10TB of accumulated data is being worked on at any given moment. (Images provided by Lowell Observatory)
In addition, GIGABYTE’s proprietary SCMP (Smart Crises Management/Protection) feature forces the CPU to enter ultra-low frequency mode (ULFM) if the BMC detects a dangerous fault or error, such as overheating or a power surge. This smart safety function prevents the system from shutting down. Once the issue has been resolved, the system automatically returns to normal power mode.

It should also be noted that the components used in GIGABYTE servers are carefully selected to guarantee a stable operating environment and deliver maximum performance. The G482-Z50, like other GIGABYTE servers based on AMD EPYC™ 7002 processors, is designed for easy upkeep, with multiple tool-less features for convenient installation and maintenance. That way, even if Lowell Observatory needs to pause the search for routine maintenance, the G482-Z50 will be back up and running in a jiffy.

“In astronomy, we often joke we are always many years behind the latest computer technology,” says Dr. Llama. “But with the G482-Z50, not only do we have access to the computing power of AMD EPYC™ processors, we can also add GPU accelerators to calculate data even faster. We are grateful to be working with GIGABYTE.”

The search for habitable exoplanets and intelligent extraterrestrial life may sound like science fiction to some, but it is a worthy pursuit of knowledge. GIGABYTE is glad to support the effort with the latest breakthroughs in computational technology and server solutions. The GIGABYTE motto is “Upgrade Your Life”; it is a commitment to using tomorrow’s technology to solve the problems we face today, such as discovering the answer to the age-old question of whether we are alone in the universe.

Learn More:
Japan’s Waseda University Predicts and Prevents Natural Disasters with GIGABYTE’s Computing Cluster
CERN and the Large Hadron Collider Look for Beauty Quarks with GIGABYTE’s GPU Servers