Giga Computing Collaborates with Industry Leaders to Advance AI & HPC Performance at Supercomputing 2025
Nov 18, 2025

November 18, 2025 - Giga Computing, a subsidiary of GIGABYTE and a recognized leader in AI hardware innovation and advanced cooling technologies, is showcasing its high-performance servers and rack solutions optimized for AI and HPC workloads at Supercomputing 2025 (SC25). As North America’s premier event for high-performance computing, SC25 brings together researchers, industry experts, and government organizations to explore the technologies transforming science, industry, and society. At the GIGABYTE booth (#1117), attendees can experience the full spectrum of computing innovations, from compact AI PCs to large-scale infrastructure systems, while exploring how Giga Computing and its partners are facilitating breakthroughs through advanced hardware design and collaborative innovation.

GIGABYTE GIGAPOD: rack-scale infrastructure

At the GIGABYTE booth, attendees can discover two complete liquid-cooled server racks of the kind being deployed as part of GIGAPOD. GIGABYTE empowers technology leaders with supercomputing infrastructure built on high-performance GPU servers accelerated by NVIDIA HGX B300 or AMD Instinct™ MI350 Series GPUs. As a one-stop solution for AI-driven data centers, GIGAPOD provides the hardware, expertise, and partnerships to deploy large-scale deep learning systems efficiently and with minimal downtime. Partnerships with key global leaders in the liquid cooling industry have led to successful GIGAPOD deployments, and these collaborations are being discussed at the GIGABYTE booth.

AMD-based Solutions:

  • G4L3-ZX1: compact, compute-dense, liquid-cooled AI server for AMD EPYC™ 9005 Series CPUs and AMD Instinct MI350 Series GPUs.
  • G893-ZX1: air-cooled AI server for AMD EPYC 9005 CPUs, AMD Instinct MI350 Series GPUs, and AMD Pensando™ Pollara 400 AI NICs, delivering an energy-efficient and scalable AI-ready platform.
  • B353-C60: liquid-cooled server with 20 nodes using AMD EPYC 4005 CPUs in a 3U chassis.
  • W793-ZU0: liquid-cooled workstation that pairs the EPYC platform with four AMD Radeon™ AI PRO R9700 or AMD Instinct™ MI210 GPUs.
  • R283-ZK0: air-cooled server with industry-leading memory density, offering 48 DIMMs on a single node with dual AMD EPYC 9005 CPUs.
  • B343-C40: 3U 10-node air-cooled system using AMD EPYC 4005 series processors.

Intel-based Solutions:

  • G494-SB4: 4U air-cooled GPU server with dual Intel® Xeon® 6700 Series CPUs, validated for Intel® Gaudi® 3 AI Accelerator PCIe cards, able to run large-scale AI workloads with a low total cost of ownership.
  • G894-SD3: air-cooled AI server for Intel Xeon 6700/6500 Series CPUs and NVIDIA HGX™ B300.
  • R284-A91: Xeon 6 server with sixteen E3.S Gen5 bays for CXL® memory expansion modules.
  • B343-X40: 3U 10-node air-cooled system using Intel® Xeon® 6300 series processors.

NVIDIA-based Solutions:

Ampere-based Solutions:

  • R1A3-T40: 1U server for AmpereOne® M processors. Optimized for AI inference and small language models using a CPU with up to 192 Arm-based cores.

Join us at SC25 (Booth #1117) to experience the future of intelligent computing firsthand. Connect with our experts, explore a live demonstration of AI TOP ATOM, and discover how Giga Computing and GIGABYTE are shaping the next era of performance, efficiency, and innovation in data center technology.

For queries or more information, please contact sales.
