AIME GX8-B200 HGX Server

The AIME GX8-B200 is an HGX deep learning server based on the Gigabyte G893-ZD1-AAX5 barebone. It hosts eight of NVIDIA's most advanced deep learning accelerators: the NVIDIA B200 SXM.

The AIME GX8-B200 is a top-tier deep learning training and AI inference server. The AIME GX8-B200 delivers:

  • Dual AMD EPYC Turin CPU performance with up to 2x 192 CPU cores, for a total of 768 CPU threads

  • 8x NVIDIA B200 SXM with 180 GB HBM3e memory each, for a total of 1,440 GB (1.44 TB) of GPU memory

  • 900 GB/s NVLink switch connectivity between all accelerators

  • up to 3072 GB (3 TB) of DDR5 memory in a 12-channel configuration with 24 populated RAM slots

  • up to 8x native NVMe SSDs with up to 210 TB of SSD storage

  • 128 GB/s PCIe 5.0 bus speed between CPU memory and GPUs

  • up to 8x 400 Gbit/s OSFP network ports for cluster computing, for a total network throughput of 400 GB/s

Built to run 24/7 for highly reliable high-performance cluster computing, whether in your in-house data center, at a co-location facility, or as a hosted solution.
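As a sanity check, the aggregate figures in the bullets above follow directly from the per-component values. A minimal arithmetic sketch (constants taken from the spec list; no other assumptions):

```python
# Aggregate spec figures for the AIME GX8-B200, derived from the
# per-component values quoted above.

GPUS = 8
GPU_MEM_GB = 180        # HBM3e per B200 SXM
CPUS = 2
CORES_PER_CPU = 192     # EPYC Turin, top configuration
NET_PORTS = 8
PORT_GBIT = 400         # per OSFP port

total_gpu_mem_gb = GPUS * GPU_MEM_GB            # 8 x 180 GB = 1440 GB
total_threads = CPUS * CORES_PER_CPU * 2        # 2 threads/core (SMT) = 768
total_net_gbytes = NET_PORTS * PORT_GBIT / 8    # Gbit/s -> GB/s = 400.0

print(total_gpu_mem_gb, total_threads, total_net_gbytes)
```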

[Product images: front, front open, top, back, and side views of the AIME GX8-B200 HGX Server]
AIME GX8-B200 - Configuration

No. of GPUs: 1x
GPU:         NVIDIA B200 SXM 180 GB
CPU:         2x EPYC 9555 (64 cores, Turin, 3.20 / 4.40 GHz)
Memory:      1536 GB (24x 64 GB) DDR5 ECC 6400 MHz
SSD 1:       1.92 TB 2.5" U.2 NVMe PCIe 5.0 (Kioxia CD8P-R)
SSD 2:       None
RAID:        None
Network:     1x 400 GbE OSFP (ConnectX-7)

We would be happy to advise you personally.
+49 30 459 54 380
Manuel Glende
AIME Sales Team
  • 3 years warranty included
  • Runs 24/7 for years without overheating