AIME GX8-B200 HGX Server

The AIME GX8-B200 is an HGX deep learning server based on the Gigabyte G893-ZD1-AAX5 barebone. It hosts eight of the most advanced NVIDIA deep learning accelerators: the NVIDIA B200 SXM.

The AIME GX8-B200 is the ultimate deep learning accelerator and AI inference server. The AIME GX8-B200 delivers:

  • 8x NVIDIA B200 SXM with 180 GB HBM3e memory each, for a total of 1,440 GB (1.44 TB) of GPU memory (see the arithmetic sketch below)

  • Dual AMD EPYC Turin CPU performance with up to 2x 192 CPU cores, for a total of 384 cores and 768 CPU threads

  • 1.8 TB/s NVLink switch connectivity between all accelerators

  • up to 3,072 GB (3 TB) of DDR5 memory, 12-channel, with 24 populated RAM slots

  • up to 8x native NVMe SSDs with up to 210 TB of SSD storage

  • 128 GB/s PCIe 5.0 bus speed between CPU memory and GPUs

  • up to 8x 400 Gbit/s OSFP network connectivity for cluster computing, with a total network throughput of 400 GB/s

Built to run 24/7 for highly reliable high-performance cluster computing, whether in your in-house data center, at a co-location facility, or as a hosted solution.
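
The headline totals above follow directly from the per-unit figures. As a minimal sketch (plain Python, with the constants copied from the feature list rather than read from the hardware), the arithmetic can be retraced like this:

    # Retrace the headline totals from the per-unit figures quoted above.
    # All constants are copied from the feature list; nothing is queried live.
    GPU_COUNT = 8
    GPU_MEM_GB = 180            # HBM3e per B200 SXM module
    CPUS = 2
    CORES_PER_CPU = 192         # top EPYC Turin option (EPYC 9965)
    NIC_PORTS = 8
    NIC_GBIT_PER_PORT = 400     # per OSFP port

    total_gpu_mem_gb = GPU_COUNT * GPU_MEM_GB               # 1440 GB ~ 1.44 TB
    total_cores = CPUS * CORES_PER_CPU                      # 384 cores
    total_threads = 2 * total_cores                         # 768 threads with SMT
    total_net_gb_per_s = NIC_PORTS * NIC_GBIT_PER_PORT / 8  # 400 GB/s aggregate

    print(total_gpu_mem_gb, total_cores, total_threads, total_net_gb_per_s)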

Product images: front, front (open), top, back, and side views of the AIME GX8-B200 HGX Server.
AIME GX8-B200 - Configuration

No. of GPUs: 1x
GPU: NVIDIA B200 SXM 180 GB
CPU: 2x EPYC 9555 (64 cores, Turin, 3.20 / 4.40 GHz)
Memory: 1536 GB (24x 64 GB) DDR5 ECC 6400 MHz
1st SSD: 1.92 TB 2.5" U.2 NVMe PCIe 5.0 (Kioxia CD8P-R)
2nd SSD: None
RAID: None
Network: 1x 400 GbE OSFP (ConnectX-7)
Quote Type: Buy

Technical Details AIME GX8-B200

Type: Rack Server 8U, 100 cm depth
CPU (configurable): EPYC Turin
  2x EPYC 9555 (64 cores, 3.20 / 4.40 GHz)
  2x EPYC 9655 (96 cores, 2.60 / 4.50 GHz)
  2x EPYC 9755 (128 cores, 2.70 / 4.10 GHz)
  2x EPYC 9965 (192 cores, 2.25 / 3.70 GHz)
RAM: 1536 / 2304 / 3072 GB DDR5 ECC memory
GPU Options: 8x NVIDIA B200 SXM 180 GB
Cooling:
  Motherboard & CPU fans: 2x 60x60x56 mm (24,600/24,200 rpm), 4x 60x60x76 mm (21,800/20,500 rpm)
  PCIe slot fans: 4x 80x80x56 mm (15,500/14,900 rpm)
  GPU fans: 15x 80x80x80 mm (19,600/15,600 rpm)
Storage: Up to 8x 30.72 TB U.3 NVMe PCIe 5.0 SSD
  Triple Level Cell (TLC) quality
  10,000 MB/s read, 4,900 MB/s write
  MTBF of 2,500,000 hours and 5-year manufacturer's warranty with 1 DWPD
Network: 2x IPMI LAN
  2x 10 GbE LAN RJ45 (onboard)
  up to 8x 400 GbE OSFP
USB: 2x USB 3.2 Gen 1 Type-A (front)
PSU: 6+6x 3000 W redundant power supplies
  80 PLUS Titanium certified (96% efficiency)
Noise Level: 95 dBA
Dimensions (WxHxD): 447 mm x 351 mm (8U) x 923 mm (17.6" x 13.82" x 36.3")
Operating Environment:
  Operating: 10°C to 30°C, 8% to 80% relative humidity (non-condensing)
  Non-operating: -40°C to 60°C, 20% to 95% relative humidity (non-condensing)
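
As a hedged sanity-check sketch (assuming a CUDA-enabled PyTorch installation on the machine; this is an illustration, not part of the AIME software stack), the accelerators visible to the framework and their memory can be enumerated like this:

    import torch

    # List the GPUs visible to PyTorch and their memory capacity.
    # On a fully equipped GX8-B200 this should report 8 devices with
    # roughly 180 GB HBM3e each (the exact figure depends on the driver).
    if not torch.cuda.is_available():
        raise SystemExit("No CUDA devices visible - check the driver installation.")

    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GiB")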

We would be happy to advise you personally: +49 30 459 54 380 (Manuel Glende, AIME Sales Team).
  • Ships in 30-60 days
  • 3 years warranty included
  • Runs 24/7 for years without overheating