Supermicro

SYS-422GA-NRT-01-G2

DP Intel 4U Dual-Root PCIe GPU System with up to 8 GPUs

Manufacturer: Supermicro

The Supermicro Gold Series High-Density Enterprise AI Inference Server is designed for organizations operating AI inference at scale across shared, multi-tenant platforms. Built for PaaS environments and large enterprise AI services, this Gold Series server delivers high GPU density, massive memory bandwidth, and predictable performance under sustained, concurrent inference workloads.

Pre-configured and pre-validated by Supermicro, this server enables teams to deploy production AI infrastructure quickly, without the complexity and risk of custom hardware design, while supporting the throughput and reliability required by platform-grade AI services.

SYS-422GA-NRT-01-G2 Key Differentiators for AI Service Providers

- High-Density GPU Inference: Four NVIDIA RTX PRO 6000 Blackwell SE GPUs deliver exceptional inference density for shared and multi-tenant workloads

- Massive Memory Capacity: 1TB of DDR5 memory supports large model hosting, fast batching, and high request concurrency

- Pre-Configured & Pre-Validated: Delivered fully assembled and tested to reduce deployment risk and accelerate time to production

- Enterprise-Grade Power & Reliability: Redundant Titanium-level power supplies support continuous operation in always-on environments

- Platform-Ready Design: Built to integrate cleanly into distributed, scalable AI platforms

Availability
- Available stock: 0

- More in reserve stock: No (We may have stock in reserve for a project. Ask us, and we'll see if we can make it available for you.)

- Standard lead time: Contact us (Lead times are subject to change. Contact us for a current estimated lead time for backorders.)