ANT PC XpertStation WS300

Powered by the NVIDIA GB300 Grace Blackwell Ultra Superchip and 748GB of coherent memory, the XpertStation WS300 delivers data-center AI performance at your desk.

Now accepting early interest.

Be among the first to experience next-gen AI performance.

Register Now

What is the XpertStation WS300?

This isn’t just a workstation—it’s a deskside AI powerhouse built to replace your dependence on cloud and data centers. Powered by the NVIDIA GB300 Grace Blackwell Ultra Superchip, the XpertStation WS300 delivers extreme AI compute, massive unified memory, and ultra-fast networking—all in a compact form factor designed for serious AI workloads.

With 748GB of coherent memory and dual 400GbE connectivity, you’re no longer limited by infrastructure. Train larger models, process bigger datasets, and deploy faster—all from your desk.

At its core, a 72-core NVIDIA Grace CPU works seamlessly with the Blackwell Ultra GPU in a package rated for up to 1,400W. This tightly integrated architecture eliminates bottlenecks and unlocks true high-efficiency AI acceleration.

And the memory? A staggering combination of 496GB LPDDR5X + 252GB HBM3e—giving you cluster-level memory capacity in a single system. What typically requires racks of servers now fits under your desk.
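As a rough illustration of what that capacity means in practice, the sketch below checks which model sizes could fit in 748GB of coherent memory. The byte-per-parameter and overhead figures are illustrative assumptions, not vendor benchmarks:

```python
# Back-of-the-envelope: which models fit in 748GB of coherent memory?
# Assumes 2 bytes/parameter (FP16/BF16 weights) plus ~20% overhead for
# activations and KV cache -- illustrative figures, not measured numbers.

COHERENT_MEMORY_GB = 496 + 252  # LPDDR5X + HBM3e = 748GB total

def fits(params_billion: float, bytes_per_param: int = 2,
         overhead: float = 1.2) -> bool:
    """Return True if a model of the given size fits in coherent memory."""
    required_gb = params_billion * bytes_per_param * overhead
    return required_gb <= COHERENT_MEMORY_GB

for size in (8, 70, 200, 405):
    print(f"{size}B params: {'fits' if fits(size) else 'does not fit'}")
```

Under these assumptions, even a 200B-parameter model fits in memory without sharding across servers.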

What Sets XpertStation WS300 Apart

Compute-Dense Architecture

The NVIDIA GB300 Superchip unites the Grace CPU and Blackwell Ultra GPU via NVLink C2C for high-density, efficient computing.

High-Speed Networking

NVIDIA ConnectX-8 SuperNIC delivers up to 800Gb/s of low-latency connectivity for scalable AI workloads.

Large Coherent Memory

748GB unified memory (HBM3e + LPDDR5X) via C2C enables faster data access and efficient AI training.

End-to-End AI Software Support

NVIDIA AI Software Stack enables seamless AI development—from fine-tuning to deployment, desktop to data center.

XpertStation WS300

  • NVIDIA Blackwell Ultra GPU
  • CPU memory: up to 496GB LPDDR5X | up to 396GB/s
  • GPU memory: up to 252GB HBM3e | 7.1TB/s
  • 10GbE RJ45 port (Marvell AQC113)
  • 400GbE QSFP ports (NVIDIA ConnectX®-8)
  • 2280 PCIe 5.0 x2 NVMe M.2 ports (from CPU)
  • 2280 PCIe 6.0 x4 NVMe M.2 ports (for training data)
  • 1600W 80 PLUS Titanium ATX PSU

What industries benefit most?

Accelerating Every AI Workload

Built on the NVIDIA DGX Station™ architecture, the XpertStation WS300 brings data-center-class AI performance to the desktop, enabling model development, data science, and autonomous AI agents with the NVIDIA AI software stack.

AI Development

Accelerate deep learning and machine learning training for AI applications ranging from predictive maintenance and medical imaging analysis to natural language processing.

Data Science

Accelerate end-to-end data science workflows to enable faster data ingestion, analysis, and insight generation across massive datasets.

AI Inference

Accelerate local inference for large and complex AI models, delivering high-speed performance for LLM token generation, data analysis, content creation, and AI chatbots.
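For a sense of scale, a common rule of thumb for single-stream LLM decoding is that it is memory-bandwidth bound: each generated token streams the full weights from GPU memory. The sketch below turns the spec-sheet 7.1TB/s HBM3e bandwidth into a rough upper bound on token rate; it is an illustrative estimate, not a measured benchmark:

```python
# Rough upper bound on single-stream LLM decode speed: each generated token
# must stream all weights from memory, so tokens/s <= bandwidth / weight size.
# Real throughput depends on batching, KV cache, quantization, and kernels.

HBM3E_BANDWIDTH_TBPS = 7.1  # GPU memory bandwidth from the spec sheet

def max_decode_tokens_per_s(params_billion: float,
                            bytes_per_param: int = 2) -> float:
    """Bandwidth-bound ceiling on tokens/s for a dense model in FP16/BF16."""
    weight_bytes_tb = params_billion * 1e9 * bytes_per_param / 1e12
    return HBM3E_BANDWIDTH_TBPS / weight_bytes_tb

print(f"70B FP16 upper bound: ~{max_decode_tokens_per_s(70):.0f} tokens/s")
```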

Personal Cloud

Run advanced AI models on local data and serve as a centralized compute node for team-based fine-tuning and on-demand deployment.

Real-world impact:

  • Train larger models without memory fragmentation
  • Faster inference pipelines
  • Reduced latency in data-heavy workflows
  • Ultra-fast data transfer
  • Cluster connectivity
  • Distributed AI workloads
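The data-transfer and clustering claims above come down to simple link arithmetic. The sketch below estimates how long a dataset takes to move between nodes over a single 400GbE port, assuming ~90% effective link efficiency (an illustrative figure that ignores storage and protocol overheads):

```python
# Transfer time over one 400GbE port: time = size / effective throughput.
# The 90% efficiency factor is an assumption, not a measured value.

def transfer_seconds(dataset_gb: float, link_gbps: float = 400,
                     efficiency: float = 0.9) -> float:
    """Seconds to move dataset_gb gigabytes over a link of link_gbps Gb/s."""
    return (dataset_gb * 8) / (link_gbps * efficiency)

for size_gb in (100, 1000):
    print(f"{size_gb}GB over 400GbE: ~{transfer_seconds(size_gb):.1f}s")
```

At these rates, a 1TB training set moves between nodes in well under a minute, which is what makes deskside clustering practical.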

Talk to an AI Infrastructure Expert Today

Register Now


FAQs

  • Is WS300 better than RTX 6000 Ada workstations?

    Yes, for AI workloads. It offers significantly higher memory capacity and compute throughput.

  • Can it replace a data center?

    Partially. It can handle many workloads locally but not hyperscale operations.

  • Is it suitable for video editing?

    No. It is optimized for AI, not media workflows.

  • How many users can share it?

    Yes. Multiple users can access it over the network or through virtualization.

  • Does it support clustering?

    Yes, via 400GbE networking.

  • Is it future-proof?

    Highly—thanks to PCIe Gen6 and Blackwell architecture.

  • What industries benefit most?

    AI startups, research labs, healthcare, finance, defense, and tech enterprises.