🧀 BigCheese.ai


Serving AI from the Basement – 192GB of VRAM Setup


Ahmad Osman shares his experience building a powerful AI server in his basement, equipped with eight RTX 3090 GPUs totaling 192GB of VRAM to run Meta's Llama 3.1 405B. The server pairs the GPUs with a high-end motherboard, a server-class CPU, and substantial system memory. Future blog posts will cover assembly challenges, PCIe technology, benchmarks, and AI model training.

  • 192GB VRAM from 8 RTX 3090 GPUs.
  • AMD EPYC 7713 (Milan) CPU onboard.
  • 512GB DDR4-3200 3DS RDIMM memory.
  • Meta's Llama 3.1 405B AI model.
  • 112GB/s NVLink data transfer rate.
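To see why 192GB of VRAM is the headline number for a 405B-parameter model, a back-of-the-envelope weight-footprint calculation helps. This is a rough sketch under stated assumptions (405e9 parameters, weights only, ignoring KV cache, activations, and framework overhead); the quantization levels shown are illustrative, not the author's actual configuration.

```python
# Rough sketch: can 192 GB of VRAM hold Llama 3.1 405B's weights?
# Assumes 405e9 parameters and counts weights only -- KV cache,
# activations, and runtime overhead would add to every figure below.

PARAMS = 405e9
VRAM_GB = 192  # 8 x RTX 3090 at 24 GB each

def weights_gb(bits_per_param: float) -> float:
    """Approximate weight footprint in GB at a given quantization width."""
    return PARAMS * bits_per_param / 8 / 1e9

for label, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4), ("3-bit", 3)]:
    need = weights_gb(bits)
    verdict = "fits" if need <= VRAM_GB else "does not fit"
    print(f"{label:>5}: ~{need:.0f} GB -> {verdict} in {VRAM_GB} GB")
```

The arithmetic shows why a build like this still depends on aggressive quantization: even at 4 bits per weight the model needs roughly 203GB, slightly over the 192GB available, so sub-4-bit quantization or partial CPU offload is required to keep the full model resident.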