Elon Musk’s xAI Builds World’s Largest AI Supercomputer ‘Colossus’ With NVIDIA in 122 Days Featuring 100,000 NVIDIA Hopper GPUs, Will Use It To Train Grok AI Chatbot

Elon Musk's xAI has built 'Colossus', the world's largest AI supercomputer, featuring 100,000 NVIDIA Hopper GPUs. xAI will use Colossus to train its Grok AI chatbot.

NVIDIA, xAI Logos (Photo Credits: X/@Nvidia)

Elon Musk's xAI has unveiled the world's largest AI supercomputer, called "Colossus", in Memphis in collaboration with NVIDIA. The new AI supercomputer was built in 122 days and comprises 100,000 NVIDIA Hopper GPUs. xAI built the system with the NVIDIA Spectrum-X Ethernet networking platform for its Remote Direct Memory Access (RDMA) network; the platform is designed to deliver superior performance to multi-tenant, hyperscale AI factories using standards-based Ethernet. xAI will use Colossus to train its Grok family of large language models, which are currently offered as a feature for X Premium subscribers. NVIDIA said, "xAI is in the process of doubling the size of Colossus to a combined total of 200,000 NVIDIA Hopper GPUs." Elon Musk Says Neuralink Should Prioritise Making Implant To Eliminate Back and Neck Pain To Greatly Improve Happiness of People.

NVIDIA, Elon Musk's xAI Announce World's Largest AI Supercomputer 'Colossus' to Train Grok AI

(SocialLY brings you all the latest breaking news, viral trends and information from the social media world, including Twitter (X), Instagram and YouTube. The above post is embedded directly from the user's social media account, and LatestLY Staff may not have modified or edited the content body. The views and facts appearing in the social media post do not reflect the opinions of LatestLY, and LatestLY does not assume any responsibility or liability for the same.)
