Elon Musk says Colossus 2 is training 7 models simultaneously, the largest at 10 trillion parameters.

According to 1M AI News monitoring, SpaceX and xAI founder Elon Musk posted on X that the SpaceXAI supercomputing cluster Colossus 2 is currently training seven models simultaneously:

  1. Imagine V2, the next-generation version of the image and video generation model
  2. Two variants of a 1-trillion-parameter model
  3. Two variants of a 1.5-trillion-parameter model
  4. A 6-trillion-parameter model
  5. A 10-trillion-parameter model

He added the note, “Still some catching up to do.” Previously, multiple media outlets reported that xAI’s next-generation flagship model, Grok 5, has a parameter count of about 6 trillion, which matches the 6-trillion-parameter model in the list. The 10-trillion-parameter model had not been publicly reported before.

In the post, Musk referred to the combined entity of SpaceX and xAI as “SpaceXAI.” The two companies completed their merger in February this year, at a combined valuation of about $1.25 trillion. Colossus 2 went live on January 17 as the world’s first gigawatt-class AI training cluster, and it is planned to be upgraded to 1.5 GW this month.
