AMD Open-Sources 1B OLMo Language Models

AMD has introduced AMD OLMo, its first series of 1 billion parameter language models, released fully open source for community collaboration. The models are trained on AMD Instinct MI250 GPUs using 1.3 trillion tokens and demonstrate strong language understanding and reasoning.

  • AMD released the AMD OLMo language model series.
  • OLMo is trained on Instinct MI250 GPUs.
  • The models consist of 1 billion parameters.
  • The training used 1.3 trillion tokens.
  • AMD open-sourced OLMo's training details.
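Because the weights are openly released, they can be loaded with standard open-source tooling. Below is a minimal sketch using the Hugging Face Transformers library; the repository identifier `amd/AMD-OLMo-1B` is an assumption and should be checked against AMD's published model cards.

```python
# Minimal sketch: loading an AMD OLMo checkpoint with Hugging Face Transformers.
# The repo id "amd/AMD-OLMo-1B" is an assumed Hugging Face Hub identifier;
# replace it with the actual model card name if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "amd/AMD-OLMo-1B"  # assumed identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "AMD OLMo is a 1 billion parameter language model that"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation and print the decoded text.
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```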