Prime Intellect has launched INTELLECT-3, a 106-billion-parameter Mixture-of-Experts (MoE) model, open-sourcing its complete training pipeline and in-house RL technology stack. The release aims to broaden research and development in large-scale reinforcement learning by making advanced model post-training capabilities widely available.