GPT-OSS 120B

GPT-OSS 120B is OpenAI's flagship open-weight model, built on a Mixture-of-Experts (MoE) architecture with roughly 117 billion total parameters, about 5.1 billion of which are active per token, routed across 128 experts per MoE layer.
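
The weights are published on Hugging Face under the `openai/gpt-oss-120b` model id. Below is a minimal sketch of loading it with the Transformers `pipeline` API; it assumes a recent `transformers` install and enough GPU memory to host the MoE layers (the exact hardware setup and generation parameters here are illustrative, not prescriptive).

```python
# Minimal sketch: running gpt-oss-120b via Hugging Face Transformers.
# Assumes sufficient GPU memory; device_map="auto" shards the model
# across whatever accelerators are visible.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-120b",
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # spread layers across available devices
)

messages = [
    {"role": "user", "content": "Explain Mixture-of-Experts routing in one paragraph."}
]

# For chat-style input the pipeline returns the continued conversation.
output = generator(messages, max_new_tokens=200)
print(output[0]["generated_text"])
```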
