Kimi K2 Instruct (via Groq)

A 1T-parameter Mixture-of-Experts (MoE) model with 32B activated parameters, served with fast inference on Groq

Model Highlights

Exceptional Performance

State-of-the-art results on coding (SWE-bench Verified: 65.8%), math (AIME 2024: 69.6%), and tool-use benchmarks

Agentic Intelligence

Designed for tool use, reasoning, and autonomous problem-solving (see the tool-calling sketch after these highlights)

MuonClip Optimizer

Trained on 15.5T tokens with zero training instability
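The tool-use claim above can be exercised programmatically. The sketch below is a minimal, hedged illustration of passing a tool definition through Groq's OpenAI-compatible chat API from Python; the model ID "moonshotai/kimi-k2-instruct" and the get_weather tool are assumptions for illustration, not part of the official demo.

```python
# Minimal tool-calling sketch against Groq's OpenAI-compatible API.
# Assumptions: the `groq` package is installed, GROQ_API_KEY is set, and the
# model ID below matches Groq's current listing for Kimi K2 Instruct.
import json

from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, for illustration only
            "description": "Return the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="moonshotai/kimi-k2-instruct",  # assumed model ID on Groq
    messages=[{"role": "user", "content": "What's the weather in Tokyo right now?"}],
    tools=tools,
    tool_choice="auto",
)

# When the model chooses to call the tool, the arguments arrive as a JSON string.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, json.loads(call.function.arguments))
```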

Chat with Kimi K2

Experience the power of a 1T-parameter model with state-of-the-art capabilities. Start a conversation and try asking about coding, math, or complex reasoning tasks.
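If you would rather call the model from code than from the chat box, the snippet below is a minimal sketch using the Groq Python SDK. The model ID is an assumption and should be checked against Groq's model list; an API key is expected in the GROQ_API_KEY environment variable.

```python
# Minimal chat sketch: ask Kimi K2 (served by Groq) a math question.
# Assumptions: `pip install groq`, GROQ_API_KEY set, model ID as listed by Groq.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

response = client.chat.completions.create(
    model="moonshotai/kimi-k2-instruct",  # assumed model ID on Groq
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the sum of the first 100 positive integers?"},
    ],
    temperature=0.6,
    max_tokens=1024,
)

print(response.choices[0].message.content)
```

The same call works for coding or reasoning prompts; only the user message changes.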

Try these examples: