Olmo 3.1 32B Think
A large-scale, 32-billion-parameter model designed for deep reasoning, complex multi-step logic, and advanced instruction following.
Thinking Mode
Parameters
32B
Context
65,536 tokens
Leaderboards
Average score across domain-specific Autobench benchmarks; higher is better.
Performance vs. Industry Average
Context Window
Olmo 3.1 32B Think has a context window of 65,536 tokens, smaller than the industry average of roughly 406k tokens.
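A 65,536-token context window means long prompts need to be checked before submission. The sketch below is a minimal, hedged illustration of such a check: the 4-characters-per-token ratio is a rough heuristic (not this model's actual tokenizer), and the reserved output budget is an arbitrary example value.

```python
# Rough pre-flight check that a prompt fits in a 65,536-token context
# window. CHARS_PER_TOKEN is a common heuristic, not an exact count --
# in practice, use the model's own tokenizer for precise numbers.
CONTEXT_WINDOW = 65_536
CHARS_PER_TOKEN = 4  # heuristic assumption, varies by text and tokenizer


def estimate_tokens(text: str) -> int:
    """Estimate token count from character length (heuristic only)."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def fits_in_context(prompt: str, reserved_for_output: int = 4_096) -> bool:
    """True if the estimated prompt leaves room for the model's reply."""
    return estimate_tokens(prompt) + reserved_for_output <= CONTEXT_WINDOW


print(fits_in_context("Summarize this paragraph."))  # short prompt fits
```

In a real application, the heuristic estimate would be replaced by a call to the model's tokenizer, and the reserved output budget tuned to the expected response length.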