Llama 4 Scout
Llama 4 Scout is a 109B-parameter mixture-of-experts (MoE) model with 17B active parameters per token. It is designed for efficiency and visual reasoning, with a 328k-token context window.
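The "109B total, 17B active" figures come from the MoE design: only the experts the router selects for a given token contribute to that token's forward pass. The sketch below illustrates the arithmetic; the shared/expert parameter split and helper name are assumptions for illustration, not Llama 4 Scout's published configuration.

```python
# Toy illustration of why an MoE model activates only a fraction of its
# parameters per token. The split below is hypothetical, chosen so the
# numbers land near the 109B-total / 17B-active figures quoted above.

def moe_active_params(expert_params, num_experts, experts_per_token,
                      shared_params):
    """Parameters used for one token: the shared core (attention,
    embeddings, router) plus only the experts the router selects."""
    per_expert = expert_params / num_experts
    return shared_params + experts_per_token * per_expert

total = 109e9   # total parameters (from the spec above)
shared = 11e9   # assumed shared (non-expert) parameters
experts = 16    # assumed expert count
active = moe_active_params(total - shared, experts, 1, shared)
print(f"{active / 1e9:.0f}B active of {total / 1e9:.0f}B total")
```

With one routed expert per token, roughly 17B of the 109B parameters participate in each forward pass, which is why active-parameter count, not total size, drives per-token compute cost.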
Parameters
109B
Context
327,680 tokens
Released
April 5, 2025
Leaderboards
Average score across domain-specific Autobench scores; higher is better
Performance vs. Industry Average
Context Window
Llama 4 Scout's context window of 328k tokens is smaller than the industry average of 406k tokens.