Resolved
The issue affecting meta-llama/llama-4-scout-17b-16e-instruct has been fully resolved. The model is operating normally. We apologize for the disruption and thank you for your patience.
Monitoring
We’re continuing to monitor the scout model to ensure stability persists. If all remains normal, we will resolve the incident in the next update.
Monitoring
We have implemented a fix for meta-llama/llama-4-scout-17b-16e-instruct and performance is improving. We’re now monitoring the model to ensure stability. If all remains normal, we will resolve the incident in the next update.
Identified
We have identified the problem affecting meta-llama/llama-4-scout-17b-16e-instruct. A fix is in progress; users may still see delayed responses and errors from Scout until it completes.
Investigating
We are currently investigating reports of an issue with meta-llama/llama-4-scout-17b-16e-instruct. Users may be experiencing elevated error rates and/or slow responses. Other models remain operational. Our team is analyzing logs and infrastructure to identify the cause.