Resolved
The issues affecting meta-llama/llama-4-scout-17b-16e-instruct have been resolved. The model is once again operating normally. We apologize for the disruption and thank you for your patience.
Monitoring
Performance for meta-llama/llama-4-scout-17b-16e-instruct is improving. We’re now monitoring the model to ensure stability. If all remains normal, we will resolve the incident in the next update.
Investigating
We are currently investigating reports of an issue with meta-llama/llama-4-scout-17b-16e-instruct. Users may be experiencing elevated error rates and/or slow responses. Other models remain operational. Our team is analyzing logs and infrastructure to identify the cause.