CVE-2026-21720
Goroutine Leak in Grafana /avatar/ Handler Causes Memory Exhaustion
Publication date: 2026-01-27
Last updated on: 2026-02-17
Assigner: Grafana Labs
Description
Every uncached /avatar/:hash request in Grafana spawns a goroutine to refresh the Gravatar image. If the refresh task waits longer than three seconds in the 10-slot worker queue, the handler times out and stops listening, leaving the goroutine permanently blocked on an unbuffered channel send. Sustained traffic with random hashes therefore accumulates blocked goroutines, exhausting memory and potentially crashing Grafana.
Meta Information
Affected Vendors & Products
| Vendor | Product | Version / Range |
|---|---|---|
| grafana | grafana | From 3.0.0 (inc) to 11.6.9 (exc) |
| grafana | grafana | From 12.0.0 (inc) to 12.0.8 (exc) |
| grafana | grafana | From 12.1.0 (inc) to 12.1.5 (exc) |
| grafana | grafana | From 12.2.0 (inc) to 12.2.3 (exc) |
| grafana | grafana | 12.3.0 |
Exploitability
| CWE ID | Description |
|---|---|
| CWE-UNKNOWN | |
AI Powered Q&A
How can this vulnerability impact me?
The vulnerability leads to memory exhaustion as blocked goroutines accumulate, which can eventually crash Grafana on affected systems. The result is a denial of service that disrupts the monitoring and visualization services Grafana provides.
Can you explain this vulnerability to me?
This vulnerability occurs because every uncached /avatar/:hash request in Grafana spawns a goroutine to refresh the Gravatar image. If the refresh task waits longer than three seconds in a 10-slot worker queue, the handler times out and stops listening for the result. However, the goroutine remains blocked forever trying to send on an unbuffered channel, causing the number of goroutines to grow linearly with sustained traffic. This eventually exhausts memory and can cause Grafana to crash on some systems.
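The leak pattern is easy to reproduce in isolation. Below is a minimal, self-contained Go sketch of the same shape (not Grafana's actual code, and the names and timings are illustrative): a producer goroutine sends its result on an unbuffered channel, while the handler abandons the receive after a three-second timeout. Every timed-out request strands one goroutine forever.

```go
package main

import (
	"fmt"
	"runtime"
	"time"
)

// fetchAvatar stands in for the Gravatar refresh task. It returns an
// unbuffered channel, so the send below blocks until someone receives.
func fetchAvatar(hash string) <-chan string {
	ch := make(chan string)
	go func() {
		time.Sleep(5 * time.Second)      // simulate waiting in a full worker queue
		ch <- "avatar-bytes-for-" + hash // blocks forever if the handler gave up
	}()
	return ch
}

// handleAvatar mimics the vulnerable handler: it waits at most three
// seconds for the result, then times out and stops listening.
func handleAvatar(hash string) {
	select {
	case avatar := <-fetchAvatar(hash):
		fmt.Println("served", avatar)
	case <-time.After(3 * time.Second):
		fmt.Println("timed out on", hash) // the producer goroutine is now stranded
	}
}

func main() {
	for i := 0; i < 100; i++ {
		go handleAvatar(fmt.Sprintf("hash-%d", i))
	}
	time.Sleep(6 * time.Second)
	// Roughly one stranded goroutine per timed-out request, plus main.
	fmt.Println("goroutines still alive:", runtime.NumGoroutine())
}
```

The usual fix for this pattern is to give the channel a buffer of one (make(chan string, 1)) so the producer's send succeeds even after the consumer has given up, letting the goroutine exit normally.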
How can this vulnerability be detected on my network or system? Can you suggest some commands?
You can detect this issue by monitoring the goroutine count of the Grafana process: a count that climbs steadily under traffic and never falls back indicates the leak. Grafana crashes or steadily rising memory usage under sustained random /avatar/:hash traffic are secondary signs. On the host, 'ps' or 'top' can track the Grafana process's memory footprint, and 'curl' can generate test /avatar/:hash requests to observe the behavior; for a direct goroutine count, see the sketch below.
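If your Grafana instance exposes its Prometheus /metrics endpoint (typically enabled by default), the standard Go collector's go_goroutines gauge gives a direct reading. The sketch below polls it every ten seconds; the URL, port, and interval are assumptions to adjust for your install. A one-liner such as 'curl -s http://localhost:3000/metrics | grep ^go_goroutines' yields the same number.

```go
package main

import (
	"bufio"
	"fmt"
	"net/http"
	"strings"
	"time"
)

func main() {
	// Adjust for your install; the endpoint may also require authentication.
	const metricsURL = "http://localhost:3000/metrics"
	for {
		resp, err := http.Get(metricsURL)
		if err != nil {
			fmt.Println("fetch failed:", err)
		} else {
			scanner := bufio.NewScanner(resp.Body)
			for scanner.Scan() {
				// Metric lines start with the name; comment lines start with '#'.
				if line := scanner.Text(); strings.HasPrefix(line, "go_goroutines") {
					fmt.Println(time.Now().Format(time.RFC3339), line)
				}
			}
			resp.Body.Close()
		}
		time.Sleep(10 * time.Second)
	}
}
```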
What immediate steps should I take to mitigate this vulnerability?
The definitive remediation is to upgrade: the affected ranges above imply patched releases 11.6.9, 12.0.8, 12.1.5, and 12.2.3, plus a release after 12.3.0. Until you can upgrade, reduce or block traffic that generates uncached /avatar/:hash requests with random hashes, for example by rate-limiting or filtering the /avatar/ path at a reverse proxy (see the sketch below). Restarting the Grafana service temporarily reclaims the leaked goroutines' memory but does not prevent recurrence.
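Network-level filtering can be expressed in any reverse proxy; purely as an illustration, here is a minimal stdlib-only Go proxy that sits in front of Grafana and rate-limits the /avatar/ path per client IP with a crude fixed one-minute window. The backend address, listen port, and limit of 10 requests per minute are placeholder values, not recommendations.

```go
package main

import (
	"log"
	"net"
	"net/http"
	"net/http/httputil"
	"net/url"
	"strings"
	"sync"
	"time"
)

const avatarLimit = 10 // max /avatar/ requests per client IP per minute (placeholder)

type limiter struct {
	mu     sync.Mutex
	counts map[string]int
}

// allow counts a request and reports whether the client is under the limit.
func (l *limiter) allow(ip string) bool {
	l.mu.Lock()
	defer l.mu.Unlock()
	l.counts[ip]++
	return l.counts[ip] <= avatarLimit
}

// resetLoop clears all counters once a minute (a fixed-window limiter).
func (l *limiter) resetLoop() {
	for range time.Tick(time.Minute) {
		l.mu.Lock()
		l.counts = map[string]int{}
		l.mu.Unlock()
	}
}

func main() {
	backend, err := url.Parse("http://127.0.0.1:3000") // the Grafana instance
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(backend)
	lim := &limiter{counts: map[string]int{}}
	go lim.resetLoop()

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if strings.HasPrefix(r.URL.Path, "/avatar/") {
			ip, _, err := net.SplitHostPort(r.RemoteAddr)
			if err != nil {
				ip = r.RemoteAddr
			}
			if !lim.allow(ip) {
				http.Error(w, "avatar rate limit exceeded", http.StatusTooManyRequests)
				return
			}
		}
		proxy.ServeHTTP(w, r)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

In production you would express the same rule in your existing proxy or CDN (nginx, HAProxy, a WAF rule) rather than adding an extra hop.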