CVE-2026-42294
Memory Exhaustion in Argo Workflows Webhook Interceptor
Publication date: 2026-05-09
Last updated on: 2026-05-09
Assigner: GitHub, Inc.
Description
Meta Information
Affected Vendors & Products
| Vendor | Product | Version / Range |
|---|---|---|
| argoproj | argo_workflows | up to (excluding) 4.0.5 |
Exploitability
| CWE ID | Description |
|---|---|
| CWE-770 | The product allocates a reusable resource or group of resources on behalf of an actor without imposing any intended restrictions on the size or number of resources that can be allocated. |
AI Powered Q&A
Can you explain this vulnerability to me?
The vulnerability in Argo Workflows involves the Webhook Interceptor on the /api/v1/events/ endpoint loading the entire request body into memory before authenticating the request or verifying its signature.
Because this endpoint is publicly accessible, an attacker can send a request with an extremely large body (such as multiple gigabytes), causing the server to allocate excessive memory.
This can lead to an Out-Of-Memory (OOM) crash and denial of service (DoS) for the Argo Server.
The issue was fixed by limiting the size of webhook request bodies to 2MB, rejecting requests that exceed this limit.
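The limiting pattern described above can be sketched as follows. This is a minimal illustration of the idea, not Argo's actual code; the function name and error handling are assumptions.

```python
import io

MAX_WEBHOOK_BODY = 2 * 1024 * 1024  # the 2MB cap introduced by the fix


def read_capped(stream, limit=MAX_WEBHOOK_BODY):
    """Read at most `limit` bytes from a request body stream.

    Mirrors the patched behaviour: the body is bounded *before*
    signature verification ever buffers it in full, so an oversized
    request is rejected instead of exhausting memory.
    """
    # Read one byte past the limit so we can tell "exactly at the
    # limit" apart from "over the limit".
    data = stream.read(limit + 1)
    if len(data) > limit:
        # In the real server this path would return an HTTP 403.
        raise ValueError("request body exceeds 2MB limit")
    return data
```

For example, `read_capped(io.BytesIO(b"a" * 100))` returns the body unchanged, while a stream larger than 2MB raises before the full payload is retained.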
How can this vulnerability impact me?
This vulnerability can be exploited to cause a denial of service (DoS) by sending very large webhook requests to the Argo Server.
The server may consume excessive memory and crash due to Out-Of-Memory conditions, disrupting workflow orchestration and availability.
This can lead to downtime and loss of service for applications relying on Argo Workflows.
How can this vulnerability be detected on my network or system? Can you suggest some commands?
This vulnerability involves the Argo Workflows Webhook Interceptor accepting excessively large request bodies on the /api/v1/events/ endpoint, which can lead to memory exhaustion and denial of service.
To detect potential exploitation attempts on your network or system, you can monitor incoming HTTP requests to the /api/v1/events/ endpoint for unusually large payloads exceeding normal webhook sizes (e.g., larger than 2MB).
Suggested commands include using network monitoring or logging tools to filter and identify large POST requests to this endpoint. For example:
- Using tcpdump to capture HTTP POST traffic to the Argo Server for offline inspection: tcpdump -i <interface> -A -s 0 'tcp port 80'. Note that the IP total-length field is 16 bits (max 65535), so a per-packet size filter cannot detect a multi-gigabyte body spread across many packets; instead, inspect the Content-Length header of captured requests, or review reverse-proxy access logs for requests to /api/v1/events/ with bodies over 2MB.
- Using curl or similar tools to test the endpoint with large payloads (for controlled testing): curl -X POST --data-binary @largefile.json http://<argo-server>/api/v1/events/
Additionally, reviewing Argo Server logs for errors or crashes related to memory exhaustion or OOM events can help detect exploitation attempts.
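The log-review approach above can be sketched as a small filter you might drop into a log-processing pipeline. The function name, field names, and threshold handling here are assumptions, not part of any Argo tooling.

```python
MAX_BODY = 2 * 1024 * 1024  # the 2MB cap applied by the patched versions


def is_suspicious(method: str, path: str, content_length: int,
                  limit: int = MAX_BODY) -> bool:
    """Flag POST requests to the events endpoint whose declared
    Content-Length exceeds the patched 2MB cap.

    Intended for scanning parsed access-log records (e.g. from a
    reverse proxy in front of the Argo Server).
    """
    return (method == "POST"
            and path.startswith("/api/v1/events/")
            and content_length > limit)
```

For example, a 3MB POST to /api/v1/events/my-ns/my-hook would be flagged, while ordinary webhook deliveries of a few kilobytes would not.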
What immediate steps should I take to mitigate this vulnerability?
The primary mitigation is to upgrade Argo Workflows to version 3.7.14, 4.0.5, or later, where the vulnerability has been patched.
The patch limits the size of webhook request bodies to 2MB, rejecting requests that exceed this limit with a 403 status code, preventing excessive memory allocation.
If immediate upgrade is not possible, consider implementing network-level protections such as:
- Configuring a reverse proxy or firewall to limit the size of incoming requests to the /api/v1/events/ endpoint.
- Rate limiting or blocking suspicious IP addresses sending large payloads.
Monitoring system memory usage and setting alerts for unusual spikes can also help detect and respond to attacks early.
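The reverse-proxy option above can be sketched for nginx, assuming nginx already fronts the Argo Server; the upstream host name and port are illustrative (2746 is Argo Server's default port).

```nginx
# Cap webhook bodies at the proxy, mirroring the patched 2MB limit.
location /api/v1/events/ {
    client_max_body_size 2m;            # nginx returns 413 for larger bodies
    proxy_pass http://argo-server:2746; # illustrative upstream name/port
}
```

This stops oversized payloads before they ever reach the Argo Server, though it is a stopgap rather than a substitute for upgrading.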
How does this vulnerability affect compliance with common standards and regulations (like GDPR, HIPAA)?
The vulnerability in Argo Workflows allows an attacker to cause a denial-of-service (DoS) by exhausting server memory through oversized webhook payloads. While this primarily impacts availability, it does not directly disclose or compromise sensitive data.
However, denial-of-service incidents can affect compliance with standards like GDPR and HIPAA, which require ensuring availability and resilience of systems processing personal or protected health information. An attacker exploiting this vulnerability could disrupt service availability, potentially violating these regulations' requirements for maintaining system uptime and reliability.
Therefore, organizations using affected versions of Argo Workflows should apply the patch to mitigate the risk and maintain compliance with such standards.