CVE-2026-7482
Status: Awaiting Analysis (NVD queue)
Heap Out-of-Bounds Read in Ollama

Publication date: 2026-05-04

Last updated on: 2026-05-04

Assigner: abd028dc-c042-4c4d-9749-38d0f850af89

Description
Ollama before 0.17.1 contains a heap out-of-bounds read vulnerability in the GGUF model loader. The /api/create endpoint accepts an attacker-supplied GGUF file in which the declared tensor offset and size exceed the file's actual length; during quantization in fs/ggml/gguf.go and server/quantization.go (WriteTo()), the server reads past the allocated heap buffer. The leaked memory contents may include environment variables, API keys, system prompts, and concurrent users' conversation data, and can be exfiltrated by uploading the resulting model artifact through the /api/push endpoint to an attacker-controlled registry. The /api/create and /api/push endpoints have no authentication in the upstream distribution. Default deployments bind to 127.0.0.1, but the documented OLLAMA_HOST=0.0.0.0 configuration is widely used in practice (large public-internet exposure observed).
Meta Information
Generated: 2026-05-07
AI Q&A: 2026-05-05
EPSS Evaluated: 2026-05-05
Source: NVD
Affected Vendors & Products
Showing 1 associated CPE
Vendor: ollama | Product: ollama | Version / Range: up to (excluding) 0.17.1
CWE
CWE-125 (Out-of-bounds Read): The product reads data past the end, or before the beginning, of the intended buffer.
AI-Powered Q&A
Can you explain this vulnerability to me?

This vulnerability exists in Ollama versions before 0.17.1 and involves a heap out-of-bounds read in the GGUF model loader. Specifically, the /api/create endpoint accepts a crafted GGUF file where the declared tensor offset and size exceed the actual file length. During quantization, the server reads beyond the allocated heap buffer, causing it to leak memory contents.

The leaked memory may include sensitive information such as environment variables, API keys, system prompts, and conversation data from concurrent users. An attacker can exfiltrate this leaked data by uploading the resulting model artifact through the /api/push endpoint to a registry they control.

Notably, the /api/create and /api/push endpoints lack authentication in the upstream distribution. Although default deployments bind only to localhost (127.0.0.1), the documented OLLAMA_HOST=0.0.0.0 configuration is widely used in practice, and large numbers of exposed instances have been observed on the public internet.
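The missing check at the root of this class of bug can be sketched as follows. This is a minimal illustration in Go, not Ollama's actual code: the `tensorInfo` struct and `validateTensorBounds` function are hypothetical names, standing in for the tensor offset/size metadata a GGUF header declares and the bounds validation the loader needs before reading tensor data.

```go
package main

import "fmt"

// tensorInfo is a hypothetical stand-in for the per-tensor metadata a GGUF
// header declares. Field names are illustrative, not Ollama's actual types.
type tensorInfo struct {
	Name   string
	Offset uint64 // declared byte offset of tensor data within the file
	Size   uint64 // declared byte length of the tensor
}

// validateTensorBounds rejects any tensor whose declared [Offset, Offset+Size)
// range falls outside the actual file length -- the kind of check whose
// absence lets a crafted header drive reads past the allocated buffer.
// The Offset+Size sum is also checked for uint64 overflow.
func validateTensorBounds(fileLen uint64, tensors []tensorInfo) error {
	for _, t := range tensors {
		end := t.Offset + t.Size
		if end < t.Offset { // uint64 overflow
			return fmt.Errorf("tensor %q: offset+size overflows", t.Name)
		}
		if end > fileLen {
			return fmt.Errorf("tensor %q: declared range [%d, %d) exceeds file length %d",
				t.Name, t.Offset, end, fileLen)
		}
	}
	return nil
}

func main() {
	// A 1 KiB file whose header claims a tensor extending past 1 MiB:
	// exactly the malformed input the description above outlines.
	bad := []tensorInfo{{Name: "blk.0.attn_q.weight", Offset: 512, Size: 1 << 20}}
	fmt.Println(validateTensorBounds(1024, bad))
}
```

The key point is that the declared metadata is attacker-controlled, so every offset/size pair must be validated against the real file length before any read is issued against it.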


How can this vulnerability impact me?

This vulnerability can lead to unauthorized disclosure of sensitive information stored in memory, including environment variables, API keys, system prompts, and other users' conversation data.

An attacker can exploit this flaw remotely without authentication by sending a malicious GGUF file to the /api/create endpoint and then exfiltrating the leaked data via the /api/push endpoint.

If your deployment uses the OLLAMA_HOST=0.0.0.0 setting, the service may be exposed to the public internet, increasing the risk of exploitation.

Overall, this can result in data breaches, loss of confidentiality, and potential compromise of your system's security.


How does this vulnerability affect compliance with common standards and regulations (like GDPR, HIPAA)?

This vulnerability allows an attacker to exfiltrate sensitive information such as environment variables, API keys, system prompts, and concurrent users' conversation data by exploiting a heap out-of-bounds read in the GGUF model loader. Since the leaked data may include personal or sensitive information, this could lead to non-compliance with data protection regulations like GDPR and HIPAA, which require the protection of personal and sensitive data against unauthorized access and disclosure.

Furthermore, the lack of authentication on the /api/create and /api/push endpoints and the common practice of exposing the service to the public internet increase the risk of unauthorized data access, exacerbating compliance risks.

