CVE-2025-71008
BaseFortify
Publication date: 2026-01-29
Last updated on: 2026-02-03
Assigner: MITRE
Description
Meta Information
Affected Vendors & Products
| Vendor | Product | Version / Range |
|---|---|---|
| oneflow | oneflow | 0.9.0 |
Helpful Resources
Exploitability
| CWE ID | Description |
|---|---|
| NVD-CWE-noinfo | Insufficient Information |
AI Powered Q&A
Can you explain this vulnerability to me?
This vulnerability is a segmentation fault in the OneFlow deep learning framework, specifically in the method `oneflow._oneflow_internal.autograd.Function.FunctionCtx.mark_non_differentiable`. When this method is called on a tensor outside the context of a custom autograd function, it performs unsafe memory access, leading to a segmentation fault and a program crash (core dump). In short, invoking the function outside its intended context triggers the fault. [1]
How can this vulnerability impact me?
This vulnerability can cause a Denial of Service (DoS) by crashing an application that uses OneFlow when the faulty function is called with crafted input. An attacker who can trigger the segmentation fault could make the application stop functioning, leading to service interruptions. [1]
How can this vulnerability be detected on my network or system? Can you suggest some commands?
This vulnerability can be detected by running a test that calls the vulnerable function outside its intended context and checking whether it causes a segmentation fault. For example, executing the following Python snippet in an environment with OneFlow v0.9.0 installed reproduces the issue and crashes with a core dump:

```python
import oneflow as flow

ctx = flow._oneflow_internal.autograd.Function.FunctionCtx()
ctx.mark_non_differentiable(flow.ones((1, 1)))
```

A segmentation fault or core dump when running this snippet indicates the presence of the vulnerability. [1]
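Because the probe itself crashes the interpreter, it is safer to run it in a throwaway child process and inspect the exit status. The harness below is an illustrative sketch (not part of the advisory): on POSIX systems, `subprocess` reports a child killed by a signal as a negative return code, so `-SIGSEGV` indicates the crash was reproduced.

```python
import signal
import subprocess
import sys

# PoC from the advisory; reproduces the crash on OneFlow v0.9.0.
POC = """
import oneflow as flow
ctx = flow._oneflow_internal.autograd.Function.FunctionCtx()
ctx.mark_non_differentiable(flow.ones((1, 1)))
"""

def killed_by_segfault(returncode: int) -> bool:
    """On POSIX, a child killed by signal N exits with return code -N."""
    return returncode == -signal.SIGSEGV

def probe(python: str = sys.executable) -> bool:
    """Run the PoC in an isolated subprocess; True means the crash reproduced."""
    result = subprocess.run([python, "-c", POC], capture_output=True)
    return killed_by_segfault(result.returncode)
```

Calling `probe()` on a host with OneFlow v0.9.0 installed should return `True`; on a patched or OneFlow-free host the PoC exits normally (or with an `ImportError`) and `probe()` returns `False`.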
What immediate steps should I take to mitigate this vulnerability?
Immediate mitigation involves avoiding calls to `mark_non_differentiable` outside the context of a custom autograd function in OneFlow. Ensure that any use of `mark_non_differentiable` occurs strictly within a properly defined custom autograd function, which prevents the unsafe memory access and segmentation fault. Additionally, update OneFlow to a fixed version once one is available. [1]
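As a quick inventory step, affected hosts can be flagged by checking the installed OneFlow version against the 0.9.0 release listed in the table above. This is a minimal sketch: it matches only that release (earlier versions are not listed in this advisory), and the `importlib.metadata` lookup assumes OneFlow was installed as a normal pip package.

```python
from importlib import metadata

def parse_version(v: str) -> tuple:
    """Turn '0.9.0' into (0, 9, 0); local suffixes like '+cu117' are dropped."""
    return tuple(int(p) for p in v.split("+")[0].split(".") if p.isdigit())

# The only release listed as affected in this advisory.
AFFECTED = parse_version("0.9.0")

def is_affected(installed: str) -> bool:
    """True if the installed OneFlow version matches the affected release."""
    return parse_version(installed) == AFFECTED

def check_host() -> str:
    """Report whether this host has the affected OneFlow release installed."""
    try:
        v = metadata.version("oneflow")
    except metadata.PackageNotFoundError:
        return "oneflow not installed"
    status = "AFFECTED" if is_affected(v) else "not the affected release"
    return f"oneflow {v}: {status}"
```

Running `check_host()` on each machine (or wiring it into a configuration-management audit) gives a simple yes/no signal for where the mitigation above needs to be applied.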