Describe the issue
When running inference on an ONNX model with ONNX Runtime, I observe inconsistent results between sessions, even when using the same input data. The inconsistency appears only when graph optimizations are enabled (at certain optimization levels); when optimization is disabled (ORT_DISABLE_ALL), the results are consistent.
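A minimal sketch of how one might observe the reported behavior (not the reporter's actual script): it runs the same fixed input through two freshly created sessions at each optimization level and compares the outputs. The model path `model.onnx` and the input shape `(1, 3, 224, 224)` are placeholder assumptions; substitute the affected model and its real input shape.

```python
import numpy as np
import onnxruntime as ort

def run_once(opt_level):
    # Fresh session per call so any session-to-session variation shows up.
    so = ort.SessionOptions()
    so.graph_optimization_level = opt_level
    sess = ort.InferenceSession("model.onnx", sess_options=so,  # assumed model path
                                providers=["CUDAExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    # Fixed seed so every session sees exactly the same input data.
    x = np.random.default_rng(0).standard_normal((1, 3, 224, 224)).astype(np.float32)  # assumed shape
    return sess.run(None, {input_name: x})[0]

# Two sessions with optimizations enabled: outputs reportedly differ.
a = run_once(ort.GraphOptimizationLevel.ORT_ENABLE_ALL)
b = run_once(ort.GraphOptimizationLevel.ORT_ENABLE_ALL)
print("max abs diff (ORT_ENABLE_ALL):", np.abs(a - b).max())

# Two sessions with optimizations disabled: outputs reportedly match.
c = run_once(ort.GraphOptimizationLevel.ORT_DISABLE_ALL)
d = run_once(ort.GraphOptimizationLevel.ORT_DISABLE_ALL)
print("max abs diff (ORT_DISABLE_ALL):", np.abs(c - d).max())
```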
To reproduce
Urgency
No response
Platform
Linux
OS Version
Ubuntu 20.04
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
5c1b7cc
ONNX Runtime API
Python
Architecture
X64
Execution Provider
CUDA
Execution Provider Library Version
No response