
[Bug]: "Input names mismatch" error creating InferenceSession with OpenVINO EP, while CPU EP works #32845

Description

OpenVINO Version

2025.3.0

Operating System

Ubuntu 25.04

Device used for inference

GPU

Framework

ONNX

Model used

OpenVLA

Issue description

  1. System Information
  • OS & Version: Ubuntu 25.04
  • ONNX Runtime version: onnxruntime-openvino==1.23.0
  • Python version: 3.10.19
  • OpenVINO version: openvino==2025.03
  2. Description
    When attempting to create an onnxruntime.InferenceSession for a specific ONNX model, the initialization fails if the OpenVINOExecutionProvider is specified. The exact same model loads and runs correctly when using the default CPUExecutionProvider.

The error occurs during the session initialization phase and points to a mismatch in tensor names between the ONNX graph and the subgraph compiled by the OpenVINO backend.


Step-by-step reproduction

import onnxruntime as ort

model_path = "[path/to/your/model.onnx]"  # Specify the path to the problematic model

print(f"ONNX Runtime version: {ort.__version__}")
print(f"Available providers: {ort.get_available_providers()}")

# --- Test Case 1: CPUExecutionProvider (this works) ---
try:
    print("\nAttempting to load model with CPUExecutionProvider...")
    cpu_session = ort.InferenceSession(model_path, providers=['CPUExecutionProvider'])
    print("SUCCESS: Model loaded successfully with CPUExecutionProvider.")
except Exception as e:
    print(f"FAILURE: An unexpected error occurred with CPUExecutionProvider: {e}")

# --- Test Case 2: OpenVINOExecutionProvider (this fails) ---
try:
    print("\nAttempting to load model with OpenVINOExecutionProvider...")
    # Optional: session options can be configured if needed
    # session_options = ort.SessionOptions()

    openvino_session = ort.InferenceSession(
        model_path,
        providers=[('OpenVINOExecutionProvider', {'device_type': 'GPU.1'})]
    )
    print("SUCCESS: Model loaded successfully with OpenVINOExecutionProvider.")
except Exception:
    print("FAILURE: Caught expected exception with OpenVINOExecutionProvider.")
    # Re-raise the exception to show the full traceback
    raise
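
For additional context, the input names declared by the ONNX graph itself can be listed with the onnx package and compared against the tensor name reported in the error. This is a minimal diagnostic sketch, assuming the onnx package is installed and model_path points at the same model as above; the printed names come directly from the ONNX graph, not from the OpenVINO EP.

import onnx

# Load the model and print the graph-level input names so they can be
# compared against the tensor name mentioned in the OpenVINO-EP error
# (/fused_featurizer/blocks.25/attn/Sqrt_2_output_0).
model = onnx.load(model_path)
initializer_names = {init.name for init in model.graph.initializer}
for graph_input in model.graph.input:
    # Skip initializers that some exporters also list as graph inputs
    if graph_input.name not in initializer_names:
        print(graph_input.name)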

Relevant log output

2025-11-14 11:06:57.157833775 [E:onnxruntime:, inference_session.cc:2544 operator()] Exception during initialization: /onnxruntime/onnxruntime/core/providers/openvino/backend_manager.cc:185 onnxruntime::openvino_ep::BackendManager::BackendManager(onnxruntime::openvino_ep::SessionContext&, onnxruntime::openvino_ep::SharedContext&, const onnxruntime::Node&, const onnxruntime::GraphViewer&, const onnxruntime::logging::Logger&, onnxruntime::openvino_ep::EPCtxHandler&) /onnxruntime/onnxruntime/core/providers/openvino/backends/basic_backend.h:85 onnxruntime::openvino_ep::OnnxToOvNetworkBindings::OnnxToOvNetworkBindings(onnxruntime::openvino_ep::OVExeNetwork&, onnxruntime::openvino_ep::SubGraphContext&, onnxruntime::openvino_ep::SessionContext&)::<lambda(auto:48&, const onnxruntime::openvino_ep::SubGraphContext::string_index_map_t&, const auto:49&)> [with auto:48 = std::vector<onnxruntime::openvino_ep::ParameterInfo>; auto:49 = std::vector<ov::Output<const ov::Node> >; onnxruntime::openvino_ep::SubGraphContext::string_index_map_t = std::unordered_map<std::__cxx11::basic_string<char>, unsigned int>] matched_names was false. [OpenVINO-EP] Input names mismatch between OpenVINO and ONNX. /fused_featurizer/blocks.25/attn/Sqrt_2_output_0 doesn't exist in the list of OpenVINO input tensor names


Traceback (most recent call last):
  File "/home/pd/vla/openvla/openvla_onnx.py", line 348, in <module>
    onnx_action = run_onnx_inference(VISION_PATH, PROJECTOR_PATH, EMBEDDING_PATH, ONNX_PREFILL_PATH, ONNX_DECODER_PATH, inputs, config)
  File "/home/pd/vla/openvla/openvla_onnx.py", line 168, in run_onnx_inference
    onnx_generator = OnnxVLAGenerator(
  File "/home/pd/vla/openvla/openvla_onnx.py", line 18, in __init__
    self.ort_session_vision = ort.InferenceSession(vision_backbone_path, sess_options=session_options, providers=[provider])
  File "/home/pd/miniconda3/envs/openvla/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 473, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/pd/miniconda3/envs/openvla/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 572, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /onnxruntime/onnxruntime/core/providers/openvino/backend_manager.cc:185 onnxruntime::openvino_ep::BackendManager::BackendManager(onnxruntime::openvino_ep::SessionContext&, onnxruntime::openvino_ep::SharedContext&, const onnxruntime::Node&, const onnxruntime::GraphViewer&, const onnxruntime::logging::Logger&, onnxruntime::openvino_ep::EPCtxHandler&) /onnxruntime/onnxruntime/core/providers/openvino/backends/basic_backend.h:85 onnxruntime::openvino_ep::OnnxToOvNetworkBindings::OnnxToOvNetworkBindings(onnxruntime::openvino_ep::OVExeNetwork&, onnxruntime::openvino_ep::SubGraphContext&, onnxruntime::openvino_ep::SessionContext&)::<lambda(auto:48&, const onnxruntime::openvino_ep::SubGraphContext::string_index_map_t&, const auto:49&)> [with auto:48 = std::vector<onnxruntime::openvino_ep::ParameterInfo>; auto:49 = std::vector<ov::Output<const ov::Node> >; onnxruntime::openvino_ep::SubGraphContext::string_index_map_t = std::unordered_map<std::__cxx11::basic_string<char>, unsigned int>] matched_names was false. [OpenVINO-EP] Input names mismatch between OpenVINO and ONNX. /fused_featurizer/blocks.25/attn/Sqrt_2_output_0 doesn't exist in the list of OpenVINO input tensor names

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
