
Vector Assertion Failure in InferenceSession Init with Hotplugged-Off Cores on ARM (v1.21.0) #24221

Open
PapperYZ opened this issue Mar 27, 2025 · 2 comments

Comments

@PapperYZ

Describe the issue

Environment:

  • Device: OrangePi 5 Plus (4x A76 + 4x A55)
  • Config: Any cores disabled (e.g., echo 0 > /sys/devices/system/cpu/cpu[4-7]/online)
  • OS: Ubuntu 22.04
  • ONNX Runtime: v1.21.0
  • Python: 3.10
  • GCC: 14 (gcc-toolset-14)

Issue:

  • Crashes during InferenceSession init with
    /opt/rh/gcc-toolset-14/root/usr/include/c++/14/bits/stl_vector.h:1130: Assertion '__n < this->size()' failed.
  • Succeeds when all 8 cores are online (even if restricted to 4 via taskset -c 4-7) or on a 4x A53 system with no hotplugging.
  • Fails whenever any cores are hotplugged off (e.g., 4x A55 only or A76 + partial A55).

Observations:

  • os.cpu_count() correctly reports online cores (e.g., 4).
  • OMP_NUM_THREADS=1 does not fix it; the issue is in initialization, not runtime threading.
  • Suspect: Mishandles CPU topology when cores are hotplugged off (e.g., uses present instead of online).
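The suspected present-vs-online mismatch can be checked directly from sysfs. Below is a minimal sketch illustrating the hypothesis (the `parse_cpu_list` helper and the comparison are mine, not ONNX Runtime's actual code; the sysfs paths are standard Linux):

```python
# Sketch: compare the kernel's "present" and "online" CPU lists, which
# differ exactly when cores are hotplugged off. If ONNX Runtime sized an
# internal vector from "present" but indexed it by "online" (or vice
# versa), an out-of-range access like the asserted one would follow.
import os


def parse_cpu_list(text: str) -> set[int]:
    """Parse a sysfs CPU list such as '0-3,6' into a set of CPU ids."""
    cpus: set[int] = set()
    for part in text.strip().split(","):
        if not part:
            continue
        if "-" in part:
            lo, hi = part.split("-")
            cpus.update(range(int(lo), int(hi) + 1))
        else:
            cpus.add(int(part))
    return cpus


def read_cpu_list(path: str) -> set[int]:
    with open(path) as f:
        return parse_cpu_list(f.read())


if os.path.exists("/sys/devices/system/cpu/present"):
    present = read_cpu_list("/sys/devices/system/cpu/present")
    online = read_cpu_list("/sys/devices/system/cpu/online")
    print(f"present={sorted(present)} online={sorted(online)}")
    if present != online:
        print("present != online: some cores are hotplugged off")
```

On the failing OrangePi configuration this should print `present` as 0-7 but `online` as only 0-3.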

To reproduce

Disable four cores with the commands below:

echo 1 | sudo tee /sys/devices/system/cpu/cpu[0-3]/online
echo 0 | sudo tee /sys/devices/system/cpu/cpu[4-7]/online
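A sanity check I would suggest before running the repro (not part of the original steps): confirm the resulting topology so the crash can be tied to the present/online mismatch.

```shell
# Verify which CPUs the kernel now reports as online vs. present.
cat /sys/devices/system/cpu/online    # expected here: 0-3
cat /sys/devices/system/cpu/present   # expected here: 0-7
```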

and then run the simple Python script below:

import onnxruntime
model_path = "/path/to/resnet50_v1.onnx"
session = onnxruntime.InferenceSession(model_path, providers=["CPUExecutionProvider"])
print("Session initialized successfully")

You will see:

(screenshot of the libstdc++ assertion failure, matching the message quoted above)

Urgency

Please help resolve this soon; it should be a quick fix. It is currently blocking a benchmark comparison against our ASIC, which has similar cores.

Platform

Linux

OS Version

Ubuntu 22.04

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

ONNX Runtime: v1.21.0

ONNX Runtime API

Python

Architecture

ARM64

Execution Provider

Default CPU

Execution Provider Library Version

No response

@snnn
Member

snnn commented Mar 27, 2025

Could you please build ONNX Runtime in debug mode and generate a stacktrace?
Doc: https://onnxruntime.ai/docs/build/inferencing.html
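For reference, if a full debug build is not feasible, a partial alternative (my suggestion, not something the maintainers requested) is to enable Python's `faulthandler` before creating the session, so the aborting assertion at least dumps the Python-side traceback:

```python
# Sketch: dump tracebacks on fatal signals (including the SIGABRT raised
# by the failed libstdc++ assertion) without rebuilding ONNX Runtime.
import faulthandler

faulthandler.enable()  # installs handlers for SIGSEGV, SIGABRT, SIGBUS, ...
print("faulthandler enabled:", faulthandler.is_enabled())

# Then reproduce the crash; "repro.onnx" is a placeholder path, not from
# the original report:
# import onnxruntime
# session = onnxruntime.InferenceSession(
#     "repro.onnx", providers=["CPUExecutionProvider"])
```

This only shows the Python frame at the crash site; a native stacktrace from a debug build remains the more useful artifact.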

@PapperYZ
Author

Sorry, I do not know how to do that.
I have already provided clear reproduction steps; would you try looking into it?
