Description
At the moment the setup.py script imports onnx_tensorrt only to extract the version information. But this import pulls in tensorrt, which requires the CUDA runtime libraries to be available. This makes it problematic to build Docker images containing onnx_tensorrt in a CI/CD environment without GPUs and the CUDA runtime.
Some refactoring is required so that the version can be extracted from the package without importing tensorrt, making builds in CI/CD environments without a GPU possible.
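One possible direction, sketched below purely as an illustration: have setup.py parse the version string out of onnx_tensorrt/__init__.py with a regex instead of importing the package. This assumes __version__ is defined there as a plain string literal; the read_version helper and the exact setup() arguments are placeholders, not the project's real setup.py.

    # Hypothetical setup.py fragment: extract __version__ with a regex so that
    # importing onnx_tensorrt (and therefore tensorrt/pycuda) is not required.
    import os
    import re

    from setuptools import setup, find_packages

    def read_version():
        here = os.path.dirname(os.path.abspath(__file__))
        init_py = os.path.join(here, "onnx_tensorrt", "__init__.py")
        with open(init_py, encoding="utf-8") as f:
            match = re.search(r'^__version__\s*=\s*["\']([^"\']+)["\']',
                              f.read(), re.MULTILINE)
        if match is None:
            raise RuntimeError("Could not find __version__ in " + init_py)
        return match.group(1)

    setup(
        name="onnx_tensorrt",
        version=read_version(),  # no GPU or CUDA runtime needed at build time
        packages=find_packages(),
    )

With a layout like that, running pip against the source tree in a CUDA-less CI container would no longer fail at the egg_info step shown in the logs below.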
Log when tensorrt is not available, produced by any command that runs setup.py (even without installing):
ERROR: Command errored out with exit status 1:
     command: /home/dener/.pyenv/versions/eris-ct-worker/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/setup.py'"'"'; __file__='"'"'/home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-ywrl82zv
         cwd: /home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/
    Complete output (11 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/home/dener/.pyenv/versions/eris-ct-worker/src/onnx-tensorrt/setup.py", line 18, in <module>
        import onnx_tensorrt
      File "/home/dener/.pyenv/versions/3.8.5/envs/eris-ct-worker/src/onnx-tensorrt/onnx_tensorrt/__init__.py", line 23, in <module>
        from . import backend
      File "/home/dener/.pyenv/versions/3.8.5/envs/eris-ct-worker/src/onnx-tensorrt/onnx_tensorrt/backend.py", line 22, in <module>
        from .tensorrt_engine import Engine
      File "/home/dener/.pyenv/versions/3.8.5/envs/eris-ct-worker/src/onnx-tensorrt/onnx_tensorrt/tensorrt_engine.py", line 21, in <module>
        import tensorrt as trt
    ModuleNotFoundError: No module named 'tensorrt'
Log when no CUDA runtime is available:
ERROR: Command errored out with exit status 1:
     command: /usr/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/setup.py'"'"'; __file__='"'"'/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-crge83fu
         cwd: /tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/
    Complete output (13 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/setup.py", line 18, in <module>
        import onnx_tensorrt
      File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/onnx_tensorrt/__init__.py", line 23, in <module>
        from . import backend
      File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/onnx_tensorrt/backend.py", line 22, in <module>
        from .tensorrt_engine import Engine
      File "/tmp/pip-install-rxt1j24m/onnx-tensorrt_12e18d44d1d04b1a975d60ce9c274f9c/onnx_tensorrt/tensorrt_engine.py", line 22, in <module>
        import pycuda.driver
      File "/usr/local/lib/python3.8/dist-packages/pycuda/driver.py", line 62, in <module>
        from pycuda._driver import *  # noqa
    ImportError: libcuda.so.1: cannot open shared object file: No such file or directory
----------------------------------------