
Error during inference - RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (stable-diffusion-webui-tensorrt, 22 comments, closed)

anwoflow commented on July 21, 2024
Error during inference - RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0!


Comments (22)

contentis commented on July 21, 2024

SDXL currently requires the dev branch of automatic1111 to enable the hooks. I've updated the readme accordingly.

ControlNet currently isn't supported.


contentis commented on July 21, 2024

Okay, this means the install has failed... This is one of the most common issues reported. We'll get a fix out as soon as possible. In the meantime, this can be fixed manually by following: #27 (comment)
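(Not the content of #27, just a hedged sanity check: once the manual fix has been applied, the TensorRT Python bindings should import cleanly when run with the webui venv's own interpreter.)

    # hedged sanity check - run with the webui venv's python, not the system Python
    import tensorrt as trt   # on a broken install this raises ModuleNotFoundError: No module named 'tensorrt_bindings'
    print(trt.__version__)   # prints the TensorRT version when the bindings are installed correctly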


ronschneider commented on July 21, 2024

I had the same error with a different cause; switching to the dev branch fixed it:

git switch dev


enbermudas commented on July 21, 2024

@enbermudas There are two errors discussed in this thread, which one are you referring to? I assume this one: ModuleNotFoundError: No module named 'tensorrt_bindings'?

The error from the actual issue:

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0!


Alphyn-gunner commented on July 21, 2024

Are you sure you're not using --medvram? I think someone mentioned on Reddit that TensorRT thinks that offloading parts of the model is the same as using several CUDA devices.
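(For illustration only, not the webui's actual code: a minimal PyTorch sketch of why offloading can trigger this message - the layer's weights sit on the CPU while the input tensor is on the GPU, and the forward pass raises exactly this RuntimeError.)

    import torch

    linear = torch.nn.Linear(4, 4)            # weights stay on the CPU, as offloading would leave them
    x = torch.randn(1, 4, device="cuda")      # input tensor lives on cuda:0
    out = linear(x)                           # RuntimeError: Expected all tensors to be on the same device ...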


contentis commented on July 21, 2024

Which SD version have you been using (1, 2, XL)?


zopieux commented on July 21, 2024

You're probably using additional stuff like a ControlNet. I just experienced that error with the default webui ControlNet extension on an SD1.5 checkpoint. I'm a complete SD newbie, but I can only suppose TensorRT would need a prebuilt engine to work with a given ControlNet model or even input image.

@contentis I'm new to this, but given the stack trace, it looks like ControlNet is doing something LoRA-related, and as per #20 that would require building an "engine" for it:

  File "/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/stable-diffusion-webui/extensions-builtin/Lora/networks.py", line 429, in network_Linear_forward
    return originals.Linear_forward(self, input)
  File "/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/nn/modules/linear.py", line 114, in forward
    return F.linear(input, self.weight, self.bias)
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument mat1 in method wrapper_CUDA_addmm)

Perhaps the actual feature request here is to be able to build TensorRT engines per combination of {checkpoint model, ControlNet model (OpenPose, Canny, etc.)}. Hopefully this doesn't need to be prebuilt per input ControlNet image, but I have no idea what I'm talking about.


FurkanGozukara commented on July 21, 2024

SD 1.5 is working.
I tested it on Realistic Vision.
SDXL is giving this error.


camoody1 commented on July 21, 2024

Are you sure you're not using --medvram? I think someone mentioned on Reddit that TensorRT thinks that offloading parts of the model is the same as using several CUDA devices.

Thank you. I had to remove --medvram from my .bat file to get the SD1.5 engine to work. That's frustrating.


anwoflow commented on July 21, 2024

To answer some questions:
A1111 version - 1.6.
Yes, it was SDXL, so that might be the issue; I'm not on the dev branch.
No, I wasn't using ControlNet.

I've removed the extension and installed it again - have new errors now.


contentis commented on July 21, 2024

I've removed the extension and installed it again - have new errors now.

Do you have an error log to share? I'm working on some fixes.


anwoflow commented on July 21, 2024

I have this issue now - #27 (comment)

I followed the steps, but no success so far; the TensorRT tab even disappeared from the UI, although the extension is still visible in the Extensions tab. I might try a fresh install later.


contentis commented on July 21, 2024

When launching the webui, is there an error in the shell? If the TensorRT tab does not appear in the UI, it could hint at an issue during initialization.


anwoflow commented on July 21, 2024

==============================================================================================================
Python 3.11.5 (tags/v3.11.5:cce6ba9, Aug 24 2023, 14:38:34) [MSC v.1936 64 bit (AMD64)]
Version: v1.6.0
Commit hash: 5ef669de080814067961f28357256e8fe27544f4
Requirement already satisfied: protobuf==3.20.2 in .\venv\lib\site-packages (3.20.2)
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: onnx-graphsurgeon in .\venv\lib\site-packages (0.3.27)
Requirement already satisfied: numpy in .\venv\lib\site-packages (from onnx-graphsurgeon) (1.23.5)
Requirement already satisfied: onnx in .\venv\lib\site-packages (from onnx-graphsurgeon) (1.14.1)
Requirement already satisfied: protobuf>=3.20.2 in .\venv\lib\site-packages (from onnx->onnx-graphsurgeon) (3.20.2)
Requirement already satisfied: typing-extensions>=3.6.2.1 in .\venv\lib\site-packages (from onnx->onnx-graphsurgeon) (4.8.0)
GS is not installed! Installing...
Installing protobuf
Installing onnx-graphsurgeon
UI Config not initialized
Launching Web UI with arguments: --xformers
*** Error loading script: trt.py
Traceback (most recent call last):
  File "D:\sd10\stable-diffusion-webui\modules\scripts.py", line 382, in load_scripts
    script_module = script_loading.load_module(scriptfile.path)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\sd10\stable-diffusion-webui\modules\script_loading.py", line 10, in load_module
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "D:\sd10\stable-diffusion-webui\extensions\Stable-Diffusion-WebUI-TensorRT\scripts\trt.py", line 10, in <module>
    import ui_trt
  File "D:\sd10\stable-diffusion-webui\extensions\Stable-Diffusion-WebUI-TensorRT\ui_trt.py", line 10, in <module>
    from exporter import export_onnx, export_trt
  File "D:\sd10\stable-diffusion-webui\extensions\Stable-Diffusion-WebUI-TensorRT\exporter.py", line 10, in <module>
    from utilities import Engine
  File "D:\sd10\stable-diffusion-webui\extensions\Stable-Diffusion-WebUI-TensorRT\utilities.py", line 32, in <module>
    import tensorrt as trt
  File "d:\sd10\stable-diffusion-webui\venv\Lib\site-packages\tensorrt\__init__.py", line 18, in <module>
    from tensorrt_bindings import *
ModuleNotFoundError: No module named 'tensorrt_bindings'

That's what I get; it looks like the tensorrt_bindings module is the problem.
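(Not from the thread, just a hedged way to see what the venv actually contains: run this with the venv's interpreter and it lists every TensorRT-related distribution it can find; tensorrt_bindings being absent from that list would match the traceback above.)

    # hedged sketch: list TensorRT-related packages visible to the webui venv's interpreter
    import importlib.metadata as md

    for dist in md.distributions():
        name = dist.metadata["Name"] or ""
        if "tensorrt" in name.lower():
            print(name, dist.version)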


enbermudas commented on July 21, 2024

Exact same error, but I'm not using SDXL; I'm using 1.5. Both the dev and master branches fail.


contentis commented on July 21, 2024

@enbermudas There are two errors discussed in this thread, which one are you referring to? I assume this one: ModuleNotFoundError: No module named 'tensorrt_bindings'?


contentis commented on July 21, 2024

@enbermudas, on which model are you encountering this issue?
As mentioned before, note that SDXL currently requires the dev branch of auto1111.


syaefullahkamal commented on July 21, 2024

I had the same issue; it's now working after I removed some safetensors files from the folder.


thanayut1750 commented on July 21, 2024

I had the same issue; it's now working after I removed some safetensors files from the folder.

How?


aokocax commented on July 21, 2024

I had the same error with a different cause; switching to the dev branch fixed it:

git switch dev

Thank you, the problem is solved and I can use it with XL models.


bigmover commented on July 21, 2024

SDXL currently requires the dev branch of automatic1111 to enable the hooks. I've updated the readme accordingly.

ControlNet currently isn't supported.

Hi guys! Your work is awesome! Is ControlNet support planned? When can we expect it?


dislive commented on July 21, 2024

Same here.
v1.8.0  •  python: 3.11.8  •  torch: 2.1.2+cu121  •  xformers: 0.0.23.post1  •  gradio: 3.41.2  •  checkpoint: 70525c199b

