I’m on Arch (btw.) and I have an Intel i5-14600K CPU with an iGPU (UHD Graphics 770, GPU 1) and a dGPU from Nvidia, an RTX 3060 (GPU 0). I have one monitor connected to the 3060 via DisplayPort 1.4.

I can see both GPUs in GNOME Mission Center, but the iGPU always shows a clock speed of 0 and utilization of 0. So anything that is done on a GPU is done on the 3060.

I want to separate what runs on the iGPU and what runs on the 3060:

dGPU (RTX 3060):

  1. Video editing
  2. Video transcoding
  3. AI stuff (ollama)
  4. Machine learning
  5. Blender
  6. Steam games

iGPU (Intel):

  1. Firefox (especially YouTube video decoding; it has hardware acceleration for that)
  2. Chrome
  3. LibreOffice
  4. GNOME
  5. etc.

I wonder if this, or at least part of it, is possible. I need the whole 12 GB of VRAM on the 3060 for ollama, and the iGPU is just sitting there doing nothing. Is there a way to distribute the work? Do I need two screens for that, or something?

It might also be that I’m misunderstanding how the whole thing works, or overestimating Linux’s capabilities.
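
For reference, a quick way to check whether the iGPU’s video decode engines are usable at all, independent of which card drives the display, is vainfo from libva-utils; a minimal sketch, assuming the Intel media driver (iHD) is installed:

```sh
# Point libva at Intel's media driver and list the iGPU's
# supported decode/encode profiles:
LIBVA_DRIVER_NAME=iHD vainfo
```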

  • Jeena@piefed.jeena.netOP · 2 days ago

    Thank you everyone for explaining your view of how it works on Linux. This topic seems to be very chaotic; everyone has their own understanding and some contradict each other. But it was great to get some feedback and food for thought, which led me to try things I hadn’t thought of trying before.

    Now I finally have some things running on the iGPU, because I plugged the monitor into the motherboard instead of the dGPU. Offloading still seems to work well: things like ollama and video editing are still done on the powerful dGPU, and the results are somehow synchronized to the motherboard output and shown on the screen.
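
    A quick way to sanity-check this setup: with the monitor on the motherboard output, the default OpenGL renderer should be the Intel iGPU, while offloaded work shows up on the Nvidia card in nvtop. A sketch using standard tools (glxinfo comes from mesa-utils):

    ```sh
    # Ordinary apps should now default to the iGPU:
    glxinfo -B | grep "OpenGL renderer"
    # expected: something like "Mesa Intel(R) UHD Graphics 770"

    # Watch both GPUs live while an offloaded app (ollama, a video
    # editor) runs; the load should appear on the Nvidia card:
    nvtop
    ```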

    When it comes to the iGPU, I see in nvtop that many applications, for example Firefox or Chrome, run on both at the same time. Especially when I watch videos in the browser, the iGPU is mostly active. The same goes for GNOME Shell features like opening the overview with the Super key or switching between virtual desktops.

    With the help of switcheroo-control I can now force a program like darktable to run specifically on the dGPU.
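
    In case it helps someone else: switcheroo-control also ships a CLI, switcherooctl, which does the same thing as GNOME’s “Launch using Discrete Graphics Card” menu entry. A sketch (the GPU index is an example; take it from the list output):

    ```sh
    # Show the GPUs switcheroo-control knows about, with their indices:
    switcherooctl list

    # Launch a program on a specific GPU (index 1 here is an example):
    switcherooctl launch -g 1 darktable
    ```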

    So while I still don’t have full control over it, I feel I’m utilizing both graphics cards much better now, and that is what I was looking for before I wrote this question, so thanks a lot everyone!

  • mark@social.cool110.xyz · 3 days ago

    @jeena As always, the Arch wiki has you covered. PRIME is the system you need. It’s slightly easier with the monitor plugged into the iGPU (that’s how laptops are wired).
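
    Concretely, the PRIME render offload the wiki describes comes down to a couple of environment variables; a minimal sketch for both driver stacks, with glxgears standing in for any application:

    ```sh
    # Nvidia proprietary driver: offload one application to the dGPU
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears

    # Mesa drivers: render one application on the non-default GPU
    DRI_PRIME=1 glxgears
    ```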

  • just_another_person@lemmy.world · 3 days ago

    I’m not sure what in the world these other answers are, but that is not how GPUs work if you’re talking about a desktop.

    If you’re talking about a laptop, this is not going to work in Linux.

    It’s hard to tell from your description whether this is a desktop or a laptop, btw. Post the model if it’s a laptop, as this will be important.

      • just_another_person@lemmy.world · 3 days ago

        So desktops don’t work like laptops in this sense.

        On a laptop, the bus for the video output ports can be connected to one or both GPUs, and the software does the graphics switching or offloading.

        On a desktop, there is no shared bus between the PCIe card and the onboard graphics, so you can’t switch which GPU renders what in hardware alone. The whole display is rendered on the device you’re plugged into.

        Windows does have some sort of offloading utility that allows for this, I believe, but I’ve never used it, so I don’t know how well it works.

        On Linux, your display server (X or Wayland) needs to address one GPU at a time to render things.

        You can totally use both GPUs with multiple monitors, but I think that’s defeating the purpose you have in mind.

  • Gayhitler@lemmy.ml · 3 days ago

    You have some good answers and some bad answers here.

    It’s not the fault of the people answering; what you’re asking about has been implemented piecemeal and scattershot over the last decade, so everyone has come up with their own workaround that they’re happy with.

    Allow me to share mine: use a KVM switch.

    The switch lets you plug two computers into one keyboard, video output, and mouse, but you’re only going to use the video part. Plug it into both your motherboard’s and your GPU’s video ports and push the button to switch back and forth between the GPU for gaming and the motherboard for everything else.

    Why only gaming? Because everything else you mention can make use of a GPU that isn’t driving the display. I guess some game engines support rendering frames and then sending them to another output device, but that’s not something to rely on.

    So when you’re using Blender, you see the model on the monitor plugged into the motherboard, but the heavy lifting is done by the GPU. When you transcode a video, the same thing happens.
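
    For example, a transcode can run entirely on the dGPU’s NVDEC/NVENC engines while the monitor hangs off the motherboard; a sketch, assuming an ffmpeg build with NVENC support (file names made up):

    ```sh
    # Decode and encode on the Nvidia card; the display never touches it.
    ffmpeg -hwaccel cuda -i input.mp4 -c:v h264_nvenc output.mp4
    ```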

    I came to this solution after trying to do what you’re asking for in X11 and having a bunch of headaches about it every time an update came down.

    Pushing a little button on the desktop was easier than messing around with software to build a Rube Goldberg contraption that does the same thing. Mine had two LEDs, one on either side, to indicate which “computer” I was using at the time. I ended up wrapping electrical tape around the rim to cover them both up, and cut the word “turbo” out of the tape over the green LED that indicated I was looking at the GPU.

    • Jeena@piefed.jeena.netOP · 3 days ago

      I did, but that only allows me to run it on the discrete one, where it is already running, as far as I understand.

      • anon5621@lemmy.ml · 3 days ago

        OK, maybe in the BIOS you need to set the GPU mode to 3D accelerator; it also depends on how you installed the Nvidia driver. Maybe you switched the priority using nvidia-settings? If so, switch back to Intel, reboot the PC, and then use switcheroo-control. I’m using a dual-GPU setup and my main GPU is the iGPU by default; that was the default from my Arch installation.

  • Pogogunner@sopuli.xyz · 3 days ago

    I’m going to assume that it is possible to put both the dedicated and integrated GPUs to work, though I’ve never seen this kind of setup.

    This is likely not something you want to actually do.

    The integrated GPU in your processor is not an additional bit of computing power your computer is not using, but special software that can use your processor to put out graphics if a dedicated GPU is missing. It is extremely inferior at processing graphics compared to the real dedicated GPU, and if you were running Firefox to watch (not decode) YouTube, you would very likely see things like screen tearing as the processor struggled to keep up.

    If you wanted to do this just to see the outcome for yourself, you could switch the DisplayPort cable from your GPU to your motherboard to get an idea of how rough this would be. If you wanted to continue after seeing this, I believe you would need to connect a second monitor in order to use both the dedicated and integrated GPUs.

    • MentalEdge@sopuli.xyz · 3 days ago

      The integrated GPU in your processor is not an additional bit of computing power your computer is not using, but special software that can use your processor to put out graphics if a dedicated GPU is missing. It is extremely inferior at processing graphics compared to the real dedicated GPU, and if you were running Firefox to watch (not decode) YouTube, you would very likely see things like screen tearing as the processor struggled to keep up.

      This is straight up wrong. You are confusing GPUs with display adapters.

      iGPUs are an actual on-die GPU, consisting of their own hardware, present on the die alongside the CPU cores.

      They can game. They can hardware-decode and encode media, etc. They are full GPUs. Some are even quite powerful, though usually you’ll find them designed for everyday use and only light gaming.

      The GPU in every recent game console is technically an iGPU; the same goes for phones and the Steam Deck.

      They do not “translate” GPU instructions into running on the CPU cores.

      That’s software rendering, and it is what CPUs do when there isn’t an iGPU at all. (Though they’ll still need a display adapter, which a GPU can act as. But a display adapter doesn’t need to be a full-on GPU, and iGPUs aren’t just display adapters.)
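
      The difference is easy to see for yourself: Mesa can be forced into software rendering regardless of what GPUs are present. A sketch:

      ```sh
      # Hardware rendering: reports the actual GPU
      glxinfo -B | grep "OpenGL renderer"

      # Forced software rendering: reports llvmpipe, Mesa's CPU rasterizer
      LIBGL_ALWAYS_SOFTWARE=1 glxinfo -B | grep "OpenGL renderer"
      ```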

    • mark@social.cool110.xyz · 3 days ago

      @jeena @Pogogunner

      I’m going to assume that it is possible to put both the dedicated and integrated GPUs to work, though I’ve never seen this kind of setup.

      Every single laptop with a dGPU does that. As I’m typing this, only Minecraft is using the dGPU while everything else is on the iGPU. Everything is fully performant (including YouTube videos), and it greatly increases battery life.

      • Pogogunner@sopuli.xyz · 3 days ago

        I’ve never owned a laptop with a dGPU; it’s very interesting how that works.

        Is it something that works out of the box, or do you have to set it up manually?

        • mark@social.cool110.xyz · 3 days ago

          @Pogogunner At least on Debian-based distros, it’s all part of the driver installation.

          As for how it works at the hardware/kernel level: the iGPU takes some of the system RAM to use as VRAM, so all the kernel has to do is give the dGPU a DMA buffer into that memory. The final piece is for the iGPU driver to send a synchronisation signal to the dGPU when it’s ready to receive the (partial) frame.
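
          On X11 this plumbing is even visible from userspace: each GPU shows up as a provider with source/sink offload capabilities, and the kernel exposes one render node per GPU for the buffer sharing. A sketch:

          ```sh
          # List providers and their offload capabilities (X11 only):
          xrandr --listproviders

          # One render node per GPU; buffers are shared across these:
          ls /dev/dri/renderD*
          ```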

    • Jeena@piefed.jeena.netOP · 3 days ago

      Yeah, not really; I think the iGPU is still very performant, especially since it has broad video decoding and encoding capabilities. It’s probably more powerful than my Dell XPS 13 laptop’s iGPU, which lets me connect an external 4K monitor and run games and video editing smoothly at somewhat lower settings.