I have this question. I see people, with some frequency, sugar-coating the marriage between Nvidia GPUs and Linux. I get that if you already have an Nvidia GPU, or you need CUDA or work with AI and want to use Linux, that is possible. Nevertheless, this is still a very questionable relationship.

Shouldn’t we be raising awareness of this for anyone planning to play titles that use DX12? I mean, a 15% to 30% performance loss on Nvidia compared to Windows, versus 5% to 15% (and sometimes the same performance or better) on AMD, isn’t that something we should be alerting others to?

I know we wanna get more people on Linux, and NVIDIA’s getting better, but don’t we need some real talk about this? Or is there some secret plan to scare people away from Linux that I missed?

Am I misinformed? Is there some strong reason to buy an Nvidia GPU if your focus is gaming on Linux?

  • reagansrottencorpse@lemmy.ml · 10 hours ago

    I’m putting together my new Nvidia PC build tonight. I was planning on putting Bazzite on it; should I just use Windows then?

    • Turtle@aussie.zone · 26 minutes ago

      Nvidia cards work just fine on Linux; old issues get parroted around by people who don’t know any better.

    • typhoon@lemmy.world (OP) · 7 hours ago

      I’d go with Linux no matter what, but this seems to be exactly why I feel we should be clearer. People may be building PCs out there because others keep saying that everything is smooth sailing with Nvidia. A lot of it is working now, but there are still downsides, and the recommendation is to go with AMD if you can.

    • typhoon@lemmy.world (OP) · 11 hours ago

      Thanks. That is what I thought, but it is good to confirm that we are not missing something.

  • Ardens@lemmy.ml · 21 hours ago

    I use AMD wherever possible, simply because they support Linux. There’s really no other reason needed. I don’t care about CUDA or anything else; that is barely relevant to me. I’d rather drive a mid-range car that gives me freedom than a high-end car that ties me down.

  • mybuttnolie@sopuli.xyz · 1 day ago

    Yes: HDMI 2.1. If you use a TV as a monitor, you won’t get 4K120 over HDMI with AMD cards on Linux, because the HDMI Forum are assholes.

    • wonderfulvoltaire@lemmy.world · 22 hours ago

      I have a 6900 XT, it outputs 4K 120, and I’ve never had issues with it on multiple distros. Lately Bazzite has been behaving as expected, so I don’t know where this information is coming from, besides the argument that HDMI is closed source as opposed to DisplayPort.

      • mybuttnolie@sopuli.xyz · 18 hours ago

        HDMI 2.0 doesn’t have the bandwidth for 4K120; DisplayPort and HDMI 2.1 do. AMD’s driver doesn’t support HDMI 2.1, because the HDMI Forum didn’t allow AMD to implement it in their open-source Linux driver. You still get 4K120 with DP, and even over HDMI if you use a limited colorspace.
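
        To put rough numbers on that, here is a back-of-the-envelope sketch in Python (the 18 and 48 Gbit/s figures are the nominal HDMI 2.0 and 2.1 link rates, the 5% blanking allowance is an approximation, and “limited colorspace” is taken here as 4:2:0 chroma subsampling):

        ```
        # Approximate uncompressed video bandwidth needed for 4K @ 120 Hz.
        def required_gbps(width, height, fps, bits_per_pixel, blanking=1.05):
            """Raw data rate in Gbit/s, with a small allowance for blanking."""
            return width * height * fps * bits_per_pixel * blanking / 1e9

        rgb_444 = required_gbps(3840, 2160, 120, 24)  # full 8-bit RGB 4:4:4
        yuv_420 = required_gbps(3840, 2160, 120, 12)  # "limited" 4:2:0 colorspace

        print(f"4K120 RGB 4:4:4: ~{rgb_444:.0f} Gbit/s")  # ~25 Gbit/s
        print(f"4K120 YUV 4:2:0: ~{yuv_420:.0f} Gbit/s")  # ~13 Gbit/s
        # HDMI 2.0 tops out around 18 Gbit/s and HDMI 2.1 around 48 Gbit/s,
        # which is why full-quality 4K120 needs HDMI 2.1 (or DisplayPort).
        ```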

  • data1701d (He/Him)@startrek.website · 1 day ago

    I feel like most people who use Nvidia on Linux just got their machine before they were Linux users, with a small subset for ML stuff.

    Honestly, I hear ROCm may finally be getting less horrible, is getting wider distro support, and supports more GPUs than it used to, so I really hope AMD will become as livable an ML dev platform as it is a desktop GPU.

    • Pika@sh.itjust.works · 11 hours ago

      I fall into this category. I went Nvidia back in ’16 when I built my gaming rig, expecting that I would be using Windows for a while, since gaming on Linux at that point still wasn’t the greatest. A few years later I decided to try out a 5700 XT (yeah, piss-poor decision, I know) because I wanted to future-proof in case I decided to swap to Linux. The 5700 XT had the worst driver reliability I’ve ever seen in a graphics card, and I eventually got so sick of it that I went back to Nvidia with a 4070. Since then my life opened up more, so I had the time to swap to Linux on my gaming rig, and here we are.

      Technically I guess I could still put the 5700 XT back in, and it would probably serve better there than in my media server, since Nvidia seems to have better isolation support in virtualized environments. But I haven’t bothered, mostly because getting the current card to work on my rig was a pain, and I don’t feel like taking apart two machines to play hardware musical chairs.

    • nfreak@lemmy.ml · 18 hours ago

      I completely upgraded my desktop like a month before I decided to make the switch. If I had planned ahead just a bit more, I would’ve gone with an AMD card for sure. This 4090 is still new enough that I can probably trade it in, but that’s such a pain in the ass.

      • warmaster@lemmy.world · 1 day ago

        I did this.

        From:

        Intel i7 14700K + 3080 TI

        To:

        Ryzen 7700X + RX 7900 XTX.

        The difference on Wayland is very big.

        • Nerdulous@lemmy.zip · 19 hours ago

          Did you see any performance change? Because that setup seems pretty equivalent to me.

          • warmaster@lemmy.world · 18 hours ago

            Absolutely. All my issues just disappeared, performance went way up, and the smoothness is noticeable even on the desktop. On top of that, there are things like Steam’s Game Mode that only work on AMD because of their FOSS driver.

            NVIDIA has finally learned the lesson, but they are a few years behind AMD; it will take time for their FOSS driver to mature.

    • utopiah@lemmy.ml · 1 day ago

      Yep, that’d be me. That said, if I were to buy a new GPU today (well, tomorrow; I’m waiting on Valve’s announcement of its next HMD), I might still get an NVIDIA: even though I’m convinced 99% of LLM/GenAI is pure hype, if the 1% that might be useful, might be built ethically, and might run on my hardware couldn’t actually run because ROCm is just a tech demo or too far behind performance-wise, I’d be annoyed. That said, the percentage is so ridiculously low that I’d probably pick the card which treats the open ecosystem best.

      • MalReynolds@piefed.social · 23 hours ago

        ROCm works just fine on consumer cards for inferencing, is competitive or superior in $/token/s, and beats NVIDIA on power consumption. ROCm 7.0 seems to be giving a >2x uplift on consumer cards over 6.9, so that’s lovely. I haven’t tried 7 myself yet, waiting for the dust to settle, but I have no issues with image gen, text gen, image tagging, video scanning, etc. using containers and distroboxes on Bazzite with a 7800 XT.

        Bleeding-edge and research work tends to be CUDA, but mainstream use cases are getting ported reasonably quickly. TL;DR: unless you’re training or researching (unlikely on consumer cards), AMD is fine and performant, plus you get stable Linux and great gaming.
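
        A minimal way to sanity-check that the card is actually being picked up before pulling in bigger workloads, assuming a ROCm build of PyTorch is installed (device names and versions will differ per setup):

        ```
        import torch

        # On ROCm builds of PyTorch, the torch.cuda API is backed by HIP,
        # so the familiar calls work against an AMD GPU.
        print("HIP runtime:", torch.version.hip)      # None on CUDA/CPU-only builds
        print("GPU visible:", torch.cuda.is_available())

        if torch.cuda.is_available():
            print("Device:", torch.cuda.get_device_name(0))
            x = torch.randn(2048, 2048, device="cuda")
            y = x @ x                                 # quick matmul smoke test on the GPU
            print("Matmul OK:", y.shape)
        ```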

      • FauxLiving@lemmy.world · 1 day ago

        I use local AI for speech/object recognition on my video security system and for control of my HomeAssistant and media services. These services are isolated from the Internet for security reasons, which wouldn’t be possible if they required OpenAI to function.

        ChatGPT and Sora are just tech toys, but neural networks and machine learning are incredibly useful components. You would be well served by staying current on the technology as it develops.

      • data1701d (He/Him)@startrek.website · 1 day ago

        From what I’ve heard, ROCm may be finally getting out of its infancy; at the very least, I think by the time we get something useful, local, and ethical, it will be pretty well-developed.

        Honestly, though, I’m in the same boat as you and actively try to avoid most AI stuff on my laptop. The only “AI” thing I use is the occasional image upscale. I find it kind of useless on photos, but it’s sometimes helpful when doing vector traces of bitmap graphics with flat colors; Inkscape’s results aren’t always good with lower-resolution images, so putting that specific kind of graphic through “cartoon mode” upscales sometimes improves results dramatically for me.

        Of course, I don’t have GPU ML acceleration, so it just runs on the CPU; it’s a bit slow, but still less than 10 minutes.

  • megopie@lemmy.blahaj.zone · 1 day ago

    I’d say in general, the advantages of Nvidia cards are fairly niche even on Windows. Like, multi-frame generation (fake frames) and upscaling are kind of questionable in terms of value-add most of the time, and most people probably aren’t going to be doing any ML stuff on their computer.

    AMD in general offers better performance for the money, and that’s doubly so with Nvidia’s lackluster Linux support. AMD has put the work in to get their hardware running well on Linux, both in terms of work from their own team and being collaborative with the open source community.

    I can see why some people would choose Nvidia cards, but I think, even on windows, a lot of people who buy them probably would have been better off with AMD. And outside of some fringe edge cases, there is no good reason to choose them when building or buying a computer you intend to mainly run Linux on.

    • filister@lemmy.world · 1 day ago

      Even though I hate Nvidia, they have a couple of advantages:

      • CUDA
      • Productivity
      • Their cards retain higher resale values

      So if you need the card for productivity and not only gaming, Nvidia is probably better; if you buy second-hand or strictly for gaming, AMD is better.

      • megopie@lemmy.blahaj.zone · 1 day ago

        It depends on the type of productivity TBH. Like, sure some productivity use cases need CUDA, but a lot of productivity use cases are just using the cards as graphics cards. The places where you need CUDA are real, but not ubiquitous.

        And “this is my personal computer I play games on, but also the computer I do work on, and that work needs CUDA specifically” is very much an edge case.

        • filister@lemmy.world · 24 hours ago

          As far as I am aware, they are also better at video encoding and if you want to use Blender or similar software. Yes, it is niche, but it’s a credible consideration. As always, it really depends on the use case.

          • reliv3@lemmy.world · 23 hours ago

            Blender can be CUDA-accelerated, which does give Nvidia an edge over AMD. In terms of video encoding, both Nvidia and AMD cards are AV1-capable, so they are on par; unless a program does not support AV1, in which case the proprietary Nvidia video encoders are better.
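
            For what it’s worth, both hardware AV1 encoders are reachable through ffmpeg on Linux. A rough sketch of driving them from Python, assuming an ffmpeg build with av1_nvenc and av1_vaapi enabled, a recent enough GPU on each side, and input.mp4 as a placeholder file:

            ```
            import subprocess

            # AV1 hardware encode on Nvidia (NVENC).
            subprocess.run([
                "ffmpeg", "-y", "-i", "input.mp4",
                "-c:v", "av1_nvenc", "out_nvenc.mkv",
            ], check=True)

            # AV1 hardware encode on AMD via VAAPI.
            subprocess.run([
                "ffmpeg", "-y", "-vaapi_device", "/dev/dri/renderD128",
                "-i", "input.mp4", "-vf", "format=nv12,hwupload",
                "-c:v", "av1_vaapi", "out_vaapi.mkv",
            ], check=True)
            ```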

    • Mihies@programming.dev · 23 hours ago

      From a pure hardware perspective, Nvidia cards are more energy-efficient.

      Edit: I stand corrected, the 9070 series is much more energy-efficient.

      • iopq@lemmy.world · 1 day ago

        That’s not quite true. AMD cards just get clocked higher from the factory. So when a 9070 XT beats a 5070 by an average of 17%, you can easily cap the power limit so the performance matches. That’s with more VRAM, which of course increases the power requirements.

        The prices don’t quite match up, though, since it’s priced between the 5070 and the Ti (although in the US it’s often more expensive for some reason).

        The problem is that AMD is selling the chips to OEMs at a price that’s too high for them to sell at MSRP, while giving a discount on small batches of MSRP models. It becomes a lottery where the quickest people can get $600 models by refreshing ever-rarer restocks.

        One of the reasons is… tariffs, but I’m not sure how Nvidia got the prices down on its models.
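
        On the power-limit point: the amdgpu driver exposes the board power cap through hwmon, so capping it is a couple of sysfs writes. A rough sketch in Python (the card0 path and the 250 W figure are just examples; values are in microwatts and writing requires root):

        ```
        from pathlib import Path

        # amdgpu exposes the board power cap via hwmon sysfs attributes.
        hwmon = next(Path("/sys/class/drm/card0/device/hwmon").glob("hwmon*"))

        cap_max = int((hwmon / "power1_cap_max").read_text())
        cap_now = int((hwmon / "power1_cap").read_text())
        print(f"current cap: {cap_now / 1e6:.0f} W (ceiling {cap_max / 1e6:.0f} W)")

        # Lower the cap to 250 W, for example (needs root; value in microwatts).
        (hwmon / "power1_cap").write_text(str(250_000_000))
        ```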

  • Admetus@sopuli.xyz · 19 hours ago

    I only play older games and open-source games (like Pioneer Space Sim and Luanti), and mostly emulate PS2 (could do PS3/4, you bet), so AMD is fine for my use case and works out of the box. I know Nvidia’s Linux support has improved, which means the latest graphics cards also pretty much work out of the box too. But on principle, I support AMD for the work they put into Linux.

  • LeFantome@programming.dev · 1 day ago

    I think the answer is yes if you are shooting for the high end. AMD is better on cost/performance, but NVIDIA is still unchallenged for absolute performance if budget is not a consideration.

    And if you need CUDA…

    • typhoon@lemmy.world (OP) · 1 day ago

      I agree with that, because there is no offering from AMD to compete with the absolute performance.

  • just_another_person@lemmy.world · 1 day ago

    AMD will have superior support and better power management out of the box hands down.

    Nvidia may have a minor performance advantage in some areas depending on the card, but not in a way you would care about unless you’re obsessed with the technical specifics of graphics in AAA games.

    I’ve been on Linux as a dev and daily driver for 20 years, and Nvidia drivers are just problematic unless you know exactly how to fix them when there are issues. That’s an Nvidia problem, not a Linux problem. CUDA on AMD is also a thing if you want to go that route.

    The choice is yours.

    • vinnymac@lemmy.world · 19 hours ago

      I’m glad you mentioned knowing how to fix them. My server has hosted Nvidia GPUs for 15 odd years now, working great, and has remained stable through updates by some miracle.

      Getting it set up was a nightmare back then, though; not recommended for the faint of heart.

  • Lukemaster69@lemmy.ca · 1 day ago

    It is better to go with AMD because the AMD drivers are built into the ISO, so it’s less of a headache for gaming.