Sorry if I’m not the first to bring this up. It seems like a simple enough solution.

  • 👁️👄👁️@lemm.ee · 1 year ago

    Just like Chrome will stop being anti-consumer when people stop using it. Or Blizzard will stop being terrible if people stop buying their games. People are not very good at this whole “voting with your wallet” thing.

    • ripcord@kbin.social · 1 year ago

      The browser one is especially bad, since there are plenty of good options and they all cost nothing except the most minimal amount of time to switch.

    • dumdum666@kbin.social · 1 year ago

      They are also not very good at voting for politicians who actually act in their interest. It baffles me every day… what do you guys think is the reason for this?

      • coyotino [he/him]@beehaw.org · 1 year ago

        Undereducation. The missing skill here is critical thinking, and critical thinking is something you don’t usually get much practice with until college. The conservative strategy of raising the price of college, refusing to spend money on student aid, and demonizing college professors as liberal brainwashers has been quite effective in keeping their constituents away from higher education.

        • 𝒍𝒆𝒎𝒂𝒏𝒏@lemmy.one · 1 year ago

          I think quality of education is a big one too, but as long as teachers are underpaid and schools are underfunded, understaffed, and stretched as far as they can go, things can’t improve ☹️

          • coyotino [he/him]@beehaw.org · 1 year ago

            I agree completely! Underfunding of public schools is all part of the plan. Congressional Republicans get to send their kids to private school while their impoverished constituents are forced to send their kids to public schools that are literally falling apart. Most of those kids learn to hate school, so they don’t go to college. The cycle repeats.

    • greenskye@beehaw.org · 1 year ago

      Almost like voting with your wallet doesn’t actually work. Or it only works in the same way that ‘communism’ and ‘well regulated free market capitalism’ work… in theory only.

      • agent_flounder@lemmy.one · 1 year ago

        It works a lot better when there are many choices, fair competition in the market, and the traits being voted on are painfully obvious.

        • ThePenitentOne@discuss.online · 1 year ago

          Because the free market is bullshit. It always results in a few major companies hand-shaking and fucking over consumers. Smaller businesses almost never have a chance and are just as easily bought out. To win in this capitalist iteration of society, you have to be the worst and greediest you can be. Add in the fact most people prefer to remain ignorant or are just generally apathetic from years of conditioning, and ‘voting with your wallet’ rarely really works. You should still do it though of course.

      • averyminya@beehaw.org · 1 year ago

        It’s a struggle. Boycotts have historically been hard to make effective, and I feel the Internet has made it even harder. Protests need their own marketing, and companies operating at an international scale feel almost immune to any public movement.

        That said, voting with your wallet, like boycotting, does work. It just needs people to be consistent and informed. But it does work.

        Look at Star Wars Battlefront 2 (the 2nd). Pre-release it racked up over 600k downvotes, and the backlash hurt the game badly enough that they reworked the entire system. If gamers had just bought the game and played anyway, EA wouldn’t have needed to actually rework it. But they were so worried about how the game would perform that they actually made a change.

        Same for Sonic the Hedgehog. He looked so, so terrible that the fear of losing money made him get fixed.

        Granted, these two are examples of something being changed before full release, but in spirit the effect is the same: the corporation is scared to lose money, so changes are made to help make money. Voting with your wallet does work. It just needs to be marketed right. Edit: and I completely forgot the context here, which is that for something like tech, while consumers can have a choice, corporations do too. That’s where the struggle comes in.

    • agent_flounder@lemmy.one · 1 year ago

      Well… I bought an AMD card, I have been using Firefox for a few years now, and I’m not buying anything from Blizzard. There are literally dozens like me… Unfortunately, only a small number of people know these things and have these views and care enough to boycott. These companies will continue to do what they do until there is sufficient pushback (if ever) to make it less profitable than alternatives.

    • Turun@feddit.de · 1 year ago

      No, actually I don’t need to buy the worse product. Privacy considerations are part of the package, just like price and performance are.

      I use Firefox, because in the performance - privacy - price consideration it beats Chrome.

      I have an Nvidia graphics card, because being able to run CUDA applications at home beats AMD.

      • 👁️👄👁️@lemm.ee · 1 year ago

        You don’t have to do anything, but you’re still encouraging this behavior no matter how you choose to look at it. If that doesn’t bother you, then idk why you’re even replying.

        • Turun@feddit.de · 1 year ago

          Simply to provide an alternative perspective.

          I do choose the inferior product in some categories, so I don’t disagree with the above comment in general.

    • metaStatic@kbin.social · 1 year ago

      The good thing about voting with your wallet is that people with more money get more votes, the way god intended.

      • TwilightVulpine@kbin.social · 1 year ago

        Almost like people with more money than sense can outvote everyone else.

        How do you even count “people who didn’t buy product X”? There could be millions more, either out of revolt or sheer disinterest, but that just doesn’t matter for the companies selling a product. The only votes that end up counting are the ones from people buying.

        People really need to drop that saying, because the market never was a democracy and never will be. Hell, companies can even ignore the paying customers and do something else entirely, because the ones who have the most money are the investors.

        • Bizarroland@kbin.social · 1 year ago

          What’s funny is that I vote with my wallet, and I tell my friends about it and they think I’m the weird one for not having a Facebook account, not having insta or Twitter, or shopping at Amazon or Walmart or Chick-fil-A.

          Then I explain it and they say, “that makes sense,” and not 30 minutes later they’re telling me how I should look up somebody on TikTok, which I don’t have, or asking about Windows 11, which I don’t use, or telling me I should buy a Tesla, which I don’t want, and it’s for all the same reasons I keep explaining to them.

          You vote with your wallet. My vote goes for people over countries and corporations.

          As a side effect, countries and corporations have ensured that anyone who doesn’t comply gets ostracized.

    • OrangeJoe@lemm.ee · 1 year ago

      Or maybe they are and you just don’t like how they are voting.

      I’m not saying that’s actually the case, but that point of view always seems to be absent from these types of discussions.

    • cobra89@beehaw.org · 1 year ago

      Not getting enough sales, gotta jack up the price so they make the same amount of money. Seems legit.

  • JokeDeity@lemm.ee · 1 year ago

    I mean, you could also say they’ll stop price gouging when competitors can meet their quality and support level. What’s the alternative?

    • potustheplant@feddit.nl · 1 year ago

      The reason Nvidia has the R&D budget it has is that you buy their cards. AMD is only now matching them on that; they used to have about half the resources.

      • JokeDeity@lemm.ee · 1 year ago

        I mean, I buy my cards second hand because I’m dirt poor, but I still want the best option for my money. It’s a hard sell to convince people to buy inferior products to expand the market.

        • potustheplant@feddit.nl · 1 year ago

          Except that most of Nvidia’s features are borderline gimmicks or won’t have a long lifespan. DLSS will eventually die and we’ll all use FSR, just like what happened with G-Sync vs FreeSync. Ray tracing is still too taxing on cards from all brands to be worth it. RTX Voice (just like AMD’s noise suppression) is not that useful (I don’t even want it). What else do you have that could make an Nvidia card “superior”? The only thing I can think of is that the most recent Nvidia cards are way more energy efficient than AMD’s.

          • Sina@beehaw.org · 1 year ago

            The only thing I can think of is that the most recent nvidia cards are way more energy efficient than amd’s.

            I think power draw is so close to even right now, that it’s not worth talking about.

            edit: yep, I was wrong.

            • potustheplant@feddit.nl · 1 year ago

              I just checked and it’s not thaaat close, but yeah, it’s not a deal breaker. From what I saw, the 7800 XT consumes about 60 W more than a 4070, so Nvidia’s certainly better there.

              • Sina@beehaw.org · 1 year ago

                You are right. I checked some power draw benchmarks and it draws 45–50 W more in regular gaming; it’s quite shocking ~

    • Sina@beehaw.org · 1 year ago

      AMD is a lot better than before both in terms of hardware and software, far better in fact. For people that don’t buy the top of the line card every other year AMD is a real alternative, more so than in a long time.

      • SenorBolsa@beehaw.org · 1 year ago

        I love my 6900 XTH, killer chip. If you don’t expect ray tracing, it’s an absolute monster. I bought it because it was what was available on the shelf, but ultimately I feel like it was the best choice for me. I don’t think I’d buy another Nvidia card for a while with the shit they’ve pulled, and I’d previously bought dozens of EVGA Nvidia cards.

        I just wish FSR 2 could be improved to reduce ghosting. It’s already OK, so any improvement would make it very good.

  • Comment105@lemm.ee · 1 year ago

    All of Lemmy and all of Reddit could comply with this without it making a difference.

    And the last card I bought was a 1060, a lot of us are already basically doing this.

    You have not successfully unionized gaming hardware customers with this post.

    • NattyNatty2x4@beehaw.org · 1 year ago

      Buddy, all of Reddit is hundreds of millions of people each month. If even a small fraction of them build their own PCs, they’d have a massive impact on Nvidia’s sales.

            • NattyNatty2x4@beehaw.org · 1 year ago

              I wasn’t able to find something outlining just the sales of the 4000 series cards, but the first graphic of this link at least has their total desktop GPU sales, which come out to 30.34 million in 2022. Let’s put a fraction of hundreds of millions at 5% of 200 million, to be generous to your argument. That’s 10 million. Then let’s say these people upgrade their GPUs once every 3 years, which is a shorter cycle than the average person’s, but the average person also isn’t buying top-of-the-line GPUs. So 3.33 million a year. 3.33/30.34 is 10.9% of sales.

              So even when we’re looking at their total sales and not specifically just the current gen, assume 200 million reddit users a month when it’s closer to 300, and assume the people willing to shell out thousands of dollars for the best GPU aren’t doing so every single time a new generation comes out, we’re still at 11% of their sales.
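
              For anyone who wants to check the arithmetic, here’s the same back-of-the-envelope estimate as a tiny Python sketch. The user count, the 5% builder share, and the 3-year upgrade cycle are the assumptions from the comment above, not measured figures:

              ```python
              # Back-of-the-envelope: what share of yearly desktop GPU sales could
              # Reddit PC builders represent? All inputs are rough assumptions.
              monthly_users = 200_000_000           # assumed, below the ~300M actual
              builder_share = 0.05                  # assume 5% build their own PCs
              upgrade_cycle_years = 3               # assume a GPU upgrade every 3 years
              desktop_gpus_sold_2022 = 30_340_000   # total desktop GPU sales cited above

              builders = monthly_users * builder_share            # 10,000,000 people
              buyers_per_year = builders / upgrade_cycle_years     # ~3,330,000 GPUs/year
              print(f"{buyers_per_year / desktop_gpus_sold_2022:.1%}")  # ~11.0%
              ```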

  • JackGreenEarth@lemm.ee · 1 year ago

    What other company besides AMD makes GPUs, and what other company makes GPUs that are supported by machine learning programs?

    • Erdrick@beehaw.org · 1 year ago

      I jumped to team red this build.
      I have been very happy with my 7900XTX.
      4K max settings / FPS on every game I’ve thrown at it.
      I don’t play the latest games, so I guess I could hit a wall if I play the recent AAA releases, but many times they simply don’t interest me.

    • PlatinumSf@pawb.social · 1 year ago

      No joke, probably intel. The cards won’t hold a candle to a 4090 but they’re actually pretty decent for both gaming and ML tasks. AMD definitely needs to speed up the timeline on their new ML api tho.

      • JoeCoT@kbin.social · 1 year ago

        Problem with Intel cards is that they’re a relatively recent release, and not very popular yet. It’s going to be a while before games optimize for them.

        For example, the ARC cards aren’t supported for Starfield. Like they might run but not as well as they could if Starfield had optimized for them too. But the card’s only been out a year.

        • Luna@lemmy.catgirl.biz · 1 year ago

          The more people use Arc, the quicker it becomes mainstream and gets optimised for, but Arc is still considered “beta” and slow in people’s minds, even though there have been huge improvements and the old benchmarks don’t hold any value anymore. Chicken-and-egg problem. :/

          Disclaimer: I have an Arc A770 16GB because every other sensible upgrade path would have cost 3x–4x more for the same performance uplift (and I’m not buying an 8GB card in 2023+), but now I’m starting to get really angry at people blaming Intel for “not supporting this new game”. All that GPUs should have to support is the graphics API, to the letter of the specification; all this day-1 patching and driver hotfixing to make games run decently is bs. Games need to feed the API and GPUs need to process what the API tells them to, nothing more, nothing less. It’s a complex issue, and I think Nvidia held the monopoly for too long: everything is optimised for Nvidia at the cost of making it worse for everyone else.

          • dan@upvote.au · 1 year ago

            Isn’t the entire point of DirectX and OpenGL that it abstracts away the GPU-specific details? You write code once and it works on any graphics card that supports the standard? It sounds like games are moving towards what we had in the old days, where they have specific code per graphics card?

            • Luna@lemmy.catgirl.biz · 1 year ago

              I think the issue started with GPU-architecture-tailored technologies like PhysX or GameWorks, but I’m probably wrong. For example, I have nothing against PhysX, but it only runs natively (fast) on Nvidia cores. My issue is when there’s a monetary incentive or an exclusive partnership between Nvidia and game studios, because then, if you want to play the game with all the features, bells and whistles it was designed with, you also need to buy their overpriced (and, this generation, underperforming) GPUs, since you’d be missing out on features or performance on any other GPU architecture.

              If this trend continues everybody will need a €1k+ gpu from nvidia and a €1k+ gpu from AMD and hot-swap between them depending on what game you wish to play.

    • coffeetest@kbin.social · 1 year ago

      My Intel Arc A750 works quite well at 1080p and is perfectly sufficient for me. If people need hyper refresh rates and resolutions and all the bells and whistles, well then, have fun paying for it. But if you need functional, competent gaming, at US$200 Arc is nice.

      • jon@lemmy.tf · 1 year ago

        AMD has ROCm which tries to get close. I’ve been able to get some CUDA applications running on a 6700xt, although they are noticeably slower than running on a comparable NVidia card. Maybe we’ll see more projects adding native ROCm support now that AMD is trying to cater to the enterprise market.
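
        For anyone curious what that looks like in practice, here’s a minimal sketch. ROCm builds of PyTorch reuse the torch.cuda namespace, so CUDA-targeted code often runs unmodified on AMD cards; the HSA_OVERRIDE_GFX_VERSION line is a common community workaround for consumer cards like the 6700 XT that aren’t on the official support list, not an official requirement:

        ```python
        # Minimal check that a ROCm build of PyTorch sees an AMD GPU.
        import os

        # Community workaround (assumption, not official): officially-unsupported
        # consumer chips like the 6700 XT (gfx1031) can report gfx1030 instead.
        # Must be set before the ROCm runtime initializes, i.e. before importing torch.
        os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

        import torch

        print(torch.cuda.is_available())            # True on a working ROCm install
        print(torch.cuda.get_device_name(0))        # e.g. "AMD Radeon RX 6700 XT"

        x = torch.randn(1024, 1024, device="cuda")  # tensor lives on the AMD GPU
        print((x @ x).sum().item())
        ```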

        • Turun@feddit.de · 1 year ago

          They kinda have that, yes. But it was not supported on Windows until this year and is in general not officially supported on consumer graphics cards.

          Still hoping it will improve, because AMD ships more VRAM at the same price point, but ROCm feels kinda half-assed when you look at the official support investment by AMD.

        • meteokr@community.adiquaints.moe · 1 year ago

          I don’t own any Nvidia hardware out of principle, but ROCm is nowhere even close to CUDA as far as mindshare goes. At this point I’d rather just have a CUDA→ROCm shim I can use, in the same way DirectX→Vulkan works with Proton. Trying to fight for mindshare sucks, so trying to get every dev to support it just feels like a massive uphill battle.

    • Dudewitbow@lemmy.ml · 1 year ago

      AMD supports ML; it’s just that a lot of smaller projects are built with CUDA backends and don’t have developers available to switch from CUDA to OpenCL or similar.

      Some of the major ML libraries that used to be built around CUDA, like TensorFlow, have already added non-CUDA backends, but that’s only because TensorFlow is open source, ubiquitous in the scene, and literally has Google behind it.

      ML for more niche uses is basically in a chicken-and-egg situation. People won’t use other GPUs for ML because there’s no dev working on non-CUDA backends, and no one’s working on non-CUDA backends because the devs end up buying Nvidia, which is basically what Nvidia wants.

      There are a bunch of followers but a lack of leaders to move things toward a more open compute environment.
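
      As a rough illustration of what “not hard-coding CUDA” can look like on the application side, a project can probe for whichever backend is present instead of assuming Nvidia. This is a PyTorch-flavoured sketch assuming a recent PyTorch; the fallback order is just an example, not a recommendation from any particular library:

      ```python
      import torch

      def pick_device() -> torch.device:
          """Pick whichever accelerator backend is available instead of
          hard-coding CUDA."""
          if torch.cuda.is_available():          # NVIDIA CUDA, or AMD via ROCm builds
              return torch.device("cuda")
          if torch.backends.mps.is_available():  # Apple GPUs via Metal
              return torch.device("mps")
          return torch.device("cpu")             # portable fallback

      device = pick_device()
      x = torch.randn(8, 16, device=device)      # tensor ends up on whatever was found
      print(f"running on {device}")
      ```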

      • PlatinumSf@pawb.social · 1 year ago

        Huh, my bad. I was operating off of old information. They’ve actually already released the sdk and apis I was referring to.

  • Psythik@lemm.ee · 1 year ago

    Well, when AMD finally catches up with Nvidia and offers a high-end GPU with frame generation and decent ray tracing, I’ll gladly switch. I’d love nothing more than to have an all-AMD PC.

    • bleistift2@feddit.de · 1 year ago

      There are also games that don’t render a square mile of a city in photorealistic quality.

      • Valdair@kbin.social · 1 year ago

        Graphical fidelity has not materially improved since the days of Crysis 1, 16 years ago. The only two meaningful changes since then for how demanding games should be to run are that 1440p and 2160p have become more common, and ray tracing. But consoles being content to run at dynamic resolutions and 30 fps, combined with tools developed to make ray tracing palatable (DLSS), have made developers complacent about letting their games run like absolute garbage even on mid-spec hardware that should have no trouble with 1080p/60fps.

        Destiny 2 was famously well optimized at launch. I was running an easy 1440p/120fps in pretty much all scenarios maxed out on a 1080 Ti. The more new zones come out, the worse performance seems to be in each, even though I now have a 3090.

        I am loving BG3 but the entire city in act 3 can barely run 40fps on a 3090, and it is not an especially gorgeous looking game. The only thing I can really imagine is that maxed out the character models and armor models do look quite nice. But a lot of environment art is extremely low-poly. I should not have to turn on DLSS to get playable framerates in a game like this with a Titan class card.

        Nvidia and AMD just keep cranking the power on the cards, they’re now 3+ slot behemoths to deal with all the heat, which also means cranking the price. They also seem to think 30fps is acceptable, which it just… is not. Especially not in first person games.

        • SenorBolsa@beehaw.org · 1 year ago

          Graphical fidelity has not materially improved since the days of Crysis 1

          I think you may have rose-tinted glasses on this point; the level of detail in environments and the accuracy of shading, especially of dynamic objects, have increased greatly. Material shading has also gotten insanely good compared to what we had then. Just peep the PBR materials on guns in modern FPS games, it’s incredible; Crysis just had normal and specular maps, all black or grey guns that are kinda shiny and normal mapped. If you went inside a small building or whatever, there was hardly any shading or shadowing to make it look right either.

          Crysis was a very clever use of what was available to make it look good, but we can do a hell of a lot better now (without ray tracing). At the time shaders were getting really computationally cheap to implement, so those still look relatively good, but geometry and framebuffer size just did not keep pace at all. Tessellation was the next hotness after that, because it was supposed to help fix the limited geometry horsepower of contemporary cards by using their extremely powerful shader cores to do some of the heavy lifting. Just look at the rocks in Crysis compared to the foliage and it’s really obvious this was the case. Bad Company 2 is another good example of good shaders with really crushingly limited geometry, though there are clever workarounds there to make it still look pretty good.

          I could see the argument that the juice isn’t worth the squeeze to you, but graphics have very noticeably advanced in that time.

        • Thrashy@beehaw.org · 1 year ago

          W/r/t Baldur’s Gate 3, I don’t think the bottleneck is the GPU. Act 3 is incredibly ambitious in terms of NPC density, and AI is one of those things that’s still very hard to parallelize.

      • metaStatic@kbin.social · 1 year ago

        I’m currently part of the problem and this is so fucking true. Games have really stopped pushing the envelope because they either have to be cross platform compatible or they’re not even PC first.

        3DMark is the only thing I could find to put a dent in my 3060 Ti.

    • senoro@lemmy.ml · 1 year ago

      Why would datacenters be buying consumer grade cards? Nvidia has the A series cards for enterprise that are basically identical to consumer ones but with features useful for enterprise unlocked.

      • howrar@lemmy.ca · 1 year ago

        I think you mean their Tesla line of cards? The A (e.g. A100) stands for the generation name (Ampere), and that same name applies to both the consumer line (GeForce and Quadro) and the data centre cards.

        The hardware isn’t identical either. I don’t know all the differences, but I know at least that the data centre cards have SXM connectors that greatly increase data throughput.

  • Jordan Lund@lemmy.one · 1 year ago

    As a console gamer, I don’t have to worry about it. Xbox Series X, PS5, Steam Deck are all AMD based. The Switch is Nvidia, but honestly, I can’t remember the last time I turned the Switch on.

  • raptir@lemdro.id · 1 year ago

    I’m looking at buying a gaming laptop, and I have yet to find anything worth buying with an AMD card.

  • mateomaui@reddthat.com · 1 year ago

    Well, that won’t happen, because they are still the best option for compatibility unless you’re using Linux.

    • Turun@feddit.de · 1 year ago

      Works great for me. I installed the Nvidia package and everything simply works, and the driver is automatically updated when I do a system upgrade.

      And AMD still doesn’t have a solid answer to CUDA on consumer GPUs, as far as I know.

      Edit: works great for me on linux

      • mateomaui@reddthat.com · 1 year ago

        Oh, don’t get me wrong: when Nvidia is an option for Linux it seems to work OK, if maybe with an older driver, but some distros are a pain to get the Nvidia driver installed on, or are designed around AMD, like ChimeraOS. Not sure if you can still add Nvidia to that distro, I haven’t tried yet.

        • Turun@feddit.de · 1 year ago

          Ok, maybe don’t use an OS that is designed around AMD if you have an Nvidia GPU.

          I used Pop!_OS, Ubuntu and Arch (current OS) and it worked great on every single one. I did a driver downgrade on Arch three times now (on average once every 10 months or so), but to be frank I’ve done the same for other software; that’s more an Arch thing than an Nvidia thing.

          It’s also the most up to date driver, at least on arch.

          • mateomaui@reddthat.com · 1 year ago

            yeah, no shit, captain obvious

            There’s also Linux Mint and ZorinOS to name others that have good built-in nvidia support.

            The point of my comments was to highlight how linux doesn’t universally work well with nvidia unless you get a distro that’s more compatible or user friendly with nvidia drivers. I mentioned ChimeraOS solely as an example of one that openly says it doesn’t support nvidia, even though it’s possible you may be able to install it separately.

            Your comments have confirmed what I said: that nvidia generally has the best compatibility [with games, emulators, etc] compared to AMD, unless you’re on linux, at which point you have to go to specific distros or go through the PITA process of making it work, when AMD generally just works.

            So the suggestion that no one should buy Nvidia until they drop prices is simply dead on arrival, because Nvidia is still the most compatible, and the Linux market share where it might be a problem is not that big.

            • Turun@feddit.de · 1 year ago

              at which point you have to go to specific distros or go through the PITA process of making it work, when AMD generally just works.

              Ok, I agree with this point.

              My counterargument is that those “specific distros” make up the vast majority of desktop Linux use. So it’s less that you have to choose a specific distro and more that you have to avoid niche distros.
              Doesn’t invalidate the core of your argument though.

              • mateomaui@reddthat.com · 1 year ago

                I don’t even understand the pushback.

                I’m not shitting on nVidia or linux.

                I’m just pointing out the well-known compatibility issues that are evident if you spend any amount of time browsing a linux support channel, which would be the only solid argument for not buying nvidia cards en masse aside from pricing (or if you wanted to build a hackintosh), if the linux userbase was significant enough or if there weren’t other distros to choose from.

                Otherwise the vast majority of compatibility issues I see for pc gaming or emulation is in regards to AMD cards, so I wouldn’t bother buying one of those, no matter how much more affordable they might be. Just not worth the trouble when nVidia generally works as expected, or driver fixes are delivered faster.

                edit: unless it’s a game bafflingly designed around AMD, like Starfield apparently

  • geosoco@kbin.social · 1 year ago

    In this particular case, it’s a bit more complicated.

    I suspect the majority of 30x0 & 40x0 card sales continue to be for non-gaming or hybrid uses. I suspect that if pure gamers stopped buying them today for months, it wouldn’t make much of a difference to their bottom line.

    Until there’s reasonable competition for training AI models at reasonable prices, people are going to continue buying their cards because it’s the most cost-effective thing – even at the outrageous prices.

  • ono@lemmy.ca · 1 year ago

    Both brands are still doing it, so I’m still not buying.

    Sigh… maybe next year.