• xerazal@lemmy.zip

      AMD’s EPYC server CPUs would be like a 64-armed Machamp. Mf is huge and requires a hell of a cooler. I see them at the datacenter I work at, and when I opened a server up I thought I was looking at a turbocharged car engine or something.

      • Hot Saucerman@lemmy.ml

        That’s very true, but perhaps I should have specified that this is a very, very old meme (hence “we have come a long way”). Probably 10-15 years old? Back when AMD really was struggling with performance, before they came back with the Ryzen series. EPYC is only about six years old, IIRC.

      • Hexarei@programming.dev

        For a while, the i3 was dual-core with Hyper-Threading, the i5 was quad-core without Hyper-Threading, and the i7 was quad-core with HT.

  • remotelove@lemmy.ca

    I am super happy with my 7950X3D. However, their GPU drivers still need some work for the 7900 XTX.

    • Hot Saucerman@lemmy.ml

      I used to have lots of driver crashing and weirdness on my RX 580, but I’ve had mostly smooth sailing with my 6600 XT.

      • remotelove@lemmy.ca

        To be honest, I only get the driver crash at the absolute worst times now. After switching to AMD from Intel and Nvidia, I did a fresh Windows install and have only had to reinstall the AMD graphics drivers about four times in the last couple of months. (True, but that’s not as bad as it sounds. Annoying, yes. End of the world, no.)

        There is a pattern to the madness, though. If I go from gaming to other GPU-intensive apps running across different screens, it’s probably going to hang the driver. Not fatally, but I reboot anyway when it happens.

        AMD is on the right track, though. I have been through three different GPU driver versions since I built the system, and it is slowly getting better: I get a driver crash about once a week now instead of once a day.

        • HerrBeter@lemmy.world

          I’ve never had AMD drivers crash during normal usage (6700 XT with a water block). Windows sleep mode wrecks my PC and makes it instantly crash, though.

          • remotelove@lemmy.ca

            Sleep mode is rough, for sure. It’s also one of the reasons I did a completely fresh installation of Windows. (Sleep mode was suicide.) I had also heard an obscure rumor that AMD chipset drivers could be picky on old Windows installations. (Like, not enabling the CPU’s 3D V-Cache kind of picky.)

            But yeah, you aren’t alone with the sleep mode woes.

        • xerazal@lemmy.zip

          It’s possible your GPU voltages are too high, i.e. unstable, even at stock. I was having similar problems with my 6800 XT, although they were rare. I undervolted it with MorePowerTool and haven’t had any issues since.
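
          If you want to sanity-check that an undervolt is actually stable rather than just waiting for the next crash, a burn-in loop is the usual trick. Here is a minimal sketch in PyTorch (the function name and durations are my own, and it assumes a CUDA or ROCm build of PyTorch that can see the card):

          ```python
          # Rough undervolt burn-in: repeat an identical matmul and check that
          # the checksum never drifts. The same inputs should give identical
          # results on the same card, so any mismatch (or a driver reset)
          # points at an unstable voltage.
          import time
          import torch

          def burn_in(minutes: float = 10.0, size: int = 4096) -> None:
              assert torch.cuda.is_available(), "no GPU visible to PyTorch"
              a = torch.randn(size, size, device="cuda")
              reference = (a @ a).sum().item()  # checksum from the first pass
              deadline = time.monotonic() + minutes * 60
              iters = 0
              while time.monotonic() < deadline:
                  if (a @ a).sum().item() != reference:
                      raise RuntimeError(f"checksum drift after {iters} iterations")
                  iters += 1
              print(f"stable: {iters} iterations without a mismatch")

          burn_in()
          ```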

  • LetMeEatCake@lemm.ee

    GPU prices being affordable is definitely not a priority of AMD’s. They price everything to be barely competitive with the Nvidia equivalent: 10-15% cheaper for comparable raster performance, but far worse RT performance and no DLSS.

    Which is odd because back when AMD was in a similar performance deficit on the CPU front (Zen 1, Zen+, and Zen 2), AMD had absolutely no qualms or (public) reservations about pricing their CPUs where they needed to be. They were the value kings on that front, which is exactly what they needed to be at the time. They need that with GPUs and just refuse to go there. They follow Nvidia’s pricing lead.

      • LetMeEatCake@lemm.ee

        I agree; it’s just strange from a business perspective too. Obviously the people in charge of AMD feel this is the correct course of action, but they’ve been losing ground for years and years in the GPU space. At least to an outside observer, this approach is not serving them well. Pricing more aggressively today would hurt their margins temporarily, but in such a mindshare-dominated market they need to start growing their market share early. They need people to use their shit and realize it’s fine. They did it with CPUs…

      • justsomeguy345@lemmy.ml

        Something many people overlook is how intertwined Nvidia, Intel, and AMD are. Not only does personnel routinely switch between those companies, but they also have the same top shareholders. There’s no natural competition between them. It’s like a choreographed lightsaber fight where all of them are swinging but none seem to have any intention of hitting flesh; a show to make sure nobody says the m-word.

    • bassomitron@lemmy.world

      100%. Outside of brand loyalty, I simply don’t see any reason to buy AMD’s higher-tier GPUs over Nvidia right now. And that’s coming from a long, long-time AMD fan.

      Sure, their raster performance is comparable at times, but it almost never actually beats similar tiers from Nvidia. And regardless, DLSS virtually nullifies that, especially since the vast majority of games from the last four years or so support it. So I genuinely don’t understand AMD trying to price similarly to Nvidia. Their high-end cards are inferior in almost every objective metric that matters to the majority of users, yet they still ask $1k for their flagship GPU.

      Sorry for the tangent; I just wish AMD would focus on their core demographic. They have phenomenal CPUs and middling GPUs, so target your demographics accordingly, i.e. good-value budget and mid-tier GPUs. They had that market segment on complete lockdown during the RX 580 era, and I wish they’d return to it. Hell, they figured it out with their console APUs: the PS5 and Series X are crazy good value. Maybe their next generation will shift that way in the PC segment.

      • LetMeEatCake@lemm.ee

        It’s especially egregious with high-end GPUs. Anyone paying over $500 for a GPU, let alone $1,000, is someone who wants to enable ray tracing. I don’t get what AMD is thinking at these price points.

        FSR being an open feature is great in many ways, but long-term its hardware-agnostic approach is harming AMD. They need hardware-accelerated upscaling like Nvidia and even Intel have. Give it some similar name (Enhanced FSR or whatever) and make it use the same software hooks, so both versions can run off the same game integration (similar to what Intel did with XeSS).

    • eldenlord@sh.itjust.works

      Not to mention that outside North America, in almost all countries, the AMD GPU is always $100 more expensive than its Nvidia counterpart, making it nonsense to buy any AMD card unless you’re just a fanboy.

  • PrivacyBean@lemmy.dbzer0.com

    On a somewhat related note: would anyone be able to recommend an upgrade/sidegrade option to go AMD from my 3080 Ti? I’ve just been meaning to be done with Nvidia.

    • Blackmist@feddit.uk

      The triple whammy of semiconductor shortage, pandemic and cryptocunts has really fucked PC gaming for a generation. The price is way out of line with the capabilities compared to a PS5.

      I’m still on a 1060 for my PC, and it’s only my G-Sync monitor that saves it. Variable refresh rate really is great for all PC games, tbh. You don’t have to frig about with settings as much because the Opening Bare Area runs at 60 fps but the later Hall of a Million Alpha Effects runs at 30. You just let it rip between 40 and 80: no tearing, and fairly even frame pacing. The old “is this game looking as good as it can on my hardware while still playing smoothly?” question goes away, because you just get extra frames instead and knock the whole thing down one notch when it gets too bad. I’m spending more time playing and less time tweaking, and that can only be a good thing.

      • Raz@lemm.ee

        I’m just clutching my pre-COVID, pre-shortage GTX 1080 Ti, hoping it’ll keep powering through a little longer. Honestly, it’s an amazing card. If it ever dies on me or becomes too obsolete, I’ll frame it and hang it on my wall.

        I just wish AMD cards were better at ray tracing and “work” than Nvidia’s; if they were, I’d have already splurged on one.

  • Alto@kbin.social

    Corporations aren’t your friend.

    My rig is full AMD (5800X/5700 XT), but that’s purely because they happened to be the better value at the time. The second they get a lead in the consumer GPU market (which they likely will, since Nvidia simply doesn’t care about it versus the ML market now), prices are going to rise again.

    And don’t pretend that these prices are anything resembling affordable. That would be when you could get a legitimately mid-range card for ~$150 (RX 580).

    • Spudwart@lemmy.worldOP

      > Corporations aren’t your friend.

      Correct. But AMD is doing things that benefit FOSS and Linux, whereas Nvidia is a menace. Intel is also doing pretty decently; they just need to catch up in terms of driver features.

      • Alto@kbin.social

        Honey, their x80-equivalent cards cost over double what they used to. Stop praising them for doing the bare fucking minimum.

    • BattleBeetle@lemmy.world

      Man, I could use another $200-MSRP mid-range card. I’m also running an RX 5700 XT (for $430!), and it’s probably the first card I’ll use until it dies, unless a reasonably priced mid-range card comes out soon.

      • the post of tom joad@sh.itjust.works

        I have that card too, but I think it was closer to $500 when I bought it. I felt like I got a terrible price, but it was better than what folks after me had to deal with. It’s still a great card, and I hope it outlasts this crazy price gouging.

        • BattleBeetle@lemmy.world

          Mine was literally the last one in stock before prices went wild when COVID started. I was using an RX 480 and didn’t really need to upgrade until Gamers Nexus published a video about the GPU shortage.

          My hope right now is for AMD to keep improving on FSR so this card can stay viable for more years.

    • Hexarei@programming.dev

      Yep. I’m running a Ryzen CPU for the first time as of late last year, because the 5950X was on sale and Intel had no competing option anywhere near the same price. It was 16c/32t from AMD for like $220, or the same core and thread count for $560 from Intel.

  • eldain@feddit.nl

    Wait, does this mean the Adrenalin software is finally out for Linux? Can we undervolt/set fan curves now? The interfaceless free driver is so freaking noisy with my GPU.

    • eldain@feddit.nl

      I checked; you guys are still celebrating the Mesa code that was contributed ages ago -.- Yes, it works and it’s FOSS. But AMD has been lazy on Linux ever since; we get the bare minimum. They don’t beat Nvidia by much, IMHO.
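
      To be fair, the amdgpu kernel driver does expose manual fan control through the standard hwmon sysfs files even without Adrenalin; tools like CoreCtrl are essentially front-ends for it. A rough fan-curve sketch, assuming a single amdgpu card, root privileges, and the documented temp1_input/pwm1 attributes (the curve numbers are arbitrary):

      ```python
      # Crude fan curve over amdgpu's hwmon sysfs interface. Run as root.
      import glob
      import time

      def find_amdgpu_hwmon() -> str:
          for path in glob.glob("/sys/class/hwmon/hwmon*"):
              with open(path + "/name") as f:
                  if f.read().strip() == "amdgpu":
                      return path
          raise FileNotFoundError("no amdgpu hwmon device found")

      hw = find_amdgpu_hwmon()
      with open(hw + "/pwm1_enable", "w") as f:
          f.write("1")  # 1 = manual fan control

      try:
          while True:
              with open(hw + "/temp1_input") as f:
                  temp_c = int(f.read()) / 1000  # reported in millidegrees C
              # Linear curve: 30% duty below 40 C, rising to 100% at 80 C.
              duty = min(max((temp_c - 40) / 40, 0.3), 1.0)
              with open(hw + "/pwm1", "w") as f:
                  f.write(str(int(duty * 255)))  # pwm1 takes 0-255
              time.sleep(2)
      finally:
          with open(hw + "/pwm1_enable", "w") as f:
              f.write("2")  # hand control back to the firmware on exit
      ```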

  • Muz333@lemmy.world

    The fact is DLSS is really good, and weird naming convention aside, DLSS 3.5 (which works on all RTX cards, unlike DLSS 3) looks fantastic.

    I bought my last two cards solely for DLSS support and unless AMD steps up my next card will likely be Nvidia as well.

    • leave_it_blank@lemmy.world

      I tried DLSS for the first time with Control, and it looks weird. Edges and lines are unsharp, sometimes for a second, sometimes longer. It kind of looks off.

      I activated it in-game, is there something else I have to do? Am I missing something?

      • Qualanqui@lemmy.nz

        Have you tried turning off chromatic aberration, motion blur, film grain, and all that other extraneous fluff?

        I have a 3060 and Control is one of my favourite games so I’ve put a lot of time into it with DLSS on but haven’t noticed what you’re describing.

        I tend to fiddle with my settings until it looks and handles well but the aforementioned settings are always turned off first thing.

        • leave_it_blank@lemmy.world

          Yes, everything except film grain; it matches the atmosphere of Control greatly for me.

          Maybe I’m just oversensitive. The game runs at my monitor’s max refresh rate anyway, so it’s not a real problem. Maybe the new DLSS will improve it.

    • Anonymousllama@lemmy.world

      From the tiny amount we’ve seen of it (and what the Digital Foundry guys were able to discuss), it looks like DLSS 3.5 with Ray Reconstruction may actually be a game changer: pretty ray-traced lighting with built-in anti-aliasing and no performance hit. Keen to see how it actually looks in Cyberpunk when it comes out.

  • Kaito@lemmy.world

    AMD is the only real option for those of us using Linux. Nvidia’s weirdness fills up support tickets on Linux forums so regularly it’s not even funny.

    • Numpty@lemmy.ca

      I’ve been using Nvidia with Linux for a VERY long time. Currently I have computers running:

      • GT 1030 - two older PCs
      • RTX 2060 Ti
      • RTX 3050 Ti - laptop

      They are all working fine with openSUSE Tumbleweed. I install openSUSE, add the Nvidia community repo (a couple of clicks), run updates once, and reboot. Everything just works after that. I can count maybe 3 times in the past 6 years that there was any issue at all.

      Now, Ubuntu and derivatives… I’ve had a LOT of issues and weirdness: drivers failing, doing weird things, etc.

    • AnUnusualRelic@lemmy.world

      I’ve been using Linux on my desktop since 1995, have used a lot of Nvidia cards, and have yet to experience that weirdness you speak of.

  • gerryflap@feddit.nl

    My problem when buying my last GPU was that AMD’s answer to CUDA, ROCm, was just miles behind and not really supported on their consumer GPUs. From what I see now, that has changed for the better, but it’s still hard to trust when CUDA is so dominant and mature. I don’t want to reward Nvidia, but I want to use my GPU for some deep learning projects too, and I don’t really have a choice at the moment.
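
    For what it’s worth, the big frameworks hide most of the difference these days: the ROCm build of PyTorch reuses the torch.cuda namespace, so the same training script runs unmodified on either vendor’s card. A minimal sketch (the toy model and sizes are arbitrary):

    ```python
    # Device-agnostic PyTorch: "cuda" maps to an Nvidia GPU on a CUDA build
    # and to an AMD GPU (via HIP) on a ROCm build, with no vendor branching.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    print(f"training on {device}")

    model = torch.nn.Linear(128, 10).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    x = torch.randn(64, 128, device=device)         # dummy batch
    y = torch.randint(0, 10, (64,), device=device)  # dummy labels

    for step in range(100):
        loss = torch.nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"final loss: {loss.item():.4f}")
    ```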

    • Beefalo@midwest.social

      I’ve become more and more convinced that considerations like yours, which I do not understand since I don’t rely on GPUs professionally, have been the main driver of Nvidia’s market share. It makes sense.

      The online gamer talk is that people buy Nvidia for no good reason, that it’s just internet guys refusing to do any real research because they want an excuse to stroke their own egos. This gamer-based GPU market is a loud minority whose video games don’t seem to rely heavily on any particular card features for decent performance or compatibility. Thus the constant idea that people “buy Nvidia for no good reason except marketing.”

      But if AMD cards can’t really handle things like machine learning, then obviously that is a HUGE deficiency. The public probably isn’t certain of its needs when it spends $400 on a graphics card, it just notices that serious users choose Nvidia for some reason. The public buys Nvidia, just in case. Maybe they want to do something they haven’t thought of yet. I guess they’re right. The card also plays games pretty well, if that’s all they ever do.

      If you KNOW for certain that you just want to play games, then yeah, the AMD card offers a lot of bang for your buck. People aren’t that certain when they assemble a system, though, or when they buy a pre-built. I would venture that the average shopper at least entertains the idea that they might do some light video editing, the use case feels inevitable for the modern PC owner. So already they’re worrying about maybe some sort of compatibility issue with software they haven’t bought, yet. I’ve heard a lot of stories like yours, and so have they. I’ve never heard the reverse. I’ve never heard somebody say they’d like to try Nvidia but they need AMD. Never. So everyone tends to buy Nvidia.

      The people dropping the ball are the reviewers, who should be putting a LOT more emphasis on use cases like yours. People are pouring money into labs for exhaustive testing of cooling fans, for fuck’s sake, yet still running the same old gaming benchmarks as if that’s the only thing anyone will ever do with the most expensive component in a modern PC.

      I’ve also heard of some software that just does not work without CUDA. Those differences between cards should be tested and the results made public. The hardware journalism scene needs to stop focusing so hard on damned video games and start covering all the software where Nvidia vs. AMD really does make a difference; maybe it would force AMD to step up its game. At the very least, the gamebros would stop acting like people buy Nvidia cards for no reason except some sort of weird flex.

      No, dummy, AMD can’t run a lot of important shit that you don’t care about. There’s more to this than the FPS count on Shadow of the Tomb Raider.

      • gerryflap@feddit.nl

        Well, the counterpoint is that NVIDIA’s Linux drivers are famously garbage, which also pisses off professionals. From what I see of AMD now with ROCm, it seems like they’ve gone the right way. Maybe they can convince me next time I’m on the lookout for a GPU.

        But overall you’re right, yeah. My feeling is that AMD is competitive with NVIDIA on price/performance, but NVIDIA has broader feature support, both in games and in professional use cases. I do feel like AMD has been steadily improving over the past few years, though. In the gaming world FSR seems almost as ubiquitous as DLSS (or maybe even more so), and ROCm support seems to have grown rapidly as well. Hopefully they keep going, so I’ll have a choice for my next GPU.

    • Anonymousllama@lemmy.world

      It’s a shame there’s no real equivalent to CUDA cores on AMD cards; being able to offload rendering to the GPU and get instant feedback is so important when sculpting (without having to fall back to Eevee).

  • Brkdncr@artemis.camp

    AMD has been a shitshow of a company since their beginning. Don’t believe they wouldn’t be gouging if they could.

  • leave_it_blank@lemmy.world

    I just bought my first Nvidia card since the TNT2. Until now I always looked for the most FPS for the money.

    This time my focus was on energy efficiency, and the AMD cards suck at the moment: the 4070 draws about 200 W, the 6800 about 300 W. AMD really has to fix that.

    Regarding DLSS: I activated it in Control, and it looks… off? Edges seem unsharp, not all the time but often; sometimes only for a second, sometimes longer. I believe it’s the only game I have that supports it, but I’m not impressed.

    To OP: brand loyalty is the worst. Neither Nvidia nor AMD likes you. Get the best value for your money.

    Btw, Nvidia needed an account to let me use their driver. Holy shit, that’s fucked up!

    • BigDaddySlim@lemmy.world

      You don’t necessarily need an account to use the Nvidia drivers, just if you want automatic updates through GeForce Experience. Not saying that’s any better; in fact, it’s almost as shitty. Just wanted to clarify.

      I just used a junk email to make an account for the auto updates.

    • Phishr42@lemmy.world

      > the 4070 draws about 200 W, the 6800 about 300 W. AMD really has to fix that.

      But if you compare cards from the same generation, like the 3070 and 6800, they’re much closer. Nvidia still has the edge, but the 3070’s TGP is 220 W vs. the 6800’s 250 W.