They have enough datacenters buying their products; they don't need consumers, so they can afford to wait longer than the average user…
I personally am done with gaming, because if I have to spend a thousand euros, I'll spend it on bicycle parts and accessories and not on a single piece of hardware for a decent PC.
There are also games that don't render a square mile of a city in photorealistic quality.
Graphical fidelity has not materially improved since the days of Crysis 1, 16 years ago. The only two meaningful changes in that time affecting how demanding games should be to run are that 1440p & 2160p have become more common, and raytracing. But consoles being content to run at dynamic resolutions and 30fps, combined with the tools developed to make raytracing palatable (DLSS), have made developers complacent about letting their games run like absolute garbage even on mid-spec hardware that should have no trouble hitting 1080p/60fps.
Destiny 2 was famously well optimized at launch. I was running an easy 1440p/120fps in pretty much all scenarios maxed out on a 1080 Ti. The more new zones come out, the worse performance seems to be in each, even though I now have a 3090.
I am loving BG3, but the entire city in act 3 can barely hold 40fps on a 3090, and it is not an especially gorgeous-looking game. The only thing I can really point to is that, maxed out, the character and armor models do look quite nice. But a lot of the environment art is extremely low-poly. I should not have to turn on DLSS to get playable framerates in a game like this with a Titan-class card.
Nvidia and AMD just keep cranking up the power on the cards; they're now 3+ slot behemoths to deal with all the heat, which also means cranking up the price. They also seem to think 30fps is acceptable, which it just… is not. Especially not in first-person games.
Graphical fidelity has not materially improved since the days of Crysis 1
I think you may have rose-tinted glasses on this point; the level of detail in environments and the accuracy of shading, especially of dynamic objects, have increased greatly. Material shading has also gotten insanely good compared to what we had then. Just peep the PBR materials on guns in modern FPS games, it's incredible. Crysis just had normal and specular maps: all black or grey guns that are kinda shiny and normal-mapped. If you went inside a small building or whatever, there was hardly any shading or shadows to make it look right either.
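To make the material-shading contrast concrete, here is a rough, heavily simplified sketch (Python, scalar values only, diffuse term and plenty of real-engine detail omitted; the function and parameter names are mine, not from any particular engine) of a Crysis-era specular-map highlight next to the Cook-Torrance/GGX-style specular term that modern PBR pipelines are built around:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

# Older style: an artist-painted specular map just scales a Blinn-Phong highlight.
def blinn_phong_specular(N, L, V, spec_map, shininess=32.0):
    H = normalize(tuple(l + v for l, v in zip(L, V)))
    return spec_map * max(dot(N, H), 0.0) ** shininess

# PBR style: Cook-Torrance specular with a GGX distribution, Schlick Fresnel and
# Smith geometry term, driven by physically meaningful roughness/metalness values.
def ggx_specular(N, L, V, albedo, roughness, metallic):
    H = normalize(tuple(l + v for l, v in zip(L, V)))
    NdotL = max(dot(N, L), 0.0)
    NdotV = max(dot(N, V), 0.0)
    NdotH = max(dot(N, H), 0.0)
    HdotV = max(dot(H, V), 0.0)

    # Normal distribution function (GGX / Trowbridge-Reitz).
    a2 = (roughness * roughness) ** 2
    d = NdotH * NdotH * (a2 - 1.0) + 1.0
    D = a2 / (math.pi * d * d)

    # Fresnel-Schlick: dielectrics reflect ~4%, metals tint reflection by albedo.
    F0 = 0.04 * (1.0 - metallic) + albedo * metallic
    F = F0 + (1.0 - F0) * (1.0 - HdotV) ** 5

    # Smith masking/shadowing with the Schlick-GGX approximation.
    k = (roughness + 1.0) ** 2 / 8.0
    G = (NdotV / (NdotV * (1.0 - k) + k)) * (NdotL / (NdotL * (1.0 - k) + k))

    return D * F * G / (4.0 * NdotV * NdotL + 1e-7)

# Example: light and view directions both 45 degrees off the surface normal.
N = (0.0, 0.0, 1.0)
L = normalize((0.0, 1.0, 1.0))
V = normalize((0.0, -1.0, 1.0))
print(blinn_phong_specular(N, L, V, spec_map=0.5))
print(ggx_specular(N, L, V, albedo=0.7, roughness=0.3, metallic=1.0))
```

The practical difference is that roughness and metalness mean the same thing under any lighting, so a gun authored once holds up in every scene, whereas a painted specular map only really looks right in the lighting it was tuned for.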
Crysis made very clever use of what was available to make it look good, but we can do a hell of a lot better now (without raytracing). At the time shaders were getting really computationally cheap to implement, so those still look relatively good, but geometry and framebuffer size just did not keep pace at all. Tessellation was the next hotness after that, because it was supposed to help fix the limited geometry horsepower contemporary cards had by using their extremely powerful shader cores to do some of the heavy lifting. Just look at the rocks in Crysis compared to the foliage and it's really obvious this was the case. Bad Company 2 is another good example of good shaders with really crushingly limited geometry, though there are clever workarounds there that make it look pretty good still.
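For what it's worth, here's a toy CPU-side sketch of the idea behind tessellation plus displacement; it is nothing like real hull/domain shader code, just an illustration of amplifying coarse geometry on the fly:

```python
import math

# Toy illustration of hardware tessellation: keep the authored mesh coarse,
# subdivide it on the fly, then displace the new vertices with a height
# function. On a GPU this work lands on the shader cores, which is why it was
# pitched as a way around limited geometry throughput.
def midpoint(a, b):
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

def subdivide(tri):
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels, displace):
    for _ in range(levels):
        tris = [small for tri in tris for small in subdivide(tri)]
    return [tuple(displace(v) for v in tri) for tri in tris]

# One flat triangle becomes 64, bumped along its normal (the z axis here)
# by a simple sine-based stand-in for a height map.
coarse = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
bump = lambda v: (v[0], v[1], 0.1 * math.sin(6 * v[0]) * math.sin(6 * v[1]))
dense = tessellate(coarse, levels=3, displace=bump)
print(len(coarse), "->", len(dense), "triangles")
```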
I could see the argument that the juice isn’t worth the squeeze to you, but graphics have very noticeably advanced in that time.
Why would datacenters be buying consumer-grade cards? Nvidia has the A-series cards for enterprise that are basically identical to the consumer ones but with the enterprise-oriented features unlocked.
They don't, but Nvidia diverts some of its production towards gaming chips. If nobody buys graphics cards, they can easily keep a stash of those chips to sell down the line and go 100% on datacenter products.