NVIDIA leaves us hanging: No trace of the RTX 50 Super at CES ??
-
For the first time in 5 years, NVIDIA has not announced any new graphics cards at CES 2026. The RTX 50 Super series was rumored to be coming, but it seems the reality is different. These seem to be the main reasons:
-
AI: NVIDIA is focused on selling chips for Artificial Intelligence and has put the gaming market on hold. They would rather spend their fab capacity on server chips than on our humble graphics cards.
-
Memory crisis: There is reportedly a severe shortage of GDDR7 memory (which these Super cards would use), and costs are skyrocketing.
-
The "Plan B": Instead of hardware, they have focused on software, releasing DLSS 4.5 and G-Sync Pulsar.
-
AMD on the prowl: Meanwhile, AMD has shown its teeth with its new Ryzen 9000X3D and has dropped hints about the future Radeon RX 9000.
Original source: Club386 / Tom's Hardware
What do you think? Do you think NVIDIA is forgetting about gamers or is it just a temporary setback due to the lack of components?
Regards!!
-
-
Since the pandemic, Nvidia has gone from hit to hit: first with cryptocurrency mining and then with AI. I don't think it has forgotten about gamers. There are a couple of issues at play here: the business opportunity and the brand image.

On the business side, when shareholders are in charge, money is the only thing that matters. AI is in a bubble that will eventually burst, but while the fever lasts, consumer hardware in general, not just graphics cards, takes a back seat.

On brand image, there is a representative example of what is happening: Micron's abandonment of the Crucial brand. The reasons given are always the same (AI demands everything), but there are also long-term marketing moves here. Micron knows the consumer memory market is dead and has decided it doesn't want to play the game of selling 32GB kits for €500 (when just after the summer they cost little over €100). The current situation has angered a lot of people, and manufacturers have to be very careful with what they offer: some (like Nvidia) decide not to release new products if they can't offer prices that aren't insulting, and others (like Micron) decide to withdraw outright (and will relaunch Crucial, or some other brand, once the storm passes).

@_Neptunno_ said in NVIDIA leaves us hanging: No trace of the RTX 50 Super at CES:

What do you think? Do you think NVIDIA is forgetting about gamers or is it just a temporary setback due to the lack of components?

In summary, I think it's a temporary setback. The AI fever's days are numbered, regardless of whether it has inaugurated a new era that will last until the next disruptive technology. Whether it bursts next week or in five years is the question.
-
Well, that's a good question. As a happy user of AMD in both graphics and processor, I must say that Nvidia has, as of today, the best gaming graphics card there is, and I'm not referring to the RTX 5090, I'm referring to the RTX 5080. The best graphics card from the competition, the RX 9070 XT, competes with the RTX 5070 Ti, and to be fair, the 5070 Ti is faster, although also much more expensive. I have the RX 9070 and although it's a graphics card that blows your mind, it doesn't come close to Nvidia in terms of technology. And mind you, FSR 4.0 is amazing and a huge leap over FSR 3.0, and its Ray Tracing performance is more than enough to be very enjoyable (unlike the RX 6000 card I had before). But even so, Nvidia is ahead.
The only thing Nvidia sucks at is supporting its older graphics cards on Linux. When a card is moved to the Legacy driver, it becomes almost obsolete: Wayland support on those older cards is non-existent, and soon they won't be usable on modern distros. And I, as a Linux user, rule out buying Nvidia for that reason. Meanwhile, the old GCN cards continue to hold their own with Vulkan, thanks to the support the community gives them on Linux and to AMD publishing its drivers as open source under the MIT license.
I don't know, I think Nvidia takes advantage of the circumstances, but it's still a giant that keeps CUDA under a proprietary license and has practically turned it into an industry standard, which gives it an advantage through a de facto monopoly. But they still make the best graphics cards, and they still have the best technologies, the best software, and the best AI tooling. That's a fact...