
Tegra & later Shield were attempts to get closer to a full end-user platform. The Nintendo Switch is their most successful such device, and it launched with a two-year-old Tegra SKU. But going full force into consumer tech is a distraction for them right now. Even the enthusiast graphics market, which should be high margin, is losing their interest. They make much more selling to the big enterprise customers CEO Jensen mentions in the opening paragraph.


Gamers are going to be so pissed. They subsidized the advances in GPU compute and will now be ignored in favor of the much more lucrative enterprise AI customers.

Nvidia is making the right call, of course.


Have they ever considered that the subsidy goes the other way? The margins on an A100 card are probably 100% higher than on an RTX 4090. Gaming industry is also like THE first industry to be revolutionized by AI. Current stuff like DLSS and AI-accelerated path tracing are mere toys compared to what will come.

Nvidia will not give up gaming. When every gamer has an Nvidia card, every potential AI developer to spring up from those gamers will use Nvidia by default. It also helps that gaming GPUs are still lucrative.


> Gaming industry is also like THE first industry to be revolutionized by AI.

That's a great counterpoint.

> Nvidia will not give up gaming. When every gamer has an Nvidia card, every potential AI developer to spring up from those gamers will use Nvidia by default. It also helps that gaming GPUs are still lucrative.

Another. But Nvidia will have a lot of balancing to do and some very thirsty competitors. Though if competition arises, that too is good for gamers.


Nah, it's volume sales & the ability to bin. The A100 would be a much more expensive product if they couldn't sell partially defective chips as consumer GPUs. Pretty sure the R&D cost of the workstation cards means those cards are technically sold at a loss, with Nvidia knowing that consumer sales will make up for it.


It's OK, gaming is also having its AI moment.

I fully expect future rendering techniques to lean heavily on AI for the final scene. NeRF, diffusion models, et cetera are the thin end of the wedge.


Gamers are in heaven right now. Used 30-series cards are cheap as dirt, keeping the pressure on Intel/AMD/Apple to price their GPUs competitively. The 40-series cards are a hedged bet against anything their competitors can develop: manufactured at great cost on TSMC's 4nm node and priced out of reach for most users. Still, it's clear that Nvidia isn't holding back their best stuff, just charging exorbitant amounts for it.


Where are these cheap-as-dirt 30-series cards? A 10 GB 3080 is still over $500 USD used ($750 AUD) whenever I've looked. When did secondhand GPUs that still cost the same as a brand-new PS5 start to be considered cheap?


A PS5 is _significantly_ slower than a 3080; it's more RTX 2070 tier.


Sure, but the 3080 is a single component, while the PS5 includes everything needed to run the game. The way GPU prices have inflated over the past couple of generations has been absurd.


Yes, but it is a complete gaming system. My point is $500 is still a lot of money for a used GPU that is nearly 3 years old. It retailed brand new for $699. So yes, prices were crazy over that period for a variety of reasons, but that shouldn't shift the gaming value proposition so dramatically.


> My point is $500 is still a lot of money for a used GPU that is nearly 3 years old.

Yeah. It's good hardware. You can get cheaper cards (even cost-competitive options) on PC, but Nvidia won't sell them to you. Especially not now that they've got 10 billion dollars on their TSMC tab.



