
Nvidia becomes the world's first $4 trillion company, CEO world's 18th richest person

G2ThaUNiT said:
Alex_The_Hedgehog said:

All my GPUs so far were from Nvidia, and I have no complaints.

My current GPU is a 3050, and so far it's working really well; I've even managed to run some more demanding games without major issues. And it wasn't very expensive.

That's when Nvidia was really starting to take AI seriously. Their most recent 50 series cards are barely a bump horsepower wise compared to the 40 series cards. The big upgrades lie in the AI tech. 

Yeah, I heard the 50 series is not worth the upgrade, especially if you have a high-end 30 or 40 series.



Alex_The_Hedgehog said:
G2ThaUNiT said:

That's when Nvidia was really starting to take AI seriously. Their most recent 50 series cards are barely a bump horsepower wise compared to the 40 series cards. The big upgrades lie in the AI tech. 

Yeah, I heard the 50 series is not worth the upgrade, especially if you have a high-end 30 or 40 series.

Yeah, and because of Nvidia's overfocus on their AI tech, even on high end 50 series cards, you can't run graphically demanding games at max settings at 60fps. Hell, not even 30fps in some cases. Like, you HAVE to use the AI tech if you want competent performance. Which is terrible to see. 



You called down the thunder, now reap the whirlwind

G2ThaUNiT said:
Alex_The_Hedgehog said:

Yeah, I heard the 50 series is not worth the upgrade, especially if you have a high-end 30 or 40 series.

Yeah, and because of Nvidia's overfocus on their AI tech, even on high end 50 series cards, you can't run graphically demanding games at max settings at 60fps. Hell, not even 30fps in some cases. Like, you HAVE to use the AI tech if you want competent performance. Which is terrible to see. 

It's not really an overfocus on AI; the market there is very real for them. They will continue to grow exponentially, considering every major US corporation is investing in AI, which relies on both their hardware and their proprietary CUDA software.

Gamers just have to accept that their future might lie more with AMD and Intel moving forward.

Nvidia will probably keep making consumer GPUs since the brand loyalty towards them is still ridiculous. If they have the spare parts to scrape together, it's just free money on the table. But they're never going to offer great value to the consumer when they know they can coast on their high brand awareness and the scarcity of their products.



Shaunodon said:

Gamers just have to accept that their future might lie more with AMD and Intel moving forward.

Not only that, but neural rendering is almost certainly going to be the future of game rendering, and AMD/Intel would do well to invest heavily (maybe in cooperation with each other) in an open-source SDK that enables it on their GPUs if they want to be competitive with Nvidia. So far AMD's position (until very recently) has been to hope that deep-learning-assisted game rendering will be a fad, and Intel is even less forward-thinking than AMD.



sc94597 said:
Shaunodon said:

Gamers just have to accept that their future might lie more with AMD and Intel moving forward.

Not only that, but neural rendering is almost certainly going to be the future of game rendering, and AMD/Intel would do well to invest heavily (maybe in cooperation with each other) in an open-source SDK that enables it on their GPUs if they want to be competitive with Nvidia. So far AMD's position (until very recently) has been to hope that deep-learning-assisted game rendering will be a fad, and Intel is even less forward-thinking than AMD.

AMD and Intel will have to take AI more seriously moving forward, especially since their software counterparts don't even come close to Nvidia's CUDA.

They won't be able to shake Nvidia's monopoly on the greater AI development front, but they can at least give Nvidia some competition there, while offering a lot more value to gamers down here.



CaptainExplosion said:
G2ThaUNiT said:

Nvidia, for the most part, has been able to avoid layoffs. I believe they laid off like 300 at the beginning of the year, but that's pretty much been it for idk how many years. Nvidia is a hardware company though, unlike the likes of Microsoft or Google, who are primarily software companies. So they need the developers and engineers for their hardware, since their AI focus is on the hardware and computing side, whereas Microsoft and Google are going after AI in software, competing with the likes of ChatGPT.

Nvidia is also a lot smaller of a company than most tech giants. Microsoft and Google have around 200,000 employees. Apple has over 160,000. Nvidia only has 36,000. They don't overhire for positions that aren't needed. So it's helped them stay comfortably profitable (clearly, with a $4 trillion market cap) to the point that they don't need layoffs.

How long that momentum lasts? Who knows.

Well I wanna see all generative AI servers crash. They're putting human artists, writers and voice actors in economic turmoil.

As well as turning up the heat even more

https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117

Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, partly driven by the demands of generative AI. Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022. This would have made data centers the 11th largest electricity consumer in the world, between the nations of Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development.

By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours (which would bump data centers up to fifth place on the global list, between Japan and Russia).


While electricity demands of data centers may be getting the most attention in research literature, the amount of water consumed by these facilities has environmental impacts, as well.

Chilled water is used to cool a data center by absorbing heat from computing equipment. It has been estimated that, for each kilowatt hour of energy a data center consumes, it would need two liters of water for cooling, says Bashir.


There are also environmental implications of obtaining the raw materials used to fabricate GPUs, which can involve dirty mining procedures and the use of toxic chemicals for processing.

Market research firm TechInsights estimates that the three major producers (NVIDIA, AMD, and Intel) shipped 3.85 million GPUs to data centers in 2023, up from about 2.67 million in 2022. That number is expected to have increased by an even greater percentage in 2024.


But hey, maybe AI will be able to predict the next flood/tornado/hurricane/drought more accurately...
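
For a sense of scale, here's a quick back-of-the-envelope sketch (Python) using only the figures quoted above; the ~2 L/kWh cooling estimate and the 460 TWh global total are the article's own numbers, so treat the outputs as rough implications rather than measurements:

```python
# Rough back-of-the-envelope math using only the figures quoted above.
# All inputs are the article's own estimates, not measured values.

na_power_2022_mw = 2_688   # North American data center power demand, end of 2022 (MW)
na_power_2023_mw = 5_341   # same, end of 2023 (MW)
growth = na_power_2023_mw / na_power_2022_mw - 1
print(f"North American demand growth, 2022 -> 2023: {growth:.0%}")  # roughly doubled (~99%)

global_2022_twh = 460      # global data center electricity consumption in 2022 (TWh)
cooling_l_per_kwh = 2      # article's estimate: ~2 liters of cooling water per kWh

kwh = global_2022_twh * 1e9                 # 1 TWh = 1e9 kWh
water_liters = kwh * cooling_l_per_kwh
print(f"Implied cooling water at 2 L/kWh: {water_liters / 1e9:.0f} billion liters per year")
```

In other words, North American demand roughly doubled in a single year, and the 2022 global consumption alone would imply on the order of 900+ billion liters of cooling water.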






SvennoJ said:
CaptainExplosion said:

Well I wanna see all generative AI servers crash. They're putting human artists, writers and voice actors in economic turmoil.

As well as turning up the heat even more

We're really good at finding ways to destroy our own livelihood just in the hopes of making it better.

But oh well, these corpos aren't about sustainability because chasing that dollar is simply the name of their game ...



Switch Friend Code : 3905-6122-2909