CaptainExplosion said:
G2ThaUNiT said:

Nvidia, for the most part, has been able to avoid layoffs. I believe they laid off around 300 people at the beginning of the year, but that's pretty much been it for who knows how many years. Nvidia is a hardware company, though, unlike Microsoft or Google, which are primarily software companies. They need their developers and engineers because their AI focus is on hardware and computing, whereas Microsoft and Google are going after AI in software, competing with the likes of ChatGPT.

Nvidia is also a much smaller company than most tech giants. Microsoft and Google each have around 200,000 employees, and Apple has over 160,000. Nvidia has only 36,000. They haven't overhired into positions that aren't needed, which has helped them stay comfortably profitable (clearly, given a $4 trillion market cap) to the point that they don't need layoffs.

How long that momentum lasts? Who knows.

Well, I wanna see all generative AI servers crash. They're putting human artists, writers, and voice actors in economic turmoil.

As well as turning up the heat even more:

https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117

Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, partly driven by the demands of generative AI. Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022. This would have made data centers the 11th largest electricity consumer in the world, between the nations of Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development.

By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours (which would bump data centers up to fifth place on the global list, between Japan and Russia).
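
As a rough sanity check on those numbers (the arithmetic here is mine, not the article's), the growth rates work out like this:

# Back-of-envelope check of the quoted figures; numbers from the article, math is mine.
na_mw_2022, na_mw_2023 = 2688, 5341                  # North American demand, megawatts
print(f"NA growth 2022->2023: {(na_mw_2023 - na_mw_2022) / na_mw_2022:.0%}")  # ~99%, nearly doubled

twh_2022 = {"Saudi Arabia": 371, "data centers": 460, "France": 463}
for name, twh in sorted(twh_2022.items(), key=lambda kv: kv[1]):
    print(f"{name}: {twh} TWh")                      # data centers land between the two nations

print(f"Projected 2026 vs 2022: {1050 / 460:.1f}x")  # ~2.3x growth in four years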


While electricity demands of data centers may be getting the most attention in research literature, the amount of water consumed by these facilities has environmental impacts, as well.

Chilled water is used to cool a data center by absorbing heat from computing equipment. It has been estimated that, for each kilowatt-hour of energy a data center consumes, it would need two liters of water for cooling, says Bashir.
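
To put that ratio in perspective, here's a quick hypothetical calculation; the 100 MW facility size is my own assumption, not a figure from the article:

# Hypothetical cooling-water demand for a 100 MW data center running year-round.
# The 2 L/kWh ratio is from the article; the facility size is an assumed example.
facility_mw = 100
kwh_per_year = facility_mw * 1000 * 24 * 365         # 876 million kWh per year
liters_per_year = kwh_per_year * 2                   # 2 liters of water per kWh
print(f"{liters_per_year / 1e9:.2f} billion liters per year")  # ~1.75 billion liters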


There are also environmental implications of obtaining the raw materials used to fabricate GPUs, which can involve dirty mining procedures and the use of toxic chemicals for processing.

Market research firm TechInsights estimates that the three major producers (NVIDIA, AMD, and Intel) shipped 3.85 million GPUs to data centers in 2023, up from about 2.67 million in 2022. That number is expected to have increased by an even greater percentage in 2024.
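
For what it's worth, that works out to roughly 44% year-over-year growth (my arithmetic, based on the quoted shipment figures):

# Year-over-year growth in data center GPU shipments, per the TechInsights numbers above.
shipped_2022, shipped_2023 = 2.67e6, 3.85e6
print(f"2022 -> 2023 growth: {(shipped_2023 - shipped_2022) / shipped_2022:.0%}")  # ~44%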


But hey, maybe AI will be able to predict the next flood/tornado/hurricane/drought more accurately...