haxxiy said:
curl-6 said:

The legal stuff and other work done by AI are plagued with errors though, because LLMs hallucinate and can't differentiate true from false data.

Given the flipside is deepfake revenge/child porn, rampant misinformation, a pretense for suits to lay off workers, slop infesting every corner of the internet, scams, environmental destruction, prices for RAM and stuff going through the roof, a bubble that threatens to crash the economy and more, I'd say the bad still far outweighs any good.

It's amusing that people keep repeating this since they have no idea how much water is used by typical agricultural and industrial processes. A large datacenter consumes roughly as much water as the beef used by six burger joints, and golf courses in the US consume around 25x more water than all datacenters combined. Not to mention the runoff from a datacenter is just warmer water... the runoff from a blue jeans factory will literally kill you.

As for errors, retrieval-augmented generation has by and large reduced the rate of hallucinations in the past year or so. Even in offline mode, the hallucination rate varies considerably across models. They can actually learn to distinguish between what is encoded in their weights and what is not, though no one besides maybe Anthropic is paying much mind to solving it in reinforcement learning.
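For anyone unfamiliar with the term, retrieval-augmented generation just means fetching relevant documents first and telling the model to answer only from them. A minimal sketch of the idea, with an illustrative toy corpus and naive keyword-overlap retrieval standing in for the vector search and LLM call a real system would use:

```python
# Minimal RAG sketch. The corpus, scoring function, and prompt wording
# are all illustrative assumptions, not any particular product's API.

def tokenize(text):
    # Naive tokenizer: lowercase, strip periods, split on whitespace.
    return set(text.lower().replace(".", "").split())

def retrieve(query, corpus, k=2):
    # Rank documents by keyword overlap with the query; real systems
    # use embedding similarity instead.
    scored = sorted(corpus,
                    key=lambda doc: len(tokenize(query) & tokenize(doc)),
                    reverse=True)
    return scored[:k]

def build_prompt(query, passages):
    # Grounding step: instruct the model to answer only from the
    # retrieved context, which is what cuts down hallucinations.
    context = "\n".join(f"- {p}" for p in passages)
    return (f"Answer using ONLY the context below; say 'unknown' otherwise.\n"
            f"Context:\n{context}\nQuestion: {query}")

corpus = [
    "Datacenter cooling systems recycle most of their water.",
    "Golf courses in the US irrigate with large volumes of water.",
    "Blue jeans factories discharge dyed runoff into rivers.",
]
passages = retrieve("How much water do golf courses use?", corpus)
prompt = build_prompt("How much water do golf courses use?", passages)
```

The point is that the model's answer is constrained to the retrieved passages rather than whatever happens to be encoded in its weights, which is why the hallucination rate drops.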

Golf courses or factories being bad for the environment doesn't change the fact that AI is too.

As someone who lives in the suburbs of a major city, I'm already inhaling car smog a lot of the time, but that doesn't mean I should just say "fuck it" and become a pack-a-day smoker as well cos "I'm already damaging my lungs".