
Forums - General - AI art thread/funny creations

70's pimp, ghettofabulous attire at 150%






Greta gives a pouty look; I wonder if she's thinking about her next speech or stunt



Random picture of a family from California

Random picture of a family from Texas

Stereotypical inbred family, lol

Family of Hamas militants

A Mormon's underwear... I was trying to get a Mormon wearing that odd underwear the men have to wear to avoid masturbating, but this is what I got. This thing has belt loops, and I don't know what you'd need those for in underwear... unless those were the shorts he was going to wear in a gay club or something.

A televangelist family... look! There is an angel in the background, or perhaps Gawd!

Last edited by shavenferret - on 22 July 2025

Zkuq said:
Mummelmann said:

This channel is full of hilarious AI-made videos. I suspect some shenanigans with the prompts to make them over the top, but the entertainment is top class! The concert ones are always fun, and the sports ones are hilarious too. Rodeo and football done by this AI channel are next-level silly.

Of course, these aren't my creations, but I felt it was worth sharing anyway.

This is both really impressive and very demonstrative of how AI doesn't actually understand anything at all.

Is it that the AI doesn't understand, or that the person inputting the data didn't fully understand what they wanted? Garbage in, garbage out. I'm not saying the AI model used here would have done better, but you really don't know what was used for this video, so you can't assume the AI didn't understand the assignment. This could have been the desired result.




@shavenferret "Stereotypical inbred family" Stereotypical?

Thanks for nothing, you just sent me down an inbreeding (consanguinity) rabbit hole... I did not know before today that how inbred a person is can be measured as an "inbreeding coefficient". Inbreeding (second cousin or closer) is also far more prevalent in today's society than I would have guessed, concentrated heavily in northern Africa and the Middle East, with rates of 20 to 50% according to the National Library of Medicine.

If stereotypical was one of your prompts I'd say someone was messing with the data.

Last edited by The_Yoda - on 23 July 2025

The_Yoda said:

@shavenferret "Stereotypical inbred family" Stereotypical?

Thanks for nothing, you just sent me down an inbreeding (consanguinity) rabbit hole... I did not know before today that how inbred a person is can be measured as an "inbreeding coefficient". Inbreeding (second cousin or closer) is also far more prevalent in today's society than I would have guessed, concentrated heavily in northern Africa and the Middle East, with rates of 20 to 50% according to the National Library of Medicine.

Look up the inbreeding stats for Pakistan... it's about as bad as Alabama. It's also quite unfortunate that the Quran doesn't prohibit it, so it seems there isn't as much of a reason for Muslims to avoid it.

Finally, have you seen the video of that family that's so inbred they all walk on all fours? I think they might be from Pakistan. The American Whitakers are a hit too; check them out as well.



Machiavellian said:
Zkuq said:

This is both really impressive and very demonstrative of how AI doesn't actually understand anything at all.

Is it that the AI doesn't understand, or that the person inputting the data didn't fully understand what they wanted? Garbage in, garbage out. I'm not saying the AI model used here would have done better, but you really don't know what was used for this video, so you can't assume the AI didn't understand the assignment. This could have been the desired result.

This looks like what seemed typical of earlier AI models in particular, so I'm just going to assume it's the AI doing its best to portray diving without actually understanding how a human body works, especially while diving. My guess is that the training material doesn't include certain camera angles or people at certain angles, so the AI doesn't know exactly what to do, and because it doesn't properly understand people or diving, it guesses incorrectly.

Admittedly I could be wrong, but this seems exactly like what I would expect AI to do when the training data isn't that comprehensive. I'm sure the training data is very impressive, but it's hard to cover absolutely everything.



lonely artistic guy, 42

This one looks like something you'd expect to see on Grindr.



Each state's favorite drug