this post was submitted on 18 Aug 2024
163 points (89.4% liked)

AI Generated Images

all 30 comments
[–] activ8r@sh.itjust.works 77 points 10 months ago (1 children)

Didn't realise what community I was in and got really confused.

[–] Atelopus-zeteki@fedia.io 5 points 10 months ago

It gets funnier the more you look. I love the 'right handed' spoon.

[–] finn_der_mensch@discuss.tchncs.de 38 points 10 months ago (1 children)

I think this is a very good and funny example of how AI makes stuff look right (at first glance, at least) but lacks any sense of what that stuff means.

[–] p03locke@lemmy.dbzer0.com 2 points 10 months ago

Honestly, if it could get this sort of thing right, it would already have enough cognitive function for us to be scared of self-awareness.

Following or designing step-by-step instructions requires a lot of intelligence.

[–] breadsmasher@lemmy.world 20 points 10 months ago (2 children)

I'm a bit stuck on step 3, can anyone help?

[–] wurstgulasch3000@lemmy.world 25 points 10 months ago (3 children)

Do you mean step 3 or step 3 or step 3 or step 8/3 or step 3?

[–] RockaiE@lemmy.world 4 points 10 months ago

Don't forget about steps ɝ, ҙ and Ӟ

[–] Atelopus-zeteki@fedia.io 3 points 10 months ago
[–] Aggravationstation@feddit.uk 6 points 10 months ago (1 children)

Mayonnaise obviously. The secret to a really great sandwich.

[–] metaStatic@kbin.earth 3 points 10 months ago (1 children)

That's Elmer's Glue. Common mistake.

[–] Atelopus-zeteki@fedia.io 2 points 10 months ago

Both so the sandwich sticks together, and for a sandwich that will stick to your ribs.

[–] mannycalavera@feddit.uk 13 points 10 months ago

There was an attempt.

[–] Dave@lemmy.nz 13 points 10 months ago

I don't know why but I chuckled at this more than anything else I've seen on the internet today. Maybe just laughing at my dumb brain trying to work it out before seeing the text and realising it was AI.

[–] Iheartcheese@lemmy.world 10 points 10 months ago
[–] AsakuraMao@moist.catsweat.com 8 points 10 months ago (1 children)

Instructions unclear, penis now trapped in peanut butter jar

[–] Atelopus-zeteki@fedia.io 2 points 10 months ago

I got stuck at that step, too! Welp, this is our life now.

[–] zarkanian@sh.itjust.works 6 points 10 months ago (1 children)

Mmmm, peanut butter, cranberry jelly, and mayo. My favorite!

[–] Marduk73@sh.itjust.works 1 point 10 months ago (1 children)

I'm hoping the mayo is cake frosting. It would be too sweet, but at least it matches the ingredients better.

[–] Numuruzero@lemmy.dbzer0.com 1 point 10 months ago

I came to the conclusion that it was a delightful meringue

[–] cmhe@lemmy.world 6 points 10 months ago* (last edited 10 months ago) (4 children)

A lot of 3s and no 7. Does AI have a bias in which numbers it creates?

Like, if I generate 1000 pictures each with a number between 0 and 9, will those numbers be distributed equally, or what would the distribution look like?

Humans, when asked to say random numbers, also show biases in some circumstances, so I guess AI does too.

[–] hinterlufer@lemmy.world 5 points 10 months ago

When I asked Gemini to randomly arrange the numbers between 4 and 27, it spat out a seemingly correct list of numbers, except that 23 was randomly missing.

[–] jacksilver@lemmy.world 2 points 10 months ago

LLM-based technology has been shown to have biases in randomness; there was an article a while back experimenting with coin flips that showed a lack of true randomness.

It's because it's all token prediction, so there is a forced priority behind the scenes, whether or not that is visible to the user.

It's the same reason why, when you ask an image generator to create "a person from India", you get a man in a turban the majority of the time.
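
A minimal sketch of that token-prediction point, in Python with made-up numbers (the logits below are invented for illustration, not taken from any real model): sampling digits from a skewed softmax distribution produces "random" numbers that are anything but uniform.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logits for the digit tokens 0-9; a real model's logits are
# shaped by its training data, where some digits show up far more often.
logits = np.array([1.0, 1.2, 1.1, 1.0, 1.0, 1.3, 1.0, 2.5, 1.0, 1.1])

# Softmax turns logits into sampling probabilities (the "forced priority").
probs = np.exp(logits) / np.exp(logits).sum()

# "Generate 1000 numbers between 0 and 9" by sampling those probabilities.
samples = rng.choice(10, size=1000, p=probs)
counts = np.bincount(samples, minlength=10)

for digit, n in enumerate(counts):
    print(digit, n)  # far from the ~100 each that a fair draw would give
```

With these invented logits, digit 7 dominates the tally, while an unbiased generator would give each digit roughly 100 of the 1000 draws.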

[–] OpenStars@discuss.online 1 points 10 months ago

4 also appears 3 times, but that number 3 isn't always a number 3 - especially the bottom right kinda looks like a negative 3, and the one left of it...

[–] lnxtx@feddit.nl 3 points 10 months ago (1 children)
[–] hemko@lemmy.dbzer0.com 5 points 10 months ago (1 children)
[–] lugal@sopuli.xyz 2 points 10 months ago

The secret ingredient

Guys, is this Loss?

[–] unemployedclaquer@sopuli.xyz 0 points 10 months ago

How could an AI know you shouldn't combine PB with J? Unless the J is homemade preserves. Did anyone tell the AI about homemade fucking preserves? Garbage in, garbage out. Y'all!