
I followed these steps, but just so happened to check on my mason jar 3-4 days in and saw tiny carbonation bubbles rapidly rising throughout.

I thought that might just be part of the process, but double-checked with a Google search on day 7 (when there were no bubbles in the container at all).

Turns out I had just grown a botulism culture, and garlic in olive oil specifically is a fairly common way to grow this biotoxin.

Had I not checked on it 3-4 days in I'd have been none the wiser and would have Darwinned my entire family.

Prompt with care and never trust AI dear people...

[-] snooggums@midwest.social 21 points 5 months ago

I am saying that coining it as a term was stupid and intended to make it sound intelligent when it isn't.

[-] dgerard@awful.systems 11 points 5 months ago

oh definitely, it's fucking terrible question-begging. I'd like to know what it traces back to, and how good faith it was or wasn't

[-] acausal_masochist@awful.systems 4 points 4 months ago

It originally comes from false positives in computer vision afaik, where it makes some sense as the model is "seeing" things that aren't in the image.

[-] Dirk@lemmy.ml 1 point 5 months ago

Of course the term is stupid. Neither is an LLM an AI, nor is any AI in its current state intelligent. In the end it all boils down to their being answer machines. Complex ones, but still far away from anything even remotely being an AI.

this post was submitted on 18 Jun 2024
94 points (100.0% liked)

TechTakes
