[-] professor_entropy@lemmy.world 6 points 1 year ago* (last edited 1 year ago)

FWIW, it's not clear-cut whether AI-generated data feeding back into further training reduces accuracy or is generally harmful.

Multiple papers have shown that images generated by high-quality diffusion models, mixed with a proportion of real images (30-50%), improve the adversarial robustness of the trained models. Something similar might apply to language modelling.
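As a rough illustration (not the exact setup from those papers), here's a minimal PyTorch sketch of that kind of real/synthetic data mix. The directory paths, the 40% real fraction, and the image size are hypothetical placeholders:

```python
# Minimal sketch: sample training batches that are ~40% real images and
# ~60% diffusion-generated images, assuming two ImageFolder directories.
import torch
from torch.utils.data import ConcatDataset, WeightedRandomSampler, DataLoader
from torchvision import datasets, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

real = datasets.ImageFolder("data/real", transform=tfm)            # hypothetical path
synthetic = datasets.ImageFolder("data/diffusion", transform=tfm)  # hypothetical path

real_fraction = 0.4  # illustrative value inside the 30-50% range mentioned above
combined = ConcatDataset([real, synthetic])

# Weight each sample so a drawn example is real with probability `real_fraction`.
weights = torch.cat([
    torch.full((len(real),), real_fraction / len(real)),
    torch.full((len(synthetic),), (1 - real_fraction) / len(synthetic)),
])
sampler = WeightedRandomSampler(weights, num_samples=len(combined), replacement=True)
loader = DataLoader(combined, batch_size=128, sampler=sampler)

# `loader` can then feed an ordinary (or adversarial) training loop.
```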

[-] professor_entropy@lemmy.world 16 points 1 year ago

Good news everyone!


Quick reminder that all Futurama episodes are available for free on the Internet Archive.

[-] professor_entropy@lemmy.world 0 points 1 year ago

Good point, but the community I like may be on another instance, and its existence would prevent a similar community from growing elsewhere. If I get invested in it, I run the risk of losing access to it.

[-] professor_entropy@lemmy.world 2 points 1 year ago

Why does Lemmy make it look harder than it is? It's not a massive load compared to what modern servers and applications are designed to handle.

I couldn't sign up on Beehaw or lemmy.ml after multiple tries. It feels worse than a simple centralised platform someone could build in a month.

Is there an alternative to Reddit for people like me who don't need this kind of decentralisation (Lemmy feels like centralisation, just multiplied, if any instance can cut off access like this) but who like the text-heavy interface of Lemmy?


What other technological marvels do we have?
