this post was submitted on 20 Mar 2025
362 points (99.7% liked)

Open Source

All about open source! Feel free to ask questions and share news and interesting stuff!

Community icon from opensource.org, but we are not affiliated with them.

beeng@discuss.tchncs.de 14 points 1 day ago

You'd think these centralised LLM search providers, e.g. Perplexity or Claude, would be caching a lot of this stuff.

droplet6585@lemmy.ml 39 points 1 day ago

There are two prongs to this:

  1. Caching is an optimization strategy used by legitimate software engineers. AI dorks are anything but.

  2. Crippling information sources outside of service means information is more easily "found" inside the service.

So if it was ever a bug, it's now a feature.
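The caching the parent comment has in mind doesn't even require anything exotic: HTTP already provides revalidation via `ETag` / `Last-Modified` validators. A minimal sketch of how a polite crawler could use them (the in-memory `cache` dict and the flow are illustrative assumptions, not any provider's actual implementation):

```python
# Sketch of HTTP revalidation caching for a crawler.
# The cache is a plain dict; a real crawler would persist it.
cache = {}  # url -> {"etag": ..., "last_modified": ..., "body": ...}

def conditional_headers(url):
    """Build If-None-Match / If-Modified-Since headers from a cached entry."""
    entry = cache.get(url)
    if not entry:
        return {}  # nothing cached yet: plain unconditional GET
    headers = {}
    if entry.get("etag"):
        headers["If-None-Match"] = entry["etag"]
    if entry.get("last_modified"):
        headers["If-Modified-Since"] = entry["last_modified"]
    return headers

def handle_response(url, status, headers, body):
    """On 304 Not Modified, reuse the cached body; on 200, refresh the cache."""
    if status == 304:
        return cache[url]["body"]
    cache[url] = {
        "etag": headers.get("ETag"),
        "last_modified": headers.get("Last-Modified"),
        "body": body,
    }
    return body
```

With this in place, recrawling an unchanged page costs the origin server a header-only 304 response instead of a full page render, which is exactly the optimization the comment says these crawlers skip.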

jacksilver@lemmy.world 16 points 1 day ago

Third prong: constantly looking for new information. Yeah, most of these sites may be basically static, but it's probably cheaper and easier to just recrawl everything constantly.
