
[–] BeNotAfraid@lemmy.world 8 points 1 week ago* (last edited 1 week ago) (5 children)

It is basically instantaneous on my 12-year-old Kepler GPU Linux box. It is substantially less impactful on the environment than AI tar pits and other deterrents. The cryptography involved is something almost all browsers from the last 10 years can do natively, but scrapers have to be individually programmed to do it, making it several orders of magnitude beyond impractical for every single corporate bot to be repurposed for, only to then be rendered moot, because it's an open-source project that someone will just update the cryptographic algorithm for. These posts contain links to articles; if you read them you might answer some of your own questions and have more to contribute to the conversation.
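For anyone unfamiliar with what that challenge actually does, here is a minimal sketch of the kind of browser-side proof of work being described: SHA-256 hashing through the WebCrypto API, incrementing a nonce until the digest meets a difficulty target. The challenge format and difficulty rule here are illustrative assumptions, not Anubis's actual protocol.

```typescript
// Illustrative sketch of a browser-side proof-of-work loop. SHA-256 via
// crypto.subtle is available natively in every modern browser, so nothing
// beyond this small loop has to run on the visitor's machine.
async function solveChallenge(challenge: string, difficulty: number): Promise<number> {
  const encoder = new TextEncoder();
  for (let nonce = 0; ; nonce++) {
    const data = encoder.encode(`${challenge}:${nonce}`);
    const digest = new Uint8Array(await crypto.subtle.digest("SHA-256", data));
    // Difficulty here counts leading zero bytes; real schemes often count bits.
    if (digest.slice(0, difficulty).every((b) => b === 0)) {
      return nonce; // submitted back to the server, which verifies it with a single hash
    }
  }
}
```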

[–] koper@feddit.nl 3 points 1 week ago (4 children)

> It is basically instantaneous on my 12-year-old Kepler GPU Linux box.

It depends on what the website admin sets, but I've had checks take more than 20 seconds on my reasonably modern phone. And as scrapers get more ruthless, that difficulty setting will have to go up.

> The cryptography involved is something almost all browsers from the last 10 years can do natively, but scrapers have to be individually programmed to do it, making it several orders of magnitude beyond impractical for every single corporate bot to be repurposed for.

At best these browsers are going to have some efficient CPU implementation. Scrapers can send these challenges off to dedicated GPU farms or even FPGAs, which are an order of magnitude faster and more efficient. This is also not complex; a team of engineers could set it up in a few days.
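To illustrate how little work reproducing the challenge takes outside a browser, here is a rough scraper-side sketch using Node's built-in crypto module (same assumed challenge format as the browser sketch above); a real operation could go further and offload the loop to GPUs or FPGAs.

```typescript
// Scraper-side solver sketch: no browser, no WebCrypto, just native hashing.
import { createHash } from "node:crypto";

function solveChallengeNative(challenge: string, difficulty: number): number {
  for (let nonce = 0; ; nonce++) {
    const digest = createHash("sha256").update(`${challenge}:${nonce}`).digest();
    // Same illustrative difficulty rule: leading zero bytes.
    if (digest.subarray(0, difficulty).every((b) => b === 0)) {
      return nonce;
    }
  }
}
```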

> Only to then be rendered moot, because it's an open-source project that someone will just update the cryptographic algorithm for.

There might be something in changing to a better, GPU-resistant algorithm like Argon2, but browsers don't support those natively, so you would rely on an even less efficient implementation in JS or WASM. Quickly changing details of the algorithm in a game of whack-a-mole could work to an extent, but that would turn this into an arms race. And the scrapers can afford far more development time than the maintainers of Anubis.
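For what an Argon2-based variant might look like, here is a hypothetical sketch assuming the argon2-browser WASM package; the parameters, salt handling, and difficulty rule are all assumptions for illustration, not anything Anubis ships. The memory cost is what makes GPU farms less attractive, but the WASM round trip is also what makes it slower for legitimate visitors.

```typescript
// Hypothetical Argon2 proof-of-work sketch (the argon2-browser WASM package
// and its hash() options are assumed here, not taken from Anubis).
import argon2 from "argon2-browser";

async function solveArgon2Challenge(challenge: string, difficulty: number): Promise<number> {
  for (let nonce = 0; ; nonce++) {
    const result = await argon2.hash({
      pass: `${challenge}:${nonce}`,
      salt: "demo-salt-only",   // a real scheme would derive the salt from the challenge
      time: 2,                  // iterations
      mem: 64 * 1024,           // memory cost in KiB - the GPU-resistant part
      hashLen: 32,
      type: argon2.ArgonType.Argon2id,
    });
    // result.hash is assumed to be a Uint8Array; same illustrative difficulty rule.
    if (result.hash.subarray(0, difficulty).every((b: number) => b === 0)) {
      return nonce;
    }
  }
}
```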

> These posts contain links to articles; if you read them you might answer some of your own questions and have more to contribute to the conversation.

This is very condescending. I would prefer if you would just engage with my arguments.

[–] BeNotAfraid@lemmy.world 1 points 1 week ago* (last edited 6 days ago) (1 children)

> At best these browsers are going to have some efficient CPU implementation.

That means absolutely nothing in the context of what I said, or of any information contained in this article. It does not relate to anything I originally replied to.

> Scrapers can send these challenges off to dedicated GPU farms or even FPGAs, which are an order of magnitude faster and more efficient.

Not what's happening here. Be serious.

> I would prefer if you would just engage with my arguments.

I did; your arguments are bad and you're being intellectually disingenuous.

> This is very condescending.

Yeah, that's the point. Very astute.

[–] koper@feddit.nl 0 points 6 days ago

If you're deliberately belittling me, I won't engage. Goodbye.
