100 points | submitted 1 year ago by Vent@lemm.ee to c/datahoarder@lemmy.ml
[-] Moonrise2473@feddit.it 3 points 1 year ago

But I wonder: doesn't the data need to be readable locally? If I mine, say, 1 petabyte of stuff, can I really just upload it somewhere else and forget about it?

Otherwise they could just mine on a disk, wipe it, and start again.

IMHO they found a scapegoat. Everyone (me included) loves to blame crypto bros for anything bad, but I don't see how it could actually work here.

[-] thews@lemmy.world 2 points 1 year ago

Emulate a block device and back it with the cloud API, unless I'm missing something.
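
A minimal sketch of what that could look like, assuming a hypothetical `fetch_range(offset, length)` call against the cloud provider's API (not any real library): a read-only, seekable file object that pulls byte ranges from remote storage on demand, so mining software could treat the remote hoard as if it were a local file.

```python
import io

class CloudBackedFile(io.RawIOBase):
    """Read-only, seekable view of a remote object; every read is fetched on demand."""

    def __init__(self, fetch_range, size):
        # fetch_range(offset, length) -> bytes is a stand-in for a ranged GET
        # against whatever cloud storage API is being pointed at (hypothetical).
        self._fetch = fetch_range
        self._size = size
        self._pos = 0

    def readable(self):
        return True

    def seekable(self):
        return True

    def seek(self, offset, whence=io.SEEK_SET):
        if whence == io.SEEK_SET:
            self._pos = offset
        elif whence == io.SEEK_CUR:
            self._pos += offset
        elif whence == io.SEEK_END:
            self._pos = self._size + offset
        return self._pos

    def read(self, size=-1):
        if size < 0 or size > self._size - self._pos:
            size = self._size - self._pos
        if size <= 0:
            return b""
        data = self._fetch(self._pos, size)  # one network round trip per read
        self._pos += len(data)
        return data
```

The catch is that every read() becomes a network round trip, which is exactly the objection in the reply below.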

[-] Moonrise2473@feddit.it 2 points 1 year ago

Yes, but the data would need to be read constantly, which means downloading petabytes of stuff over and over.

And those mined files would be painfully slow to access over the network.
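
For a sense of scale, a rough back-of-envelope (the 1 PB hoard size and the 1 Gbit/s link are illustrative assumptions, not figures from the article): even a single full pass over remotely stored plots takes on the order of months at that rate, before any repeated reads.

```python
# Rough arithmetic with assumed, illustrative numbers.
plot_bytes = 1e15              # 1 PB of plots kept in the cloud (assumed)
link_bytes_per_second = 125e6  # a fully saturated 1 Gbit/s downlink (assumed)

seconds_per_full_pass = plot_bytes / link_bytes_per_second
print(f"{seconds_per_full_pass / 86_400:.0f} days for a single pass over the data")
# -> roughly 93 days, so constantly re-reading it from the cloud is a non-starter
```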
