Fair warning that this would chew through a ton of bandwidth if you run it often, so only do it if you don't have bandwidth caps.
It really depends. Once every 1-5 minutes, sure, that could add up. Once every 1-5 hours, though? You're likely fine.
True, although once per hour would still be a lot of data.
For example, a single fast.com test uses about 1.5 GB of data for me, so that's around 1 TB per month if run hourly.
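Quick sanity check on that figure (the 1.5 GB per test is just my own observation, so treat it as an estimate):

```python
# Back-of-the-envelope data budget for hourly speed tests.
GB_PER_TEST = 1.5                      # rough per-test cost observed on fast.com
tests_per_month = 24 * 30              # hourly, over a 30-day month
print(GB_PER_TEST * tests_per_month)   # 1080.0 GB, i.e. roughly 1 TB/month
```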
Once every six hours would only be about 180 GB a month. A script that runs the test every six hours, but increases the frequency if the result drops below a certain threshold, could work well; something like the sketch below. I guess it all depends on how accurate you need the data to be.
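A minimal sketch of that idea, assuming the third-party speedtest-cli Python package (`pip install speedtest-cli`) as the measurement backend; the threshold and intervals are placeholders, and you could swap in a fast.com client if you'd rather measure against that:

```python
#!/usr/bin/env python3
# Sketch of the "every six hours, more often when slow" scheduler.
# Assumes the speedtest-cli package (speedtest module); threshold and
# intervals are made-up example values, not recommendations.
import time
import speedtest

NORMAL_INTERVAL = 6 * 60 * 60   # seconds; ~4 tests/day, ~180 GB/month at 1.5 GB/test
FAST_INTERVAL = 60 * 60         # tighten to hourly while the link looks degraded
THRESHOLD_MBPS = 50             # below this, start sampling more often

def measure_download_mbps() -> float:
    st = speedtest.Speedtest()
    st.get_best_server()
    return st.download() / 1_000_000  # bits/s -> Mbit/s

while True:
    mbps = measure_download_mbps()
    print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {mbps:.1f} Mbit/s", flush=True)
    # Back off to the six-hour schedule when things look healthy,
    # otherwise keep sampling hourly until the speed recovers.
    time.sleep(FAST_INTERVAL if mbps < THRESHOLD_MBPS else NORMAL_INTERVAL)
```

The trade-off is the one discussed above: the extra samples only cost you data while the connection is actually below the threshold.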