Eh, have you seen some of the bots that just repost reddit content verbatim? Some trash slips through the cracks.
We've got DVDFab bots here on Lemmy now?
Turns out those cameras at the self-serve registers and the customer info from online orders already collate plenty of sellable customer information, and you don't even need to tie a loyalty programme to them.
50% of 0 is still 0, so they're not wrong.
I used a Hisense A5 Pro CC phone for a few months as my daily driver. For books, colour eink is okay at best but yeah, contrast sucks. It pretty much always will with the extra layers of filtering needed for each colour.
Outside of static pages of text and images, you pretty much have to drop the colour depth to garish levels to get a decently responsive user experience. It's a nice idea but really isn't very good in practice.
I have a Milk-V Mars but it really isn't performant enough for any task I have for an SBC. Distro support seems to be a pain too; the provided Debian image is pinned to a 2022 Debian snapshot, so it isn't meant to run against any other repos.
I really do hope things improve. I'm planning on moving over to an RK3588 ARM board for desktop daily driving, but one day I'm hoping a decently affordable RISC-V alternative will turn up.
There already exist plugins for Peertube that allow cryptocurrency integration.
Setting up a Ko-fi is still the best option to get monetisation going on Fedi.
Skull and Bones is already a AAAA according to Ubisoft, so we're already part of the way there.
There's no money in privacy.
Harvesting and selling personal information is practically a continual source of funds with little to no cost. Why spend time and money developing a product with all the data-harvesting elements stripped out to appeal to maybe 5-10% of the market?
It would be nice to see a few more distro options for GPD's machines, especially considering Ubuntu MATE is the only real option for ones like the Win 2. Anything else is a pain to install due to all the weird hardware they use.
They're Ryzen processors with "AI" accelerators, so an LLM can definitely run locally on one of those. Other options are available too, like lower-powered ARM chipsets (RK3588-based boards) with accelerators that might have half the performance but are far cheaper to run; that should be enough for a basic LLM.
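For a sense of what "a basic LLM" on that class of hardware looks like, here's a minimal sketch using llama-cpp-python with a small quantized GGUF model. The model path, context size and thread count are placeholders, and this is plain CPU inference; actually offloading to a Ryzen AI or RK3588 NPU would need the vendor's own runtime.

```python
# Minimal sketch: a small quantized LLM on a low-power board via llama-cpp-python
# (pip install llama-cpp-python). The model path is a placeholder; any small
# quantized GGUF model (roughly 1-3B parameters) is the sort of thing that fits.
from llama_cpp import Llama

llm = Llama(
    model_path="models/small-model-q4.gguf",  # placeholder path to a quantized model
    n_ctx=2048,      # modest context window to keep RAM usage down
    n_threads=4,     # match the board's CPU cores
)

# Plain CPU inference; NPU offload isn't handled here and needs vendor tooling.
result = llm("Explain why small boards can run small language models:", max_tokens=64)
print(result["choices"][0]["text"])
```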