Mozilla lays off 60 people, wants to build AI into Firefox
(arstechnica.com)
why the fuck would I need an AI in a browser? 0 fucks given for this "feature". Firefox is devolving into another Edge.
Nowadays we are supposed to need AI everywhere. I'm waiting for my AI bidet so that I can chat with it when I do my business.
"What is my purpose?"
"You wash my asshole when I poop"
"Oh my god"
Edge? Think Clippy.
"It looks like you are browsing porn. Can I help?"
That could be useful... Find more X person/stuff....
I think an AI that finds porn across the entire web matching specific search criteria, or working from an example, would be a hit.
You already have AI in Firefox - local translations for example. Developing local AI aligns perfectly well with Mozilla's goals, but it seems people panic as soon as they see the two letters together.
Desperate to gain market share, they're going full Samsung-to-Apple. I hate it, and I'll have no other options left once Firefox is enshittified.
Theoretically I can imagine AI in the browser to be awesome to combat AI on the web. Let the AI wars begin!
You really have no idea how fucking naive your statement is. You don't want an AI war, and we cannot avoid one.
I know there is currently a massive PR campaign for a power grab to consolidate control over AI software. They want to control the means of generation. Only MozillAI can save us from King GhAIdorah!
Sorry I'm upsetting you. I know we're entering an acceleration of technology at a time when our institutions globally are in an absolutely horrendous state. People on all sides are brainwashed as hell. The AI watchdogs are insane as well. What's left but gallows humor? I do hold out some hope though.
You cannot upset me more than the current common misunderstandings that everyone has about AI already does.
I don't think you understand the implications of undetectable AI to shift social conversation or the kind of world that those AI owners want to create.
That might actually be the kind of thing where open source AI could help. At least I hope. To detect bias, lies or AI powered filtering / sorting of content.
Ok so this is one of the naive thoughts that makes me upset.
The open source community can't even make a Linux distro that works out of the box for everyday users, and you think they're somehow going to outcompete billion-dollar companies that can afford the best gear and devs?
Look, I bought in heavy to open source early on in the 90s, and I've done my best to go open source for every tool I can, but the simple fact is that even the 'best' open source projects are severely lacking in some respects, and YOU CAN'T TRUST DEVELOPMENT OF AI TO THAT.
Compare GIMP to Photoshop. It isn't even close. Why? Because Adobe has a fucktonne of cash to throw at their projects, and they have clear direction and motivation.
I don't like it
I'd prefer a fully open source world
But it isn't going to happen. Open source AI will always lag behind corporate AI, and considering how fast the field is developing, even being 3 months behind renders a tool useless as an AI detector.
We aren't prepared for this, and 90% of what everyone on the internet says about AI is poorly informed and full of confabulation. WORST of all, when you try to explain this to them, they get antagonistic.
We already saw the threat AI can pose in 2016, with Cambridge Analytica helping hand trumpty dumpty the election by using AI to target vulnerable Facebook groups.
AND THAT AI WAS A FUCKING INFANT compared to what we have now.
It's going to be so bad and almost none of you have the slightest clue.
See, THIS is the criticism of AI I can actually empathize with, I might even agree with it somewhat
Honestly, most of what Cambridge Analytica did was blackmail, illegal spending, and collusion between campaigns that were legally required to be separate.
Much of the data processing/ML was intended as a smokescreen to distract from the big stuff that was known to work and had consequently been legislated against. The problem is that they were so incompetent that the distraction technique was also illegal.
Maybe the machine learning also worked, but it's really not clear.