[-] theluddite@lemmy.ml 43 points 9 months ago* (last edited 9 months ago)

I have worked at two different start ups where the boss explicitly didn't want to hire anyone with kids and had to be informed that there are laws about that, so yes, definitely anti-parent. One of them also kept saying that they only wanted employees like our autistic coworker when we asked him why he had spent weeks rejecting every interviewee that we had liked. Don't even get me started on people that the CEO wouldn't have a beer with, and how often they just so happen to be women or foreigners! Just gross shit all around.

It's very clear when you work closely with founders that they see their businesses as a moral good in the world, and as a result, they have a lot of entitlement about their relationship with labor. They view labor laws as inconveniences standing between them and their moral imperative to grow the startup.

[-] theluddite@lemmy.ml 54 points 10 months ago

It's probably either waiting for approval to sell ads or was denied and they're adding more stuff. Google has a virtual monopoly on ads, and their approval process can take 1-2 weeks. Google's content policy basically demands that your site be full of generated trash to sell ads. I did a case study here, in which Google denied my popular and useful website for ads until I filled it with the lowest-quality generated trash imaginable. That might help clarify what's up.

[-] theluddite@lemmy.ml 54 points 10 months ago* (last edited 10 months ago)

It's not a solution, but as a mitigation, I'm trying to push the idea of an internet right of way into the public consciousness. Here's the thesis statement from my write-up:

I propose that if a company wants to grow by allowing open access to its services to the public, then that access should create a legal right of way. Any features that were open to users cannot then be closed off so long as the company remains operational. We need an Internet Rights of Way Act, which enforces digital footpaths. Companies shouldn't be allowed to create little paths into their sites, only to delete them, forcing guests to pay if they wish to maintain access to the networks that they built, the posts that they wrote, or whatever else it is that they were doing there.

As I explain in the link, rights of way already exist in the physical world, so the idea is easy to explain even to the less technically inclined, and it gives us a useful legal framework for how digital rights of way should work.

[-] theluddite@lemmy.ml 53 points 1 year ago

I do software consulting for a living. A lot of my practice is small organizations hiring me because their entire tech stack is a bunch of shortcuts taped together into one giant teetering monument to moving as fast as possible, and they managed to do all of that while still having to write every line of code.

In 3-4 years, I'm going to be hearing from clients about how they hired an undergrad who was really into AI to do the core of their codebase and everyone is afraid to even log into the server because the slightest breeze might collapse the entire thing.

LLM coding is going to be like every other industrial automation process in our society. We can now make a shittier thing way faster, without thinking of the consequences.

[-] theluddite@lemmy.ml 48 points 1 year ago* (last edited 1 year ago)

I'd say less than a week. Capitalism is something that we have to wake up and make happen every single day. How many days worth of food does the average person have? Definitely not 45 days. People would have to start self-organizing within 2-3 days, and in doing so, they would actively make something that isn't capitalism, which directly challenges those in power.

This is why every time there are emergencies or protests, the media is obsessed with "looting." If there's no food because of a hurricane or whatever, it is every single person's duty to redistribute what there is equitably. The news and capitalists (but I repeat myself) call that "looting," even when it's a well-organized group of neighbors going into a closed store to distribute spoiling food to hungry people.

Rebecca Solnit writes about this in detail in A Paradise Built in Hell. It's really good. She's an awesome writer.

[-] theluddite@lemmy.ml 55 points 1 year ago

The purpose of a system is what it does. "There is no point in claiming that the purpose of a system is to do what it constantly fails to do." These articles about how social media is broken are constant. It's just not a useful way to think about it. For example:

It relies on badly maintained social-media infrastructure and is presided over by billionaires who have given up on the premise that their platforms should inform users

These platforms are systems. They don't have intent. There's no mens rea or anything. There is no point saying that social media is supposed to inform users when it constantly fails to inform users. In fact, it has never informed users.

Any serious discussion about social media must accept that the system is what it is, not that it's supposed to be some other way, and is currently suffering some anomaly.

[-] theluddite@lemmy.ml 43 points 1 year ago

That sucks, but I argue that it's even worse. Not only do they tweak your results to make more money, but because Google has a monopoly on web advertising, and (like it or not) advertising is the main internet funding model, Google gets to decide whether or not your website gets to generate revenue at all. They literally have an approval process for serving ads, and it is responsible for the proliferation of LLM-generated blogspam. Here's a thing I wrote about it in which I tried to get my already-useful and high-quality website approved for ads, complete with a before and after approval, if you're curious. The after is a wreck.

[-] theluddite@lemmy.ml 51 points 1 year ago* (last edited 1 year ago)

I think there's a simpler, more personal way to make this point. Here's a few thought experiments:

Imagine you work for a company that lays you off while doing stock buybacks and paying executive bonuses large enough that they could've covered your salary for 1,000 years. After you get laid off, imagine what would happen if you just ignored them and kept doing your work.

Or, your landlord doesn't renew your lease because they think you're ugly and they don't want ugly people living in their building. Imagine what happens if you just stay, even if you keep sending the landlord their monthly rent on time.

Both of these situations end with armed, taxpayer-funded agents physically removing you from the premises by any means necessary; it is only the omnipresent threat of state violence that maintains capitalists' control over their private property. We don't see the violence because we've been trained from an early age not just to accept it, but to not even see it.

[-] theluddite@lemmy.ml 43 points 1 year ago* (last edited 1 year ago)

"Capitalism is just human nature."

If it's just human nature, then why do we need a militarized police force to enforce order? Having workers go to a workplace, do labor, and then send the profits to some faraway entity that probably isn't even there is actually very far from human nature. It's something that necessarily requires the implied threat of violence to maintain. Same with tenants and landlords. No one would pay rent if it weren't for the police, who will use violence to throw you out otherwise.

It also frustrates me how that argument just waves away the incredibly complex and actually extremely arbitrary legal structure of capitalism. What about human nature contains limited liability for artificial legal entities controlled by shareholders? "Ah yes, here's the part of the human genome that expresses preferred and common stock; here's the part that contains the innate human desire for quarterly earnings calls."

edit: typo

[-] theluddite@lemmy.ml 42 points 1 year ago

Is that really all they do though? That's what they've convinced us that they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn't even be possible to start on DIY videos and end on white supremacy or whatever.

I wrote a longer version of this argument here, if you're curious.

[-] theluddite@lemmy.ml 58 points 1 year ago

This study is an agent-based simulation:

The researchers used a type of math called “agent-based modeling” to simulate how people’s opinions change over time. They focused on a model where individuals can believe the truth, the fake information, or remain undecided. The researchers created a network of connections between these individuals, similar to how people are connected on social media.

They used the binary agreement model to understand the “tipping point” (the point where a small change can lead to significant effects) and how disinformation can spread.

Personally, I love agent-based models. I think agent-based modeling is a very, very powerful tool for systems insight, but I don't like this article's interpretation, nor am I convinced the author of this article really groks what agent-based modeling is. It's a very different kind of "study" than what most people mean when they use that word, and interpreting the insights is its own can of worms.
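For anyone curious what this kind of model actually looks like under the hood, the binary agreement model the article describes is simple enough to sketch. Here's a minimal, well-mixed version in Python; the function name, parameters, and the 12% committed fraction are my own illustrative choices, not taken from the study:

```python
import random

def simulate(n=1000, committed_frac=0.12, steps=200_000, seed=42):
    """Binary agreement model: agents hold "A", "B", or both (undecided).
    A committed minority holds only "A" and never changes its mind."""
    rng = random.Random(seed)
    n_committed = int(n * committed_frac)
    # committed agents (indices < n_committed) hold {"A"}; everyone else starts at {"B"}
    opinions = [{"A"} for _ in range(n_committed)] + [{"B"} for _ in range(n - n_committed)]
    for _ in range(steps):
        s, l = rng.sample(range(n), 2)          # pick a speaker and a listener
        word = rng.choice(sorted(opinions[s]))  # speaker voices one of its opinions
        if word in opinions[l]:
            # agreement: both collapse to the shared opinion
            opinions[l] = {word}
            if s >= n_committed:                # committed agents never update
                opinions[s] = {word}
        elif l >= n_committed:
            opinions[l] = opinions[l] | {word}  # listener becomes undecided
    # fraction of the population fully converted to "A"
    return sum(1 for o in opinions if o == {"A"}) / n

share_A = simulate()
```

Sweeping `committed_frac` from, say, 5% to 15% is how you'd see the tipping point the article mentions: below roughly 10% committed agents, the minority opinion stays marginal; above it, the whole population flips. That's the kind of insight these models are good for, and also why interpreting a single run as a "finding" about real social media is dicey.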

Just a heads up, for those of you casually scrolling by.

[-] theluddite@lemmy.ml 60 points 1 year ago

The real problem with LLM coding, in my opinion, is something much more fundamental than whether it can code correctly or not. One of the biggest problems coding faces right now is code bloat. In my 15 years writing code, I write so much less code now than when I started, and spend so much more time bolting together existing libraries, dealing with CI/CD bullshit, and all the other hair that software projects have started to grow.

The amount of code is exploding. Nowadays, every website uses ReactJS. Every single tiny website loads god knows how many libraries. Just the other day, I forked and built an open source project that had a simple web front end (a list view, some forms -- basic shit), and after building it, npm informed me that it had over a dozen critical vulnerabilities, and dozens more of high severity. I think the total was something like 70?

All code now has to be written at least once. With ChatGPT, it doesn't even need to be written once! We can generate arbitrary amounts of code all the time whenever we want! We're going to have so much fucking code, and we have absolutely no idea how to deal with that.

