
this is AI but it felt a lot more like a guy with broken gear

[-] V0ldek@awful.systems 18 points 6 months ago

While I mostly agree with the thrust of the thesis - 80% of the job is reading bad code and unfucking it, and ChatGPT sucks in all the ways - I disagree with the conclusions.

First, genAI shifting us towards analysing more bad code to unfuck is not a good thing. It's quite specifically bad. We really don't need more bad code generators. What we need are good docs; slapping genAI onto badly documented libraries as a band-aid will do more harm than good. The absolute last thing I want is genAI feeding me more bullshit to deal with.

Second, this all comes across as an industrialist view on education. I'm sure Big Tech would very much like people to just be good at fixing and maintaining their legacy software, or shipping new bland products as quick as possible, but that's not why we should be giving people a CS education. You already need investigation skills to debug your own code. That 90% of industry work is not creative building of new amazing software doesn't at all mean education should lean that way. 90% of industry jobs don't require novel applications of algebra or analytical geometry either, and people have been complaining that "school teaches you useless things like algebra or trigonometry" for ages.

This infiltration of industry into academia is always a deleterious influence, and genAI is a great illustration of that. We now have Big Tech weirdos giving keynotes at CS conferences about how everyone should work in AI because it's The Future™. Because education is perpetually underfunded, it heavily depends on industry money. But the tech industry is an infinite growth machine; it doesn't care about any philosophical considerations with regard to education; it doesn't care about science in any way other than as a product to be packaged and shipped ASAP to grow revenue, no matter whether it's actually good, useful, sustainable, or anything like that. They invested billions into growing a specialised sector of CS, novel hardware and all (see TPUs), to be able to multiply matrices really fast, and the chief uses of that are Facebook's ad recommendation system and now ChatGPT.

This central conclusion just sucks from my perspective:

It’s how human programmers, increasingly, add value.

“Figure out why the code we already have isn’t doing the thing, or is doing the weird thing, and how to bring the code more into line with the things we want it to do.”

While yes, this is why even a "run-of-the-mill" job as a programmer is not likely to be outsourced to an ML model, that's definitely not what we should aspire for the value added to be. People add value because they are creative builders! You don't need a higher education to be able to patch up garbage codebases all week, the same way you don't need any algebra or trigonometry to work at a random paper-pushing job. What you do need it for is to become the person who writes the existing code in the first place. There's a reason these are Computer Science programmes and not "Programming @ Big Tech" programmes.

[-] FredFig@awful.systems 7 points 6 months ago* (last edited 6 months ago)

From the POV of a slightly exhausted prof who just wants a short-ish answer for her students, the conclusion sorta makes sense, I guess. The students want to convince themselves they aren't wasting their time with genAI, and she's not in a position to convince them otherwise, so the next best thing is showing them what industrial life with genAI will be like.

"The future you're dreaming of sucks, so get used to it." isn't a satisfying answer, but it's a forced perspective.

this post was submitted on 27 May 2024
27 points (100.0% liked)

FreeAssembly


this is FreeAssembly, a non-toxic design, programming, and art collective. post your share-alike (CC SA, GPL, BSD, or similar) projects here! collaboration is welcome, and mutual education is too.

in brief, this community is the awful.systems answer to Hacker News. read this article for a solid summary of why having a less toxic collaborative community is important from a technical standpoint in addition to a social one.

some posting guidelines apply in addition to the typical awful.systems stuff:

