this post was submitted on 02 Mar 2025
26 points (88.2% liked)
GenZedong
This is a Dengist community in favor of Bashar al-Assad with no information that can lead to the arrest of Hillary Clinton, our fellow liberal and queen. This community is not ironic. We are Marxists-Leninists.
This community is for posts about Marxism and geopolitics (including shitposts to some extent). Serious posts can be posted here or in /c/GenZhou. Reactionary or ultra-leftist cringe posts belong in /c/shitreactionariessay or /c/shitultrassay respectively.
We have a Matrix homeserver and a Matrix space. See this thread for more information. If you believe the server may be down, check the status on status.elara.ws.
Rules:
- No bigotry, anti-communism, pro-imperialism or ultra-leftism (anti-AES)
- We support indigenous liberation as the primary contradiction in settler colonies like the US, Canada, Australia, New Zealand and Israel
- If you post an archived link (excluding archive.org), include the URL of the original article as well
- Unless it's an obvious shitpost, include relevant sources
- For articles behind paywalls, try to include the text in the post
- Mark all posts containing NSFW images as NSFW (including things like Nazi imagery)
you are viewing a single comment's thread
view the rest of the comments
I think this is a perfect illustration of how technology ends up being applied under different social and economic systems. The reason AI is problematic in the West is that it's applied towards finding ways to increase the wealth of the oligarchs. China, on the other hand, is using AI for things like industrial automation, optimizing government workflows, and so on. AI is just a tool for automating work; there's nothing inherently bad about it. The question is how this tool is applied and for what purposes.
I'm not so sure about that. Your analysis correctly identifies that it's being used in the West for nefarious purposes, but I honestly think that even on the technical merits it's a flawed technology and a waste. DeepSeek is more efficient, yes, but it's still flawed, and I don't believe they should be using it.
I wouldn't say AI (or pattern-replicating models resembling AI) is flawed. It's a great tool for saving time and automating certain processes.
The problem is the myriad of grifters who have appeared, mostly in the West, trying to sell it as cure-all snake oil.
For instance, there's a massive push in the EU to insert AI into education, but with little regard or planning for how to do it effectively. It would be a great tool if we fed it our curricula and asked it to update them to current knowledge (e.g. in science), come up with suggestions for better delivery of certain topics, eliminate time wasted on erroneous, repetitive, or useless topics, and improve our schedules so that related topics line up (e.g. teaching Romeo and Juliet in Languages while going through the history of 1400s Verona in History). These things could be done by committees over a five-year period, or they could be done by AI in a day. Instead, we get handsomely paid private contractors organizing days-long training sessions on how to use AI to draw a picture, because it might make a presentation to students slightly more exciting.
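To make the "feed it the curriculum and ask for an update" idea concrete, here is a minimal sketch of what that step could look like. Everything in it is an assumption for illustration: the OpenAI-compatible client, the placeholder model name, the prompt wording, and the input file are mine, not anything any education ministry actually runs, and the output is only a list of suggestions for a human committee to vet.

```python
# Hypothetical sketch of an LLM-assisted curriculum review.
# Assumes an OpenAI-compatible chat API; the model name, prompt, and
# input file are placeholders, and the output is advisory only.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

REVIEW_PROMPT = """You are assisting a curriculum committee.
Given the curriculum below, list:
1. Topics that are outdated or factually superseded (with a short reason).
2. Topics that are duplicated or erroneous.
3. Cross-subject pairings worth scheduling together, e.g. a literary work
   in Languages alongside the history of its setting in History.
Do not rewrite the curriculum; only flag items for human review."""

def review_curriculum(curriculum_text: str) -> str:
    """Return the model's suggestions as plain text for the committee to vet."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": REVIEW_PROMPT},
            {"role": "user", "content": curriculum_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    with open("science_curriculum.txt") as f:  # hypothetical input file
        print(review_curriculum(f.read()))
```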
Honestly, even your idea of having an LLM "update" a curriculum just annoys me. Why does everyone automatically hand authority over perhaps one of the most important societal functions to an LLM, instead of trusting teachers, with their decades of experience, to do their job?
Is this what we want? AI-generated slop for teaching the next generation because it'll get it done in a day?
Current LLMs have access to far more information than even the best teacher can hope to remember. I already know high school chemistry teachers who use Perplexity (an LLM-enhanced search engine) to build lesson plans, because it can pull up the latest material on a topic.
Obviously, a teacher still needs to be in the loop somewhere to ensure a bar of quality on the lesson plan (a rough sketch of what that gate might look like is at the end of this comment), but there's no reason AI can't just be an enhanced search engine that makes their planning easier and helps prevent students from being taught outdated information. I bet you can remember some fact you learned in school that later turned out to be wrong.
For example, the tongue taste-zone map you may have seen in anatomy class is completely wrong. In reality, each taste bud can detect all of the basic tastes, with only minor variation from bud to bud.
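As a minimal sketch of the "teacher in the loop" gate described above, assuming nothing about how Perplexity or any particular school actually works: the draft_lesson_plan stub below stands in for whatever LLM or search tool produces the draft, and nothing reaches students until a human explicitly approves it.

```python
# Minimal human-in-the-loop sketch: an LLM drafts, a teacher approves or rejects.
# draft_lesson_plan() is a stub standing in for an LLM-backed tool.

def draft_lesson_plan(topic: str) -> str:
    """Placeholder for an LLM/search-generated draft."""
    return (
        f"Draft lesson plan for: {topic}\n"
        "- Objectives: ...\n- Activities: ...\n- Sources to verify: ..."
    )

def teacher_approves(draft: str) -> bool:
    """Show the draft and require an explicit yes before it is used."""
    print(draft)
    return input("Approve this plan for classroom use? [y/N] ").strip().lower() == "y"

def plan_lesson(topic: str) -> str | None:
    draft = draft_lesson_plan(topic)
    if teacher_approves(draft):
        return draft  # approved: file it with the unit plan
    return None       # rejected: the teacher revises or writes it themselves

if __name__ == "__main__":
    plan = plan_lesson("How taste receptors actually work (not the tongue-map myth)")
    print("Saved." if plan else "Discarded; will be revised manually.")
```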
Good points