this post was submitted on 10 Oct 2025
636 points (99.2% liked)
Programmer Humor
you are viewing a single comment's thread
But if that's how you're going to run it, why not also train it in that mode?
That is a thing, and it's called quantization-aware training (QAT). Some open-weight models, like Gemma, do it.
The problem is that you need to retrain the whole model for that, and if you also want a full-quality version, you need to do a lot more training.
The quantized model is still less precise, so it'll still be worse quality than full precision, but QAT does reduce the gap.
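For anyone curious what that looks like mechanically: a common way to do QAT is "fake quantization" with a straight-through estimator, where the forward pass simulates low-precision rounding but the optimizer still updates full-precision weights, so the loss already accounts for the quantization error. Here's a minimal PyTorch sketch of that idea; the names and the per-tensor int8 scheme are illustrative, not how Gemma or any particular model actually trains:

```python
import torch
import torch.nn as nn

def fake_quantize(x: torch.Tensor, bits: int = 8) -> torch.Tensor:
    """Round x to a symmetric int grid, but pass gradients straight through."""
    qmax = 2 ** (bits - 1) - 1                    # e.g. 127 for int8
    scale = x.abs().max().clamp(min=1e-8) / qmax  # per-tensor scale (illustrative choice)
    q = torch.clamp(torch.round(x / scale), -qmax, qmax) * scale
    # Straight-through estimator: forward uses q, backward sees identity.
    return x + (q - x).detach()

class QATLinear(nn.Linear):
    """Linear layer that trains against its own quantized weights."""
    def forward(self, input: torch.Tensor) -> torch.Tensor:
        return nn.functional.linear(input, fake_quantize(self.weight), self.bias)

# The training loop is unchanged: the optimizer updates full-precision
# weights, but the loss already reflects the rounding error, so the model
# learns weights that survive quantization better.
layer = QATLinear(16, 4)
opt = torch.optim.SGD(layer.parameters(), lr=0.1)
x, target = torch.randn(8, 16), torch.randn(8, 4)
loss = ((layer(x) - target) ** 2).mean()
loss.backward()
opt.step()
```

This is also why it costs extra: the quantization behavior is baked in during training, so getting a full-precision release alongside the quantized one means more training runs, not just a post-hoc conversion.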
Your response reeks of AI slop
4/10 bait
Is it, or is it not, AI slop? Why are you using markdown formatting so heavily? That is a telltale sign of an LLM being involved.
They used one formatting mark, and it's the most common. What are you smoking, and may I have some?
I am not using an LLM, but holy bait
Hop off the Reddit voice
...You do know what platform you're on, right? It's a REDDIT alternative