basic UI programming in linux
(infosec.pub)
It is far more efficient to ask specific questions than to read the whole documentation. Asking someone with relevant knowledge of the field is usually not an option. Asking GPT is an option we now have. Why would you not like it? It is like having Excel instead of a calculator and paper.
It takes the fun out of programming
You don't learn as well when you have someone/something else do the thinking for you. It's nice to NOT have to keep going back to an LLM for answers.
I learn even less if the effort required is far too high to even try. GPT reduces this a whole lot, enabling me (and presumably many others) to do things we were unable to do previously.
I really do not understand how this community is so toxic regarding this.
I'm guessing it's because you're surrounded by people who DID spend the extra effort to learn something on their own without having their hand held, and now just see people trying to take the easy way out.
You're not unique. We were all in your position once.
Define "without having their hand held". Did they come up with all concepts themselves? Do they exclusively code in assembly? Wire their machines by hand? Operate the switches manually? Push the button off the Morse machine themselves? How far back should I go with the analogies before it is clear how nonsensical that is? I am a random hobbyist that is enabled to do such stuff because of GPT. I would not have been able to replace a broken BMS chip in my e-bike battery without GPT helping me digest the datasheet and get the register, programming procedure etc. etc. into code to read the old and write the new chip. I am not 15 anymore, I can not spend 50 hours learning some niche skill that I will never(!) use again just to fix something that is worth 200 $.
If you think that just anyone can do that with GPT, you are mistaken; and at the same time I am shocked that you would not want that to be the case, purely out of pettiness that you could not do it as easily but "had to learn it the hard way back in the day". Disgusting.
I don't care what you do, you do you. I just like actually knowing things when I need to know them, and having the capacity to solve problems myself without being dependent on tech for everything. It's like being able to figure out how to change your own engine oil vs. paying somebody to do it for you.
We read books. We went to classes. We got our hands dirty and failed, again and again and again until it clicked and we got it right. That's the part that's hard. LLMs are a tool, not a replacement for a good programmer who understands what they are doing. Use them to help you save time with tasks you are already familiar with. Don't use them as a college professor, because eventually they are going to teach you something wrong; that's how they work. And without knowing some basic concepts about the subject you're inquiring about, you're not going to catch it when they do go wrong.
I'm 42 by the way, and I still learn new things every day.
I'm going to bring up an excerpt of your previous comment, because this is an example I want to give. Say there is something in that datasheet (I'm completely making this up as an example) about needing a resistor of a certain value to set the charging current, and ChatGPT fails to mention this and simply tells you that the battery takes the voltage directly from the circuit without it. Then you have a fire on your hands, because you decided NOT to read the datasheet and skipped crucial info. If you keep taking AI-generated text at face value, it's going to bite you in the ass one day.
Electronics is my main hobby, so you can bet I'm poring over datasheets all day too, and little gotchas like that are all over the place. You simply cannot trust LLMs with these things the way you can trust a good old book or someone who's been doing it for a long time.
The first two paragraphs read a bit oddly. I mean, I specifically said that it is a tool that saves time, not what you're putting in my mouth. That is actually the whole point I made. The same way a book saves time compared to going somewhere, hearing about it, and writing it down. Or using interactive programs instead of having to compile and upload code. Or using Python instead of C++, or C++ instead of assembly. Or assembly instead of straight binary, connecting wires, or a punch card.
I also specifically said that someone without prior knowledge is not going to be able to do that, the same way someone who does not understand math is not going to be able to use a calculator or Excel effectively.
To take the oil change example, it is like a tutorial on how to do it yourself. You still need to have a jack, lie on the floor, unscrew things, etc. But instead of having to go to a shop and learn it there, you learn it directly, which is more effective. Like reading a book about assembly instead of looking over the shoulder of the person inventing assembly. Errors can always happen, and I have to say, given how much GPT has improved over just 1.5 years, we will soon be in the situation Wikipedia was in back in the day: "Wikipedia can be edited by anyone, you can't trust it", while in reality it was already more reliable than the encyclopedias it was being compared to.