this post was submitted on 21 Aug 2025
1348 points (95.4% liked)
memes
16997 readers
4964 users here now
Community rules
1. Be civil
No trolling, bigotry or other insulting / annoying behaviour
2. No politics
This is a non-politics community. For political memes please go to !politicalmemes@lemmy.world
3. No recent reposts
Check for reposts when posting a meme; you can only repost after 1 month
4. No bots
No bots without the express approval of the mods or the admins
5. No Spam/Ads/AI Slop
No advertisements or spam. This is an instance rule and the only way to live. We also consider AI slop to be spam in this community, and it is subject to removal.
A collection of some classic Lemmy memes for your enjoyment
Sister communities
- !tenforward@lemmy.world : Star Trek memes, chat and shitposts
- !lemmyshitpost@lemmy.world : Lemmy Shitposts, anything and everything goes.
- !linuxmemes@lemmy.world : Linux themed memes
- !comicstrips@lemmy.world : for those who love comic stories.
founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
How does computers work?
Edit: I tried to do this all off the top of my head. After writing this, I think I meant user space vs kernel space. Idk if user land is a word
Did a solid effort.
I guess between C and assembly there’s abstract syntax trees and maybe LLVM, which is probably also written in C. Idk I skipped compilers in college.
I also know the networking stack has a bunch of layers, but that felt like its own separate thing to “computers”. I think UDP makes more errors than TCP but UDP also go brrrrr
Hehe, LLVM is a compiler framework; it basically provides all the utilities for processing an AST.
ASTs come in various flavors, but they're all the same thing: an intermediate representation of a program that optimizers and code generators use to create binaries.
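If you want to poke at an AST without writing a compiler, Python's built-in `ast` module is a handy stand-in for what a front end builds (just an illustration, not what LLVM itself uses):

```python
import ast

# Parse a tiny expression into an abstract syntax tree.
tree = ast.parse("1 + 2 * x", mode="eval")

# Dump the tree: a BinOp node whose right child is another BinOp,
# mirroring operator precedence (* binds tighter than +).
print(ast.dump(tree.body))
```

The nesting of the dumped nodes is exactly the precedence/associativity information a parser encodes, which is why optimizers work on trees rather than raw text.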
The network stack's meh: 4 or 7 layers depending on which model you use, but in brief: physical, transport, application. More and more functionality has moved into the transport layer in the name of efficiency; see QUIC. But in general it's not worth worrying about; most of the abstraction was nonsense anyway.
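The "UDP just go brrrrr" part really is this simple: no handshake, no acks, no ordering, just a datagram on the wire. A minimal loopback sketch with Python's standard `socket` module (over loopback the datagram will actually arrive; on a real network it might not, and UDP wouldn't care):

```python
import socket

# UDP: fire a datagram and hope. TCP would add connection setup,
# acknowledgements, retransmission, and ordering on top of this.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))          # let the OS pick a free port
port = recv.getsockname()[1]

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(b"go brrrrr", ("127.0.0.1", port))

data, addr = recv.recvfrom(1024)
print(data)
send.close()
recv.close()
```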
And you missed out: compilers was one of the most useful classes in CS curriculums, since it teaches you how languages work.
I KNOW! My biggest mistake in college was worrying about my GPA and that worry keeping me from taking harder classes. But I did learn about ASTs.
Transistors are only on and off switches when run in saturation. This is relevant to CPUs in the sense that the rising/falling edge and jitter affect the setup and hold times and thus the maximum clock rate. End pedantry.
This is the content I’m here for! Please continue I want to learn more
There's an active region between on and off where the current from the collector to the emitter is proportional to the base current. This can be used in other applications like amplifiers. But in digital applications that active region is the transition time between low and high states.
In order to obtain a deterministic outcome, the rising edge must be predictable, and the signal must stay at logic level 1 long enough to account for propagation delay. These considerations are known as setup and hold times. The higher the clock frequency, the tighter these constraints become.
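Back-of-the-envelope version of that constraint (all numbers invented, just to show the shape of the budget): the clock period must cover the flip-flop's clock-to-Q delay, the worst-case logic propagation delay, and the setup time of the next register.

```python
# Hypothetical timing budget for one register-to-register path (numbers made up).
t_clk_to_q = 0.5e-9   # flip-flop output changes this long after the clock edge
t_prop     = 2.0e-9   # worst-case propagation delay through the logic in between
t_setup    = 0.3e-9   # data must be stable this long *before* the next edge

t_min_period = t_clk_to_q + t_prop + t_setup
f_max = 1 / t_min_period
print(f"{f_max / 1e6:.0f} MHz")   # ~357 MHz for this path
```

Jitter on the clock edges eats directly into that budget, which is the "pedantry" above: the analog behavior of the transistors sets the digital clock ceiling.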
Nice 👍🏽!
Machine code comes to mind, and "more" high-level languages like C++, template metaprogramming and other horror stories 💀
And CD players!
Cheers 😋
My impression of C++ is that it's actually C++++++++ as in, how many more decades of features can we cram into this language before it explodes
What’s a CD player /s
Fun fact about a random CD player. The USB-A external CD player Apple sold after removing the internal CD player kinda abused the USB standard. I believe it needed more current than was allowed by USB, so Apple found some way to make this specific device draw more power than the USB standard supported at the time. Today, I believe USB-C includes a handshake that negotiates power requirements, but at the time, USB-A didn’t support this.
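The numbers behind that (from the USB specs as I remember them, so treat this as a sketch): classic USB-A ports guaranteed only a small 5 V budget, while USB-C Power Delivery negotiates voltage and current over a handshake.

```python
# Power budgets under various USB specs (volts * amps = watts).
usb2_power = 5.0 * 0.5      # USB 2.0: 5 V at 500 mA -> 2.5 W guaranteed
usb3_power = 5.0 * 0.9      # USB 3.x: 5 V at 900 mA -> 4.5 W
usb_pd_max = 20.0 * 5.0     # USB-C Power Delivery: up to 20 V at 5 A -> 100 W

print(usb2_power, usb3_power, usb_pd_max)
```

A spinning optical drive can easily want more than 2.5 W, which is why a device like that would have had to step outside the guaranteed USB 2.0 budget one way or another.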
Tbh, I don’t really know where assembly ends and machine code starts. But I do know that assembly is tied to your specific architecture
You're not wrong about C++ 😋
Machine code is just the numbers; assembly is mnemonics and stuff, and needs an assembler to turn it into useful machine code (a C++ compiler also spits out machine code, BTW).
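Toy illustration of that mnemonic-to-numbers step, using two real single-byte x86 encodings (a real assembler is "just" more of this, plus operand encoding, prefixes, and labels):

```python
# A toy "assembler": mnemonics in, machine code bytes out.
# NOP (0x90) and RET (0xC3) really are single-byte x86 instructions;
# anything with operands gets much hairier (ModR/M bytes, immediates...).
OPCODES = {
    "nop": b"\x90",
    "ret": b"\xc3",
}

def assemble(lines):
    return b"".join(OPCODES[mnemonic.strip().lower()] for mnemonic in lines)

code = assemble(["nop", "nop", "ret"])
print(code.hex())   # -> 9090c3
```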
Spot on about USB standards; no idea if Apple did what you said though, wouldn't doubt it!
Spicy rocks