And then OP used valgrind
Programmer Humor
I didn't know this existed, so thanks for the tip.
have fun (ノಠ益ಠ)ノ彡┻━┻
Back when I was a kid and was learning C, I used to wonder why people considered pointers hard.
My usage of pointers was like:
void func(int *arg1)
{
    // do something with arg1
}

int main(void)
{
    int x;
    func(&x);
    return 0;
}
I didn't know about stuff like malloc, and never felt the need for it in the program logic of the little things I made.
Pointers are not hard. Memory management makes it hard.
C makes them unnecessarily confusing, in my opinion. In Forth, by comparison, they're as simple as can be.
It'll be fun when you get to funny errors because you used freed memory.
When I was learning about linked lists and decided to use them in a project, I "removed" items by making the previous item's next point to this item's next, except I misplaced a call to free before using the fields. It somehow still worked most of the time on debug builds, but on optimized builds it would cause a segmentation fault 100% of the time.
Unused memory is wasted memory
Cloud providers LOVE you with this one quick trick!
Also goes for mobile. You use more memory and apps get killed.
and with a good enough leak, the amount of unused memory will become negative!
Valgrind to the rescue
Not freeing your memory at all is a memory management strategy. I think some LaTeX compilers use it as well as surprisingly many Java applications.
This non-sarcastically. The operating system is better at cleaning up memory than you, and it's completely pointless to free all your allocations if you're about to exit the program. For certain workloads, it can lead to cleaner, less buggy code to not free anything.
It's important to know the difference between a "memory leak" and unfreed memory. A leak refers to memory that cannot be freed because you lost track of the address to it. Leaks are only really a problem if the amount of leaked memory is unbounded or huge. Every scenario is different.
Of course, that's not an excuse to be sloppy with memory management. You should only ever fail to free memory intentionally.
Absolutely. I once wrote a server for a factory machine that spawned child processes to work each job item. We intentionally did not free any memory in the child process, because it serves only one request and then exits anyway. It's much more efficient to have the OS just clean up everything, and it provides strong guarantees that nothing can be left behind accidentally, on a system where uptime was money. Any code to manage memory would have been pointless line noise and extra developer effort.
In fact I think in the linker we specifically replaced free with a function that does nothing.
Yeah, you can use the Epsilon garbage collector in Java for a no-op garbage collection strategy.
For short-lived programs that do not risk hitting any memory constraints, it makes a lot of sense - zero time wasted on cleanup during execution, then you just do it all at the end when killing the program, which you can do in constant time since you don't need to reason about what should remain or not.
Upvoted. This is something I learned rather recently. Sometimes it's more performant to leak slowly than to free properly, and just spend x amount of time restarting every n amount of time.
A middle ground is a memory pool or an object pool where you reuse the memory rather than free it. Instead, you free it all in one operation when that phase of your application is complete. I’ve seen this done for particle systems to reduce overhead.
.net
Anything I run in C# or similar seems to allocate 512GB of virtual address space and then just populates what it actually uses.
RAII.
Can’t leak what never leaves the stack frame.
Isn’t this for C++?
Classes are just pretentious structs.
How do you get destructor behavior in C?
You call the destructor. It's simply not done for you automatically when the object goes out of scope.
Back when C++ was simply a text pre-processor for C, you could see these normal function calls. You can still see them in the un-optimized disassembly. There’s nothing magical about a destructor other than it being inserted automatically.
Aka the entire point of RAII
You haven't lived until you've produced a memory leak in JavaScript.