[-] noli@programming.dev 16 points 6 months ago

arxiv.org is a thing

[-] noli@programming.dev 10 points 7 months ago

I was a garbage collector for 3 years as a student job.

Not a single person in the company had done such a thing. We did dumb shit all the time, but that wasn't even close. I feel sorry that you decided that 1 person represents a whole industry.

[-] noli@programming.dev 10 points 7 months ago* (last edited 7 months ago)

You get most of the day off after having worked a full day + still need a full and thorough shower + have low energy because it's a physically demanding job

Src: did it for 3 years as a student job

[-] noli@programming.dev 13 points 7 months ago

Fun fact: the main reason garbage collection is unhealthy is that you spend all day in the fumes of public roads & of the truck you're hanging off the back of.

Source: was a garbage collector for 3 years as a student job, & we got a small hourly bonus for it.

[-] noli@programming.dev 12 points 8 months ago

The oddity is having a military recruiter for a specific school. All we had was a stand for the military at a fair showing all the study options (context: Belgium).

[-] noli@programming.dev 11 points 9 months ago

Oops, so silly of me!

On a totally unrelated note, this is also a funny meme. Hope it helps with your disappointment in this post!

[-] noli@programming.dev 12 points 9 months ago

A coffee from a coffee shop definitely should be $4 if you want them to ethically source good coffee and have a sustainable business model.

There's still cheap, shitty coffee that's built on modern slavery; there's always e.g. McDo. SB uses the same quality ingredients, just with someone who knows how to steam milk + syrups.

[-] noli@programming.dev 17 points 10 months ago

To answer that question it might be useful to ask a different question: "If people depend on money to survive, and that money is made through manual labour, does this imply that manual labour is slavery through coercion?"

[-] noli@programming.dev 17 points 11 months ago

Flipping burgers is enough to pay for chemotherapy. Src: am European

[-] noli@programming.dev 11 points 11 months ago

While I get your point, you're still slightly misguided.

Sometimes, for a smaller dataset, an algorithm with worse asymptotic complexity can be faster.

Some examples:

  • Radix sort's complexity is linear. So why would most people still reach for e.g. quicksort? Because for relatively small datasets, the overhead of radix sort outweighs the gain from being asymptotically faster.
  • One of the most common and well-known optimizations for quicksort is to switch over to insertion sort when subarray sizes drop below a certain threshold. This is because for small datasets (I'm talking e.g. 10 elements) insertion sort is just objectively faster (see the sketch right after this list).
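
A minimal sketch of that switchover in Python, just to make it concrete (hypothetical code, with an arbitrary cutoff of 10 elements; real library implementations tune the cutoff and pivot choice):

```python
# Hybrid quicksort: fall back to insertion sort for small subarrays.
# Sketch only; the cutoff of 10 is an arbitrary illustrative value.

CUTOFF = 10

def insertion_sort(a, lo, hi):
    # Sort a[lo..hi] in place; fast for tiny ranges despite O(n^2) worst case.
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo + 1 <= CUTOFF:
            insertion_sort(a, lo, hi)
            return
        # Lomuto partition around the last element as pivot.
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        # Recurse on the smaller half, loop on the larger to bound stack depth.
        if i - lo < hi - i:
            hybrid_quicksort(a, lo, i - 1)
            lo = i + 1
        else:
            hybrid_quicksort(a, i + 1, hi)
            hi = i - 1
```

Calling `hybrid_quicksort(data)` sorts `data` in place; the cutoff only changes how the small subarrays get finished off, not the result.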

Big O notation only considers the dominant term; in some cases the lower-order terms and constants still matter. Assume the exact cost of algorithm A is 2n log(n) + 999999999n and that of algorithm B is n^2 + 7n. For small n, B will clearly be faster, even though B is O(n^2) and A is O(n log(n)).
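
To see how long that lead lasts, here's a throwaway script that just plugs a few values of n into those two made-up cost formulas (these are abstract operation counts from the example above, not measured runtimes):

```python
import math

def cost_a(n):
    # Hypothetical exact cost of algorithm A: 2n*log2(n) + 999999999n
    return 2 * n * math.log2(n) + 999_999_999 * n

def cost_b(n):
    # Hypothetical exact cost of algorithm B: n^2 + 7n
    return n ** 2 + 7 * n

for n in (10, 1_000, 1_000_000, 1_000_000_000, 2_000_000_000):
    faster = "B" if cost_b(n) < cost_a(n) else "A"
    print(f"n={n:>13,}  A={cost_a(n):.3e}  B={cost_b(n):.3e}  faster: {faster}")
```

With these particular constants, B only loses its lead once n grows to roughly a billion, which is exactly the kind of detail Big O hides.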

Sorting is actually a great example to show how you should always consider what your data looks like before deciding which algorithm to use, which is one of the biggest takeaways I had from my data structures & algorithms class.

This YouTube channel also has a fairly nice three-part series on sorting algorithms: https://youtu.be/_KhZ7F-jOlI?si=7o0Ub7bn8Y9g1fDx

[-] noli@programming.dev 11 points 1 year ago

It's always fixable, just not always worth the effort

[-] noli@programming.dev 17 points 1 year ago

You disgust me
