I was a garbage collector for 3 years as a student job.
Not a single person in the company has done such a thing. We did dumb shit all the time, but that wasn't even close. I feel sorry that you decided that one person represents a whole industry.
You get most of the day off after having worked a full day + you still need a full and thorough shower + you have low energy because it's a physically demanding job.
Src: did it for 3 years as a student job
Fun fact: the main reason garbage collection is unhealthy is that you spend all day in the fumes of public roads & the truck you're hanging behind.
Source: was a garbage collector for 3 years as a student job & we got a small hourly bonus for it.
The oddity is having a military recruiter for a specific school. All we had was a stand for the military at a fair with all study options (context: Belgium).
Oops, so silly of me!
On a totally unrelated note, this is also a funny meme. Hope it helps with your disappointment in this post!
A coffee from a coffee shop definitely should be $4 if you want them to ethically source good coffee and have a sustainable business model.
There's still cheap, shitty coffee that's built on modern slavery; there's always McDo. SB is the same quality of ingredients, but with knowing how to steam milk + syrups.
To answer that question it might be useful to ask a different one: "If people depend on money to survive, and if that money is made through manual labour, does this imply that manual labour is slavery through coercion?"
Flipping burgers is enough to pay for chemotherapy. Src: am European
While I get your point, you're still slightly misguided.
Sometimes, for a smaller dataset, an algorithm with worse asymptotic complexity can be faster.
Some examples:
Big O notation only keeps the highest-order term, but the lower-order terms and constant factors still matter in some cases. Assume the theoretical running time of algorithm A is 2n·log(n) + 999999999n and of algorithm B it is n² + 7n. For any realistically small n, B will be faster, even though B is O(n²) and A is O(n log n); A only wins once n is on the order of a billion.
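To make that concrete, here's a tiny sketch that just evaluates the two hypothetical step counts from the example above (the constants are the made-up ones from the text, not measurements of real algorithms):

```python
import math

def cost_a(n):
    # O(n log n) class, but with a huge lower-order linear term
    return 2 * n * math.log2(n) + 999_999_999 * n

def cost_b(n):
    # O(n^2) class, with small constants
    return n**2 + 7 * n

# At a modest size, the "worse" O(n^2) algorithm does far less work
print(cost_b(1_000) < cost_a(1_000))   # True

# Only at enormous n does the asymptotic class take over
print(cost_b(2 * 10**9) > cost_a(2 * 10**9))  # True
```

The crossover point sits roughly where n + 7 exceeds 2·log₂(n) + 999999999, i.e. around a billion elements, which is why Big O alone can mislead you for everyday input sizes.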
Sorting is actually a great example to show how you should always consider what your data looks like before deciding which algorithm to use, which is one of the biggest takeaways I had from my data structures & algorithms class.
This youtube channel also has a fairly nice three-part series on the topic of sorting algorithms: https://youtu.be/_KhZ7F-jOlI?si=7o0Ub7bn8Y9g1fDx
It's always fixable, just not always worth the effort
You disgust me
arxiv.org is a thing