They're the same thing. Each one uses a slightly different discourse depending on the region (I gather that in Chile public opinion on the dictatorship is very different from here), but they're bankrolled by the same people and bring the same economic plan of subordination. They're the condors that came back.
Oh he is. There was a US-backed dictatorship in the '70s that killed 30,000 people, and he's actively downplaying it as just a "conflict between two sides", using the exact same words the military used in those days. Also, what he's saying isn't new; it's a rerun of the '90s policies that already left us in crippling debt, with no political institutions to speak of and lots of police brutality. He's only doing so well because the main political parties are having a major internal leadership crisis. We managed to save ourselves in the early 2000s; we may not get that lucky again.
Where did you get that picture of me?
That man looks like he's in his natural 20s
...then don't study computer science. I study CS and it's annoying when someone in a more math/logic-oriented course is like "If I get a job at a tech company I won't need this". All of that IS computer science; if you just wanna code, learn to code.
What's wrong with GNU?
Those are valid points and make some practical sense, but I've talked too much with mathematicians about this so let me give you another point of view.
First of all, we do modular arithmetic with integers, not natural numbers, same with all those objects you listed.
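(A quick toy example of my own, just to make that concrete: the congruence relation is defined over all of Z, so it handles negatives, which N alone couldn't.)

```latex
% Toy example (mine): congruence mod 5 is a relation on the integers Z
-7 \equiv 3 \pmod{5}, \quad \text{since } -7 = 5 \cdot (-2) + 3 .
```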
On the first point, we are not talking about 0 as a digit but as a number. The main argument against 0 being in N is more a philosophical one: what are we looking at when we study N? What is this set? "The integers starting from 0" seems like a bit of a weird definition. Historically, the natural numbers were always the counting numbers, and those don't include 0 because you can't have 0 apples, so when we talk about N we're talking about the counting numbers. That's just the consensus where I'm from; if it's more practical to include 0 in whatever you are doing, you use N_0. Also, the axiomatization of N is more natural that way IMO.
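To spell out what I mean by the axiomatization, here's a standard Peano-style sketch (written by me, with 1 as the base element; the 0-based version looks identical except for which constant you start from):

```latex
% Peano-style axioms for N with base element 1 (swap 1 for 0 to get the other convention)
\begin{align*}
  &1 \in \mathbb{N} \\
  &n \in \mathbb{N} \implies S(n) \in \mathbb{N}
      && \text{every natural has a successor} \\
  &S(n) \neq 1 \ \text{for all } n \in \mathbb{N}
      && \text{the base element is not a successor} \\
  &S(m) = S(n) \implies m = n
      && \text{the successor map is injective} \\
  &\bigl(1 \in A \ \wedge\ (n \in A \implies S(n) \in A)\bigr) \implies \mathbb{N} \subseteq A
      && \text{induction}
\end{align*}
```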
Depends on where you are! In some places it is more common to say that 0 is natural and in others not. Some argue it's useful to have it in N; some say it makes more historical and logical sense for 0 not to be in N, and use N_0 when including it. It's not a settled issue; it's a matter of perspective.
When you study CompSci (depending on where, I guess) you tend to see them that way when trying to mathematically prove something about an algorithm. It's only really a good way of thinking if you're into coding, though, and I don't think a teacher for a non-coding-related algebra class should show this; it can be really confusing for some people.
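Concretely, the habit comes from proofs like this (my own toy example, not from any actual course): induction on the size of the input naturally starts at the empty case, i.e. at 0.

```python
def sum_list(xs):
    """Recursively sum a list of numbers."""
    if not xs:                       # base case: empty list, size n = 0
        return 0
    return xs[0] + sum_list(xs[1:])  # inductive step: size n reduces to n - 1

# A correctness proof by induction on len(xs) has its base case at n = 0
# (the empty list), which is why treating 0 as a natural number is the
# comfortable convention when reasoning about algorithms.
print(sum_list([1, 2, 3]))  # prints 6
```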
Lua is just a based language. It has strong unpopular opinions and doesn't care what you think.
the "It's a (t/w)rap!" is really smart