It's true.
Calculators also say that dividing by 0 is an error, but logic says that the answer is infinite. (If I recall, it's more correctly 'undefined', but I'm years out of math classes now.)
That is, as you divide a number by a smaller and smaller number, the quotient increases: 1/.1 = 10, 1/.01 = 100, 1/.001 = 1000, etc. As the denominator approaches 0, the quotient approaches infinity. But you can't quantify infinity per se, which is why the result comes back as an undefined error.
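That pattern is easy to see in a few lines of Python (just a toy illustration of the shrinking denominator, not anything rigorous):

```python
# As the denominator shrinks toward 0, the quotient grows without bound.
for exp in range(1, 6):
    d = 10 ** -exp          # 0.1, 0.01, 0.001, ...
    print(f"1/{d} = {1 / d}")
```

Each step divides the denominator by 10 and multiplies the quotient by 10, so there's no finite value the sequence settles on.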
If someone who's a mathematician wants to explain this correctly, I'm all ears.
It approaches positive or negative infinity, depending on the sign of the denominator. A single result can't be two different values at once, so dividing by zero can't be defined.
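The sign point is easy to check numerically (a quick sketch comparing the two directions of approach):

```python
# Approaching 0 from the right (d > 0) vs from the left (d < 0):
# the quotients diverge in opposite directions.
for d in (0.1, 0.001, 0.00001):
    print(f"1/{d} = {1 / d}    1/{-d} = {1 / -d}")
# The right-hand values head toward +infinity and the left-hand
# values toward -infinity, so no single value fits 1/0.
```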
There are other reasons, too, but I forgot about them.