Why a kilobyte is 1000 and not 1024 bytes (zeta.one)
1024 is not the standard. The standard prefix for 1024 is "kibi" (symbol "Ki"), and the standard prefix for 1000 is "kilo", which it has been since 1795.
There was a convention to use kilo for 1024 in the early days of computing, since the "kibi" term didn't exist until 1998 (and took a while to become commonly used), but that convention was always recognised as an incorrect use of the term. People just didn't care much, especially since kilobytes were commonly rounded anyway. A 30,424 byte file is 29.7109375 kibibytes or 30.424 kilobytes... both will likely be rounded to 30 either way, so who cares if it's slightly wrong? Just use bytes if you need to know the exact size.
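For illustration, a quick Python sketch of that rounding point (the helper names are made up for this example):

```python
def to_kilobytes(n_bytes: int) -> float:
    """Decimal (SI) kilobytes: 1 kB = 1000 bytes."""
    return n_bytes / 1000

def to_kibibytes(n_bytes: int) -> float:
    """Binary kibibytes: 1 KiB = 1024 bytes."""
    return n_bytes / 1024

size = 30_424  # the file size from the example above
print(f"{size} bytes = {to_kilobytes(size)} kB = {to_kibibytes(size)} KiB")
# 30424 bytes = 30.424 kB = 29.7109375 KiB -- both round to ~30
```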
Also - hard drives, floppy disks, etc. have always referred to their size in base-1000 numbers, so if you were working with 30KB in the early days of computers it was very rarely RAM. A PDP-11 computer, for example, might have only had 8192 bytes of RAM (that's 8 kibibytes).
There are some places where the convention is still used, and it can be pretty misleading as you work with larger numbers. For example, 128 gigs equals 128,000,000,000 bytes (if using the correct 1000-based unit) or 137,438,953,472 bytes (if kilo/mega/giga = 1024).
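To make the divergence concrete, here's a small sketch (plain arithmetic, nothing assumed beyond the definitions above) of how the gap grows with each prefix step:

```python
# Gap between decimal (SI) and binary interpretations of 128 <prefix>bytes.
for prefix, power in [("kilo", 1), ("mega", 2), ("giga", 3)]:
    decimal = 128 * 1000 ** power  # correct SI meaning
    binary = 128 * 1024 ** power   # the old base-2 convention
    gap_pct = (binary - decimal) / decimal * 100
    print(f"128 {prefix}bytes: {decimal:,} vs {binary:,} bytes (+{gap_pct:.1f}%)")

# 128 kilobytes: 128,000 vs 131,072 bytes (+2.4%)
# 128 megabytes: 128,000,000 vs 134,217,728 bytes (+4.9%)
# 128 gigabytes: 128,000,000,000 vs 137,438,953,472 bytes (+7.4%)
```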
The "wrong" convention is commonly still used for RAM chips. So a 128GB RAM chip is significantly larger than a 128GB SSD.
I've never met anyone who actually uses the new prefixes for 1024 and the old prefixes to mean 1000.
That is not true. For a long time everything (computer-related) was in the base-2 variants. Then the HD manufacturers changed so their drives would appear larger than they actually were (according to everyone's notions of what KB/MB/GB meant). It was a marketing shrinkflation stunt.