What's some really unpopular opinion you have?
(self.asklemmy)
Interesting. I think joules are dumb and hate seeing them in games like Factorio. I'm sure joules are better in applications that don't concern the average person, but I'm a random idiot, not a scientist at NASA. Just show me kilowatt-hours so I don't have to do math in my head. One joule is already one watt-second; there's no point in making things more complicated.
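For anyone who wants the actual math: 1 kWh = 1000 W × 3600 s = 3.6 MJ, so it really is a single division. A minimal Python sketch (the 42 MJ accumulator figure is made up):

```python
# 1 J = 1 W*s, so 1 kWh = 1000 W * 3600 s = 3.6e6 J
def joules_to_kwh(joules: float) -> float:
    return joules / 3_600_000

print(joules_to_kwh(42_000_000))  # hypothetical 42 MJ charge -> ~11.67 kWh
```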
I can see the convenience in the Wh unit, because hours are so common and people prefer to think in terms of hours.
Joules and watts look nicer and are naturally compatible with each other. Non-technical people often confuse kW and kWh, so switching to joules would create a clearer distinction between energy and power. Many energy production facilities also report their annual output in MWh per year, which mixes two time units in the same package!
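To illustrate the two-time-units complaint: turning an annual MWh figure back into plain average power means dividing hours out of a year. A quick sketch, assuming a hypothetical plant reporting 500,000 MWh per year:

```python
HOURS_PER_YEAR = 365.25 * 24  # ~8766 h

def mwh_per_year_to_avg_mw(annual_mwh: float) -> float:
    # energy / time = average power; the hours cancel against the year
    return annual_mwh / HOURS_PER_YEAR

print(mwh_per_year_to_avg_mw(500_000))  # ~57.0 MW average
```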
Let's make a shitty unit trade: we'll change all watt-hour units to joules in exchange for completely banning bits per second as a unit of bandwidth. Converting megabits per second to the actually usable megabytes per second in my head is far more infuriating than any amount of joule shenanigans. Any takers?
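The head-math in question is just a factor of 8, for what it's worth. A tiny sketch (the 100 Mb/s line speed is an example):

```python
def mbit_to_mbyte_per_s(mbps: float) -> float:
    # 8 bits per byte; the SI "mega" prefix is the same on both sides
    return mbps / 8

print(mbit_to_mbyte_per_s(100))  # 100 Mb/s -> 12.5 MB/s
```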
Oh, I totally forgot about the chaos surrounding bits and bytes. Personally, I don’t really care which one we use, as long as it’s unified. Mixing Mb/s for transfer and MiB for storage is truly infuriating. Yes, even the prefixes need to be unified.
I don’t write programs in low-level languages such as assembly, so I don’t really see the benefit of base-1024 prefixes. If anyone here can convince me why binary prefixes are great, I’m listening. As far as I’m concerned, prefixes should be based on 1000, just like in SI.
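The gap between the two prefix conventions is exactly where the confusion bites: the same drive reads differently depending on which one your OS uses. A small sketch (the 500 GB drive is an example):

```python
def si_vs_binary_gb(n_bytes: int) -> tuple[float, float]:
    gb = n_bytes / 1000**3   # SI gigabyte (GB)
    gib = n_bytes / 1024**3  # binary gibibyte (GiB)
    return gb, gib

# A drive marketed as "500 GB" (SI) vs. what a GiB-reporting OS shows:
print(si_vs_binary_gb(500_000_000_000))  # (500.0, ~465.66)
```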
Having 8 bits in a byte is just another historical relic, and I see no reason to keep it. Systems have changed many times since the term was adopted, and back in the early days a single character of text fit in exactly 8 bits. I guess that mattered at the time, but why would it today? Maybe the programmers out there can tell us why we need two units for data.