A Century of “Shrill”: How Bias in Technology Has Hurt Women’s Voices
(www.newyorker.com)
Thanks, that's a fascinating read, especially the part about the decisions that distorted women's voices. It reminds me a bit of the choices made with early colour film stock, which was notoriously poor at rendering dark skin tones.
I'll attest that most modern-day cameras are still bad at exposing dark skin tones correctly.
This is especially true in the prosumer and professional camera world, where intelligent exposure algorithms are largely absent: the default metering, exposure, and white-balance algorithms wreck dark skin tones. You have to very consciously override those primitive defaults to take any decent pictures of darker-skinned people, and that takes extra time you may not have or may not have planned for (or you might not even know how to compensate if you've never had to take a great photo of a dark-skinned person).
In fact, all my cameras from the late 2010s have face detection, but none of them reliably detect my dark-skinned friend when the background light source is extremely bright (his face just turns into a black blob). Funnily enough, the most accurate way to expose for his face seems to be the most primitive metering mode available, "center weighted average", which is the same algorithm my 1970s camera with no electronics uses.
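For anyone curious what "center weighted average" metering actually does, here's a rough Python sketch (my own illustration, not any camera's firmware): it simply averages the scene's brightness, giving extra weight to pixels near the middle of the frame, and then suggests an exposure shift toward middle grey. No face detection, no scene analysis, which is exactly why it doesn't get fooled in the way the "smart" modes do.

```python
import numpy as np

def center_weighted_exposure(image, center_weight=3.0):
    """Center-weighted average metering sketch.

    `image` is an HxWx3 array of linear RGB values in [0, 1].
    Returns a suggested exposure adjustment in stops.
    """
    h, w, _ = image.shape

    # Approximate scene luminance (Rec. 709 weights).
    luma = 0.2126 * image[..., 0] + 0.7152 * image[..., 1] + 0.0722 * image[..., 2]

    # Radial weight map: highest at the centre of the frame,
    # falling off linearly toward the edges.
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.sqrt(((yy - h / 2) / (h / 2)) ** 2 + ((xx - w / 2) / (w / 2)) ** 2)
    weights = 1.0 + (center_weight - 1.0) * np.clip(1.0 - r, 0.0, 1.0)

    # Weighted average brightness of the scene.
    metered = np.average(luma, weights=weights)

    # Exposure correction (in stops) to bring the metered value
    # to roughly middle grey (~18% reflectance).
    return np.log2(0.18 / max(metered, 1e-6))
```

The weighting shape and the `center_weight` value here are assumptions for illustration; real cameras use their own proprietary weight patterns, but the idea is the same.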
Phones are where this technology has made leaps and bounds in the last five years: on-device AI processing and massive training databases of photos have put phones miles ahead of traditional cameras for "no knowledge required" photography of dark skin tones.