The dumbing down of technology and the erosion of competency over the past decade have led to a number of situations where sophisticated users are actually the ones being marginalised.
- PDFs are being wrapped in exclusive JavaScript garbage that only works for GUI users. Terminal users are losing the ability to simply download PDFs. Links to files with a “.pdf” extension are often not actually PDFs anymore - they are HTML with embedded JavaScript masquerading as PDFs. Mozilla is on board with this deception.
- Ethernet: most public libraries have disabled their ethernet ports. Some librarians even go apeshit when someone plugs into them (not understanding that it’s another way for wifi-less people to connect). You either subscribe to mobile phone service & disclose your number to pass the captive portal’s verification, or you can fuck off, as far as the library is concerned. And yes, people are generally okay with /public/ libraries excluding people this way.
- The value of compatibility is totally lost. Young network admins just assume everyone runs the same latest browser as them, and that everyone has a recent model smartphone. If you don’t buy a new phone every couple of years, they believe it’s your fault you’re excluded. The concept of designing and engineering for compatibility is a lost competency. Even as interoperability declines, the word “compatibility” itself is fading from use. Soon dictionaries will tag the entry for “compatibility” with “(rarely used)”.
- We can no longer access public services like court system search tools, business registries, and public library book catalogs from a text terminal. The drive to dumb everything down has led to fancy UIs that work with fewer clients.
- Resources are blocked unless you have a non-Tor IP address. Sophisticated users know better than to expose their personal IP addresses, and to expose to their ISP where they go. Sophisticated users are in such a small minority that it’s trivial to oppress them.
- Using asymmetric encryption to protect email payloads was a thing in the 90s. Who predicted that we would /devolve/ to 100% in-the-clear email payloads ~25 years later?
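The “.pdf” masquerade above is easy to check for yourself: a real PDF starts with the magic bytes “%PDF”, so four bytes of the downloaded file settle it. A minimal sketch (the two files here are stand-ins created locally; in practice you’d fetch the URL first, e.g. with curl):

```shell
# Stand-in for a genuine PDF download (real PDFs begin with "%PDF"):
printf '%%PDF-1.7 fake body' > paper.pdf
head -c 4 paper.pdf      # prints: %PDF

echo

# Stand-in for an HTML/JS wrapper served under a ".pdf" URL:
printf '<!DOCTYPE html><script>/* wrapper */</script>' > wrapped.pdf
head -c 4 wrapped.pdf    # prints: <!DO  -- not a PDF
```

The same check works on anything a “.pdf” link hands you, no GUI required.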
There are a lot more examples but to cut to the chase: How did we fuck this up?
Instead of teaching users to become sophisticated, as a society we just threw in the towel and decided people cannot be taught.. that they cannot even learn the speed and utility of terminals and keyboards. So we said “fuck it, give everyone a GUI and a mouse”. And now we are at a point where even the technicians themselves seem helpless without a GUI and mouse, so they are oblivious to the demographic of users who are slowed down by their UIs.
Then we decided: since everyone has a GUI and a mouse, throw graphical CAPTCHAs their way. Surely no one uses terminals anymore, right? And why stop there.. get rid of documents (simple HTML).. make every webpage an /application/ instead, because surely everyone can run any random JavaScript we shoot their way.
This is not to say low-tech users should be left behind. Indeed some people are truly incapable of using terminals, scripting, Tor, PGP, etc. The problem is that catering exclusively to the tech illiterate results in disempowering sophisticated users.
It parallels the situation where classroom instruction moves so slowly that the faster learners at the top of the class get bored, drop out of school, and waste their potential. I’m at a point where I’m fighting to retain an analog life because the digital workflows being pushed on us are so dumbed down that I just cannot accept being forced to click through shitty oppressive technology that forces interaction with tech giants and walled gardens.
If I could choose between broadband with today’s garbage (ads, CAPTCHAs, Cloudflare, anti-bot, anti-Tor, …) and 9600 baud dial-up to garbage-free text services that just work, I would choose the latter. I am serious about that.
Thanks. I’ll have a look at some of those approaches.
(edit) I used a feature in the KOMA-Script package to produce circles that reach the edge of the paper. I also used one of the approaches in your link to create a frame at the point where the /expected/ boundary is, so that if the frame has any missing lines it would indicate where the specs may be wrong. But I must say I don’t trust LaTeX to produce an accurate frame, because some lines end up closer to the edge than others even though I asked for 4.2mm on all sides.
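For what it’s worth, a minimal sketch of such a diagnostic frame with TikZ in overlay mode, positioned relative to the physical page rather than the text area (the 4.2mm inset is taken from your description; the class and colour are assumptions). It needs two compilation passes because of `remember picture`:

```latex
% Diagnostic frame 4.2mm inside every physical page edge.
% "remember picture, overlay" anchors the drawing to the page itself,
% independent of margins, so a misprinted frame points at the page
% geometry (or the printer), not the text layout.
\documentclass{scrartcl}            % KOMA-Script class, as in the post
\usepackage{tikz}
\usetikzlibrary{calc}
\begin{document}
\begin{tikzpicture}[remember picture, overlay]
  \draw[red, thin]
    ($(current page.south west) + ( 4.2mm,  4.2mm)$)
    rectangle
    ($(current page.north east) + (-4.2mm, -4.2mm)$);
\end{tikzpicture}
Test page.
\end{document}
```

One thing to check before blaming LaTeX for the uneven distances: many printers can’t print to the paper edge and silently rescale or shift the page (“fit to page”), which skews all four insets unevenly. Measuring the printed frame against a ruler, with any driver scaling disabled, distinguishes the two causes.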