So Far, AI Is a Money Pit That Isn't Paying Off
(gizmodo.com)
Automated mail sorting has been using AI to read post codes from envelopes for decades, only back then - pre hype - it was just called Neural Networks.
That tech is almost 3 decades old.
But was it using neural networks or was it using OCR algorithms?
I love people who talk about AI but don't know the difference between an LLM and a bunch of if statements.
At the time I learned this at Uni (back in the early 90s) it was already NNs, not algorithms.
(This was maybe a decade before OCR became widespread)
In fact, one of my coursework projects there was recognition of handwritten numbers with a neural network. The thing was amazingly good: our implementation actually had a bug, and it still managed to be almost 90% correct on the test data set, so it somehow mostly worked its way around the bug. And it was a small NN with no need for massive training sets (which is the main difference between Large Language Models and more run-of-the-mill neural networks), this at a time when algorithmic number and character recognition was considered a very difficult problem. (There's a rough sketch of that kind of tiny network at the end of this comment.)
Back then Neural Networks (and other stuff like Genetic Algorithms) were all pretty new, and using them in automated mail sorting was recent and not yet widespread.
Nowadays you have them doing stuff like face recognition, built into phones for unlocking...
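Not the original coursework code, obviously, but here's a minimal sketch of the kind of small network I mean: a tiny one-hidden-layer perceptron trained with plain backpropagation on scikit-learn's small 8x8 digits dataset (~1,800 samples). The layer sizes, learning rate, and epoch count are just illustrative guesses, not anything from the actual project, but a setup like this typically lands somewhere around the 90% ballpark on held-out digits.

```python
# A small multilayer perceptron for handwritten digit recognition,
# in the spirit of the early-90s coursework described above.
# Dataset, hidden-layer size, learning rate and epochs are illustrative choices.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Load and normalise the data: 64 inputs (8x8 pixels, values 0-16), 10 classes.
X, y = load_digits(return_X_y=True)
X = X / 16.0
Y = np.eye(10)[y]  # one-hot targets
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.25, random_state=0)

# One hidden layer of 32 units: only a few thousand weights in total.
W1 = rng.normal(0, 0.1, (64, 32))
b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 10))
b2 = np.zeros(10)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
n = len(X_train)
for epoch in range(1000):
    # Forward pass
    H = sigmoid(X_train @ W1 + b1)
    O = sigmoid(H @ W2 + b2)

    # Backpropagation with a squared-error loss
    dO = (O - Y_train) * O * (1 - O)
    dH = (dO @ W2.T) * H * (1 - H)

    # Full-batch gradient descent updates
    W2 -= lr * H.T @ dO / n
    b2 -= lr * dO.mean(axis=0)
    W1 -= lr * X_train.T @ dH / n
    b1 -= lr * dH.mean(axis=0)

# Evaluate on the held-out test set.
pred = sigmoid(sigmoid(X_test @ W1 + b1) @ W2 + b2).argmax(axis=1)
accuracy = (pred == Y_test.argmax(axis=1)).mean()
print(f"test accuracy: {accuracy:.1%}")
```

The point being: no GPUs, no massive corpus, just a few thousand weights and a modest training set, and it still does a job that was considered hard for hand-written algorithms at the time.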
Very interesting. Thanks for sharing!