To have the confidence of a white CS undergrad...
Who tf is this?
"How Batman Launders His Grudges Into the Public Record" by Penguin's Henchman #37, like dude, I spend way too much time sneering on yall and I've still never heard of mr Turdgrains or whatever.
In any case, whoever this is, @dgerard, you should start charging him rent for the privilege of having you live in his head.
Texas counties can’t pass their own ordinances, only cities can
We hate big central government, so we've eliminated all the small local governments as a precaution.
non-scam crypto twitter
Mythical Places and How to Find Them
It's called a function plot for a reason!
Also, speaking from experience trying to do any database work for large corporate clients, no data house is in order. It's basically saying "assume a spherical cow, then AI works".
Google Cloud Chief Evangelist
That cannot be an official title someone has, can it?
FML, I just caught it in the screenshot of the "iOS Development" course offered by Lambda
BLOCKCHAIN! Those bozos had to squeeze 4 years of CS education into 6-12 months, but they found ample room to teach about BLOCKCHAIN! There's literally nothing in that curriculum about databases, but they somehow fit in BLOCKCHAIN.
Also "Hash Tables and Blockchain" is like having a physics module called "Gravity and Juiceros"
"Nah" is a great reaction to any wall of text by this bozo, really.
It can’t be that hard
woo boy
I'm not even going to engage in this thread 'cause it's a tar pit, but I do think I have the appropriate analogy.
When taking certain exams in my CS programme, you were allowed to bring notes, but with two restrictions:
- they had to be handwritten;
- they had to fit on a single A4 page.
The idea was that you actually had to put a lot of work into making that page, since the entire material was obviously the size of a fucking book and not an A4 page, and you couldn't just print/copy it from somewhere. So you really needed to distill the information and make a mind map or an index for yourself.
Compare that to an ML model that's allowed to train on data for as long as it wants, as long as the result is a fixed-size set of parameters that helps it answer questions with high reliability.
It's not the same as an open book, but it's definitely not a closed book either. And LLMs have billions of parameters, literal gigabytes of data in their notes (back-of-envelope below). For comparison, the entire text of War and Peace is about 3 MB. An LLM is a library of trained notes.
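For the skeptical, the back-of-envelope math (the 7B-parameter / fp16 figures are my own assumed example, not any specific model's):

```python
# Rough size comparison: an LLM's "notes" vs. one very large novel.
params = 7_000_000_000           # assumed 7B-parameter model
bytes_per_param = 2              # assumed fp16 storage
model_bytes = params * bytes_per_param

war_and_peace_bytes = 3_200_000  # plain text is ~3 MB

print(f"model weights: {model_bytes / 1e9:.0f} GB")           # ~14 GB
print(f"War and Peace: {war_and_peace_bytes / 1e6:.1f} MB")   # ~3.2 MB
print(f"ratio: ~{model_bytes / war_and_peace_bytes:,.0f}x")   # ~4,375x
```

Thousands of novels' worth of distilled notes, and that's on the small end of current models.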
How are we still doing this, how