PartiallyApplied
Mark Rober sold out.
He’s watered down the actually interesting science in exchange for a more hype-driven style. A Mr. Beast wannabe.
Good lecturers exude a passion for, and an elegance with, the subject matter. Simplicity in complexity.
Mark Rober is a shell of the educator he once was. I would kill to have his inspirational pull. And yet, he has placed such a talent on the altar of vacant consumerism.
This GitHub page isn’t visible on my mobile device because the ads block the view.
The concept sounds truly interesting, but distribution is everything. The AdSense you have is probably not very profitable, and it is actively hurting your recognition. As politely but bluntly as possible: if you want appreciation and adoption, remove the advertisements. You’re selling us on your idea, not whatever bottom-of-the-barrel consumerism Google wants me to buy.
Behold: PRQL. I only know that it exists, not whether its errors are good. My SQL needs are simple, but perhaps for some complex data wrangling it could be nicer, I don’t know.
Perhaps the textbook example is Simpson’s Paradox.
This article goes through a couple of cases where, naively and in aggregate, conclusions appear supported, but when you correctly separate the data, those conclusions reverse themselves.
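To make that concrete, here’s a toy sketch in Python (the numbers are made up for illustration, not taken from the article): treatment B looks better overall, but treatment A does better within both severity groups.

    import pandas as pd

    # Made-up recovery data: 200 patients, two treatments, two severity levels.
    df = pd.DataFrame({
        "treatment": ["A"] * 100 + ["B"] * 100,
        "severity":  ["mild"] * 10 + ["severe"] * 90 + ["mild"] * 90 + ["severe"] * 10,
        "recovered": [1] * 9 + [0] * 1        # A, mild:   9/10 recover
                   + [1] * 54 + [0] * 36      # A, severe: 54/90 recover
                   + [1] * 72 + [0] * 18      # B, mild:   72/90 recover
                   + [1] * 3 + [0] * 7,       # B, severe: 3/10 recover
    })

    # Aggregated: B looks better (0.75 vs 0.63).
    print(df.groupby("treatment")["recovered"].mean())

    # Separated by severity: A is better in both groups (0.90 vs 0.80, and 0.60 vs 0.30).
    print(df.groupby(["severity", "treatment"])["recovered"].mean())

The aggregate rates and the per-group rates point in opposite directions, which is exactly the reversal the article describes.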
Another relevant issue is Aggregation Bias. This article has an example where conclusions about a population are the inverse of what holds for the individuals within that population.
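A similarly toy sketch of that reversal (again with invented numbers, not the article’s data): across groups, a higher average x goes with a higher average y, but within every single group, higher x goes with lower y.

    import numpy as np

    rng = np.random.default_rng(0)
    groups = []
    for center in (0.0, 2.0, 4.0):
        # Each group's x and y are centered on the same value...
        x = center + rng.normal(size=200)
        # ...but within the group, y falls as x rises.
        y = center - 0.8 * (x - center) + rng.normal(scale=0.3, size=200)
        groups.append((x, y))

    # Group-level (aggregate) view: the three group means line up positively.
    mean_x = [x.mean() for x, _ in groups]
    mean_y = [y.mean() for _, y in groups]
    print(np.corrcoef(mean_x, mean_y)[0, 1])   # ~ +1

    # Individual-level view: the correlation inside each group is strongly negative.
    for x, y in groups:
        print(np.corrcoef(x, y)[0, 1])         # ~ -0.9

Drawing the individual-level conclusion from the group-level numbers here would get the sign of the relationship exactly backwards.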
And the last one I can think of is the MAUP (Modifiable Areal Unit Problem), which deals with the fact that statistics are very sensitive to whatever process is used to divvy up a space. It is most commonly referenced in spatial statistics, but I believe it has broader implications.
This is not to say that you can never generalize, and indeed, often a big goal of statistics is to answer questions about populations using only information from a subset of individuals in that population.
“All models are wrong, but some are useful”
- George Box
The argument I was making is that the NYT will authoritatively draw conclusions without taking the individual into account, looking only at the population level, and not only is that oftentimes dubious, sometimes it’s actively detrimental. They don’t seem to me to demonstrate due diligence in mitigating the risk that comes with such dubious assumptions, hence the cynic in me left that Hozier quote.
“Wet sidewalks cause rain”
Pretty much. I never really thought about the causal link being entirely reversed, more that the chain of reasoning gets broken or mediated by some factor they missed, which definitely happens, but now I can think of instances where it’s totally flipped.
Very interesting read, thanks for sharing!
I feel this hard with the New York Times.
99% of the time, I feel like it covers subjects adequately. It might be a bit further right than me, but for a general US source, I feel it’s rather representative.
Then they write a story about something happening to low-income US people, and it’s just social and logical salad. When they report, it appears as though they look at data analytically instead of talking to people. Statisticians will tell you, and this is subtle: conclusions made at one level of detail cannot be generalized to another level of detail. For social issues, looking at data without talking with people is fallacious. The NYT needs to understand this, but meanwhile they are horrifically insensitive, bordering on destructive, at times.
“The jackboot only jumps down on people standing up”
- Hozier, “Jackboot Jump”
Then I read the next story and I take it as credible without much critical thought or evidence. Bias is strange.
I think many people don’t like it conceptually because the advertising for Brave is:
Built-in Privacy + Crypto + Ad Blocking
Firefox + uBlock Origin suffices well enough for most people. It’s stable, suits the purpose, and separates them from a company entangled with crypto.
Everyone is just trying to do their best to balance convenience with the social impact of their actions. People make change because they care, either altruistically or personally, but it always comes with some sort of personal cost. Putting your neck out there and trying to make a change is more important than any specific browser choice.
I’ve done a bit more searching, and it seems ltex-ls-plus is the best out there for LSP grammar checking. It’s 1000x better than nothing, though the false negative rate is a bit high for my taste :)
I’m not sure what kind of diagram you’re after, but Typst has CeTZ, which does graphing plus arbitrary drawing of shapes, paths, splines, etc.
Typst also has fletcher (“a maker of arrows”) for diagrams, which is my personal fave for the work I do.
Word definitely has its niche.
However, I find that for many of my tasks, LaTeX or Typst just makes sense. I don’t need to worry about out-of-date figures. I can customize styling instantly. I can track my changes with Git. Grammar checking is rough, though. LSP-like grammar checking would revolutionize my world lol.
I can personally attest to this: I transitioned to LaTeX from Word when Word wouldn’t handle equations correctly, or would crash when I had too many. It doesn’t matter if I can put out 50 Word equations faster than in LaTeX if I’m breaking my flow state to restart my editor.
They overlap in their ecosystem niches, but in no way is one a complete replacement for the other. LaTeX has a larger niche than Word, which makes it a really safe default.
“Nobody ever got fired for choosing React”