[-] Collectivist@awful.systems 4 points 3 weeks ago

This is another example of the dangers of wealth inequality. A lot of EAs tried to start a YouTube channel (e.g.), but the only one that could get funded was this one, the one promoting bitcoin and charter cities. Now this is the largest EA channel, attracting more of those types and signalling clearly that if you want to succeed in EA you gotta please the capitalist funders.

[-] Collectivist@awful.systems 3 points 1 month ago

I read the article; not a single mention of things like the research on stereotype threat in chess. I wish rationalists would crack open a sociology book at some point in their lives. They're so interested in social phenomena, but while Less Wrong has a tag for psychology (with 287 posts), history (245 posts), and economics (462 posts), they seem unwilling to look at sociology for explanations; it doesn't even have a tag on LW.

[-] Collectivist@awful.systems 3 points 8 months ago

No no, not the term (my comment is about how he got his own term wrong), just his reasoning. If you make a lot of reasoning errors, but two faulty premises cancel each other out, and you write, say, 17,000 words or sequences of hundreds of blog posts, then you're going to stumble into the right conclusion from time to time. (It might be fun to model this mathematically: can you err your way into being unerring? Unfortunately, in reality-land the number of premises an argument needs varies wildly.)
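
A minimal sketch of that model, purely as an illustration (the "every faulty premise flips the conclusion" rule, the 30% error rate, and the premise counts are my own assumptions, not anything from the posts being discussed):

```python
import random

def p_conclusion_correct(p_err: float, k: int, trials: int = 100_000) -> float:
    """Toy model: an argument has k premises, each independently wrong with
    probability p_err, and every error flips the conclusion. The conclusion
    comes out right only when the errors cancel in even numbers."""
    correct = 0
    for _ in range(trials):
        errors = sum(random.random() < p_err for _ in range(k))
        correct += (errors % 2 == 0)
    return correct / trials

# closed form for comparison: (1 + (1 - 2*p_err)**k) / 2
for k in (1, 2, 5, 20):
    print(k, round(p_conclusion_correct(p_err=0.3, k=k), 3))
```

With a 30% error rate per premise you still land on the right answer slightly more often than chance, but the edge shrinks toward a coin flip as the number of premises grows, which is why the "varies wildly" part matters.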

[-] Collectivist@awful.systems 3 points 8 months ago

Zack thought the Times had all the justification they needed (making it a Gettier case) since he thought they 1) didn't have a good justification but 2) also didn't need one. He was wrong about his second assumption (they did need a good justification), but also wrong about the first (they did have one), so the two errors cancelled each other out, and his conclusion that 'they have all the justification they need' is correct through epistemic luck.

The strongest possible argument supports the right conclusion. Yud thought he could just dream up the strongest arguments and didn't need to consult the literature to reach the right conclusion. Dreaming up arguments is not going to give you the strongest arguments, while consulting the literature will. However, one of the weaker arguments he dreamt up just so happened to also support the right conclusion, so he got the right answer through epistemic luck.

[-] Collectivist@awful.systems 4 points 9 months ago

I spend a lot of time campaigning for animal rights. These criticisms also apply there, but I don't consider them a strong argument in that case. EAs spend an estimated 1.8 million dollars per year (less than 1%, so nowhere near a majority) on "other longterm", which presumably includes simulated humans, but an estimated 55 million dollars per year (or 13%) on farmed animal welfare (for those who are curious, the largest recipient is global health at 44%, though it's worth noting that the more involved people are in EA, the less they seem to give to that compared to longtermist causes). Farmed animals "don’t resent your condescension or complain that you are not politically correct, they don't need money, they don't bring cultural baggage..." yet that doesn't mean they aren't a worthy cause. This quote might serve as something members should keep in mind, but I don't think it works as an argument on its own.
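
Back-of-the-envelope arithmetic behind those percentages (the implied total is my own back-calculation from the cited figures, not a number from any source):

```python
# If $55M is ~13% of tracked EA giving, the implied total is ~$423M.
farmed_animals = 55e6
implied_total = farmed_animals / 0.13

other_longterm = 1.8e6
print(f"implied total: ${implied_total / 1e6:.0f}M")
print(f"'other longterm' share: {other_longterm / implied_total:.2%}")  # ~0.43%, i.e. well under 1%
print(f"global health at 44%: ${0.44 * implied_total / 1e6:.0f}M")      # ~$186M
```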

[-] Collectivist@awful.systems 3 points 11 months ago

When the second castle (bought by ESPR with FTX-money) was brought up on the forum, Jan Kulveit (one of the main organizers of ESPR) commented:

Multiple claims in this post are misleading, incomplete or false.

He then never bothered to explain what the misleading or false claims actually were (and instead implied the poster had doxxed them). Then, under the post this thread discusses, he has the gall to comment:

For me, unfortunately, the discourse surrounding Wytham Abbey, seems like a sign of epistemic decline of the community, or at least on the EA forum.

I guess Jan doesn't think falsely implying that the person criticizing your chateau purchase is both a liar and a doxxer counts as 'epistemic decline'.

[-] Collectivist@awful.systems 4 points 11 months ago

It's also a way for the rich to subvert the democratic will of the people:

Let's say the people of Examplestan have a large underclass who live paycheck to paycheck and a small upper class who get their money from land ownership. The government is thinking of introducing a bill that would make its tax revenue come less from paychecks and more from taxing land value. Democracy advocates want to put it to a vote, but a group of futarchy lobbyists convince the government to run a conditional prediction market instead. The market question is "If we replace the paycheck tax with a land value tax, will welfare increase?". The large underclass has almost no money to bet that it will, while the small upper class bets a large chunk of its money that it won't. Predictably, more money is bet on welfare not increasing, and when the market closes, everyone gets their money back and the government decides not to implement the tax.
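
A back-of-the-envelope sketch of that scenario (all the numbers, and the simplification that the market's implied probability is just a stake-weighted average, are my own illustrative assumptions):

```python
# Minimal sketch of the Examplestan market: a small rich minority can
# swamp a large poor majority because "votes" are weighted by money staked.
underclass_bettors = 1_000_000   # live paycheck to paycheck
underclass_stake_each = 1        # $1 each on "welfare WILL increase"

upperclass_bettors = 1_000       # landowners
upperclass_stake_each = 10_000   # $10,000 each on "welfare will NOT increase"

stake_yes = underclass_bettors * underclass_stake_each
stake_no = upperclass_bettors * upperclass_stake_each

implied_p_yes = stake_yes / (stake_yes + stake_no)
print(f"money on YES: ${stake_yes:,}, money on NO: ${stake_no:,}")
print(f"market-implied P(welfare increases) = {implied_p_yes:.0%}")
# ~9%: the market "predicts" the tax won't help, so the bill gets shelved.
```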

[-] Collectivist@awful.systems 3 points 11 months ago

Since refusing a bet is seen as an admission of dishonesty, it's also a way to disadvantage an interlocutor with less money:

The marginal value of money decreases as you get more of it. A hundred dollars might be a vitally important amount of money for a poor person, and not even noticeable for a rich person. So if you bet against a person with less money you are wagering less of your happiness than they are. If they have health problems (and live in a country with bad healthcare) this bet increases their risk of death, which it doesn't for you. It seems to me that betting against someone who is poorer than you is morally dubious.
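
A minimal sketch of that point using the standard log-utility toy model (the utility function and the bankroll figures are illustrative assumptions, not anything precise):

```python
import math

def utility_loss(bankroll: float, stake: float) -> float:
    """Drop in log utility from losing `stake` out of a total `bankroll`."""
    return math.log(bankroll) - math.log(bankroll - stake)

# The same $100 loss costs the poorer bettor roughly 1000x more utility.
print(utility_loss(bankroll=1_000, stake=100))       # ~0.105
print(utility_loss(bankroll=1_000_000, stake=100))   # ~0.0001
```

So an "even odds" bet in dollars is nowhere near even odds in wellbeing.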

[-] Collectivist@awful.systems 3 points 11 months ago* (last edited 11 months ago)

No mention of the second castle either. And then Jan Kulveit says in this comment section:

For me, unfortunately, the discourse surrounding Wytham Abbey, seems like a sign of epistemic decline of the community, or at least on the EA forum.

While lying through his teeth in his comments on the post about the second castle.

[-] Collectivist@awful.systems 4 points 11 months ago

They are now starting to get favorably cited on the EA Forum too:

Lynn and Vanhanen collected IQ scores from various studies and made corrections, such as adjusting for the FLynn Effect, to produce their national estimates.

When a commenter cites a Wikipedia page which shows that 1) Lynn is a self-described scientific racist who systematically picked datasets that gave black people lower IQs, and 2) it's called the Flynn effect, not the FLynn effect, since Lynn didn't discover it, he responds:

A side point, but Wikipedia is politically biased. I intentionally capitalized the L to give credit as Richard Lynn's discovery preceeded Flynn's first publication. Although, his discovery was preceeded by Runquist.

[-] Collectivist@awful.systems 3 points 11 months ago* (last edited 11 months ago)

The incel apologetics posts at least tend to present themselves as one degree removed by being 'backlash to the backlash' (recent example); it's the comments that tend to get truly unhinged:

Nearly all of my sexual and relationship success involved an unmistakable element of RPing Neutral Evil.

But incels are defined by their failure to perform well in these games, and they usually have innate (genetic, personality defects) that make them easy targets for abuse (see what feminists like the ones quoted in this piece have to say about them).
