Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.
Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)


TracingWoodgrains’s hit piece on David Gerard (the 2024 one, not the more recent enemies-list one, where David Gerard got rated above the Zizians as lesswrong’s enemy) is in the top 15 lesswrong articles from 2024, currently rated at #5! https://www.lesswrong.com/posts/PsQJxHDjHKFcFrPLD/deeper-reviews-for-the-top-15-of-the-2024-review
It’s nice to see that, with all the lesswrong content about AI safety and alignment and saving the world and human rationality and fanfiction, an article explaining how terrible David Gerard is (for… checks notes, demanding proper valid sources about lesswrong and adjacent topics on wikipedia) won out and got voted above them all! Let’s keep up our support for dgerard!
Picking a few that I haven’t read but where I’ve researched the foundations, let’s have a party platter of sneers:
To add to your sneers… lots of lesswrong content fits your description of #9, with someone trying to invent something that probably already exists in philosophy, from (rationalist, i.e. the sequences) first principles, and doing a bad job of it.
I actually don’t mind content like #25, where someone writes an explainer on a topic. If lesswrong was less pretentious about it, more trustworthy (i.e. cited sources in a verifiable way and called each other out for making stuff up), and didn’t include all the other junk, then stuff like that would make it better at its stated goal of promoting rationality. Of course, even if they tried this, they would probably end up more like #47, where they rediscover basic concepts because they don’t know how to search the existing literature and cite it effectively.
#45 is funny. Rationalists and rationalist-adjacent people started OpenAI, which ultimately ignored “AI safety”. Rationalists then spun off Anthropic, which also abandoned the safety focus pretty much as soon as it had gotten all the funding it could with that line. Do they really think a third company would be any better?
Wonder if that was because it basically broke containment (it still wasn’t widely spread, but I have seen it in a few places, more than normal lw stuff) and went after one of their enemies (and people swallowed it uncritically; wonder how many of those people now worry about NRx/Yarvin and don’t make the connection).
The #5 article of the year was a crock of a few kinds of shit, and I have already spent too much time thinking about why.