

Absolutely. We already sanction Russian oligarchs for the same reasons; why should we treat the American ones any differently, honestly?
Wow, the term ‘epistemically humble’ really is ingenious. I don’t have to listen to any critics at all, not because I’m a narcissist, oh no, but because I’m so ‘epistemically humble’ that no one could possibly have anything left to teach me!
How much money would be saved by just funneling the students of these endless ‘AI x’ programs back to the humanities, where they can learn to write (actually good) science fiction to their heart’s content? Hey, finally a way AI actually leads to some savings!
That’s an AI governance PhD right there!
Bill Gates is having a normal one.
https://www.cnbc.com/2025/03/26/bill-gates-on-ai-humans-wont-be-needed-for-most-things.html
Wanting to escape the fact that we are beings of the flesh seems to be behind so much of the rationalist-reactionary impulse – a desire to one-up our mortal shells through eugenics, weird diets, ‘brain uploading’ and something like vampirism with the Bryan Johnson guy. It’s wonderful you found a way to embrace and express yourself instead! Yes, in a healthier relationship with our bodies – which is what we are – such changes would be considered part of general healthcare. From here in Europe, at least, the situation in the US sometimes appears particularly extreme, maybe a heritage of puritanical norms.
Reminds me of the stories of Soviet peasants during the rapid industrialization drive under Stalin who, having never before seen any machinery in their lives, would plead with and try to coax faulty machines as if they were their farm animals. But those were Soviet peasants! What structural forces are stopping Yud & co from outgrowing their childish mystifications? Deeply misplaced religious needs?
Isn’t it all a bit like Ludic’s writings on software engineering, which have been shared approvingly here a number of times? The profession is shit, office politics dominates actual work, most other people are NPCs who just go through the motions instead of moving mountains, etc. – but I bear the Spirit and dare to stand on higher ground! Or am I being dumb and getting stuck on superficial similarities here, discounting the substantive differences?
Incidentally, the only time I’ve seen Tracing Woodgrains pop up in my timeline is in retweets from one of the Decoding the Gurus podcast hosts, who had also previously palled around with EA-adjacent ‘intelligence researchers’ like Stuart Ritchie. Something to keep in mind for people who perhaps hold up that podcast, with its long-form episodes, as a benchmark for debunking IDW crankery.
Where did you get that impression? He says himself that he is not advocating against aid per se, but that its effects should be judged more holistically, e.g. that organizations like GiveWell should include the potential harms alongside the benefits in their reports. The overarching message seems to be one of intellectual humility: not to lose sight of the fact that the ultimate aim is to help another human being, who is in the end a person with agency just like you, not to feel good about yourself or to alleviate your own feelings of guilt.
The basic conceit of projects like EA is the incredible high of self-importance and moral superiority one can get blinded by when one views oneself as more important than other people by virtue of helping so many of them. No one likes to be condescended to; sure, a life saved with whatever technical fix is better than a life lost, but human life is about so much more than bare material existence – dignity and freedom are crucial to a good life. The ultimate aim should be to shift agency and power into the hands of the powerless, not to bask in being the white knight trotting around the globe, saving the benighted from themselves.
Many ordinary people lost their jobs and homes during the Great Recession, while no one at the top was ever held individually accountable. Knowing the dynamics of the tech world, I’d expect a hard crash to play out the same way.