I was wondering if someone here has a better idea of how EA developed in its early days than I do.
Judging by the link I posted, it seems like Yudkowsky used the term “effective altruist” years before Will MacAskill or Peter Singer adopted it. The link doesn’t mention this explicitly, but Will MacAskill was also a lesswrong user, so it seems at least plausible that Yudkowsky is the true father of the movement.
I want to sort this out because I've noticed that recently a lot of EAs have been downplaying the AI and longtermist elements within the movement and talking more about Peter Singer as the movement's founder. By contrast, the impression I get about EA's founding, based on what I know, is that EA started with Yudkowsky and then MacAskill, with Peter Singer only getting involved later. Is my impression mistaken?
Eh, the impression that I get here is that Eliezer happened to put “effective” and “altruist” together without intending to use them as a new term. This is Yud we’re talking about - he’s written roughly 500,000 more words about Harry Potter than the average person does in their lifetime.
Even if he had invented the term, I wouldn’t say this is a smoking gun of how intertwined EAs are with the LW rats - there’s much better evidence out there.
Oh my god. I should have realised that of course teenage Yud would be Like This, but looking through this archive is a trip.
He has quite possibly written more words about Harry Potter than She Who Shall Not Be Named, herself.
Thank you, that link is exactly what I was looking for (it also sated my curiosity about how Yudkowsky got involved with Bostrom and Hanson; I had heard they met on the extropian listserv but had never seen any proof).
EA as a movement was a combination of a few different groups (this account says Giving What We Can/80,000 Hours, GiveWell, and Yudkowsky's MIRI). However, the main early influx of people came from the rationalist movement, as Yud had heavily promoted EA-style ideas in the Sequences.
So if you look at surveys, right now a relatively small percentage (like 15%) of EAs first heard about it through LessWrong or SSC. But back in 2014 and earlier, LessWrong was the number one on-ramp into the movement (like 30%). (I'm sure a bunch of the other respondents heard about it from rationalist friends as well.) I think it would have been even higher if you go back earlier.
Nowadays, most of the recruiting is independent of the rationalists, so you have a bunch of people coming in and asking, what's with all the weird shit? However, they still adopt a ton of rationalist ideas and language, and the EA Forum is run by the same people as LessWrong. It leads to some tension: someone wrote a post saying that Yudkowsky is "frequently, confidently, egregiously wrong", and it was somewhat upvoted on the EA Forum but massively downvoted on LessWrong.
yeah, he totally did. EAs claiming otherwise are just incorrect.
remember that the original Roko's Basilisk post talked about the dilemma of being an "altruist", i.e. how to donate as much money as possible to MIRI (or SIAI, as it was called in 2010).
the terms were in extremely heavy use back then.
a bunch of the various Singer-inspired groups tried to work out a name for the whole thing, and they picked Yudkowsky’s coinage.
downplaying the AI and longtermist elements within the movement
and this has always been an issue - they’re a fucking embarrassment, but they also do a lot of the organisational work so it’s hard to get rid of them
This is gonna be really helpful next time someone tells me straight up that EA and rationalism are totally different things and just overlap by coincidence.
From what I remember, EA was popularized mainly through rationalist culture for like two decades.
Speaking of Big Yud, have there been any new developments with him since the fallout from his Time magazine article in March? It's not like there's been any international coalition to first-strike rogue data centers or whatever. Did everyone just kind of ignore him?
He’s been doing interviews on podcasts. The NYT also recently listed “internet philosopher” Eliezer Yudkowsky as one of the key figures of the modern artificial intelligence movement.
Well, you need motive and the ability to follow up on your threats (*) for it to be a real threat, and he certainly lacks the latter. We also have a tendency to ignore people who have the former but not the latter until it's too late. (Some of my recent Dutch-elections doomerism might be influencing this message.)
*: if you are privileged in society.