Researching Causality and Safe AI at Oxford.
Previously, founder (with help from Trike Apps) of the EA Forum.
Discussing research etc at https://twitter.com/ryancareyai.
How is EAIF performing on the value proposition it provides to funding applicants, such as the speed of decisions, responsiveness to applicants' questions, and applicants' reported experiences? Historically, your sister fund treated applicants pretty badly, and some were really turned off by the experience.
I guess a lot of these faulty ideas come from the role of morality as a system of rules for putting up boundaries around acceptable behaviour, and for apportioning blameworthiness more so than praiseworthiness. Similar to how the legal system usually gives individuals freedom so long as they're not doing harm, our moral system mostly speaks to harms (rather than benefits) from actions (rather than inaction). By extension, the basis of the badness of these harms has to be a violation of "rights" (things that people deserve not to have done to them). Insofar as morality serves as a series of heuristics for people to follow, having a negativity bias and an action bias is not necessarily wrong. It causes problems, however, if this distorted lens is used to make claims about intrinsic right and wrong, or to support the idea that non-existence is an ideal.
Another relevant dimension is that the forum (and Groups) are the most narrowly targeted at EAs, so they will be most sensitive to fluctuations in the size of the EA community, whereas 80k will be the least sensitive, and Events will be somewhere in between.
Given this and the sharp decline in applications to events, it seems like the issue is really a decrease in the size of, or enthusiasm in, the EA community, rather than anything specific to the forum.
It sounds like you would prefer that the rationalist community prevent its members from taking taboo views on social issues? But in my view, an important characteristic of the rationalist community, perhaps its most fundamental one, is that it's a place where people can re-evaluate the common wisdom, with a measure of independence from societal pressure. If you want the rationalist community (or any community) to maintain that character, you need to support the right of people to express views that you regard as repulsive, not just the views that you like. This could be different if the views were an incitement to violence, but proposing a hypothesis for socio-economic differences isn't that.
In my view, what's going on is largely these two things:
[rationalists etc] are well to the left of the median citizens, but they are to the right of [typical journalists and academics]
Of course. And:
biodeterminism... these groups are very, very right-wing on... eugenics, biological race and gender differences etc., but on everything else they are centre-left.
Yes, ACX readers do believe that genes influence a lot of life outcomes, and favour reproductive technologies like embryo selection, which are right-coded views. Such views are not restricted to the far right, however. Most people will choose to have an abortion when they know their child will have a disability, for example.
Several of your other hypotheses don't ring true to me. I think:
Nice, I'll look forward to reading this!