Note that some popular effective altruists have a little bit of tunnel vision when it comes to X-risks, which leads them to overfund X-risk-due-to-AI over something like, say, X-risk-due-to-asteroids. (X-risk-due-to-nanotech is being suitably supported by not funding nanotech.)

Plus, some people working on the AI X-risk problem are doing it for nothing, which means the state of funding is a bit weird.



> tunnel vision when it comes to X-risks, which leads them to overfund X-risk-due-to-AI over something like, say, X-risk-due-to-asteroids.

Risks from natural events like asteroids are actually quite well understood, and we have tight bounds showing the per-century risk is low. The natural-risks cause area probably deserves more funding at a global level, but EAs are definitely thinking about it. It's literally the first section in 80k's intro article about X-risks: https://80000hours.org/articles/extinction-risk/


