
Learn why AI safety matters

One of the most important things you can do to help with AI alignment and the existential risk (x-risk) posed by superintelligence is to learn about it. Here are some resources to get you started.

Videos

Websites

  • AISafety.com. The new landing page for AI safety. Learn about the risks, the communities, the events, the jobs, the courses, the ideas for mitigating the risks, and more!
  • AISafety.world. The whole AI safety landscape, with all the organizations, media outlets, forums, blogs, and other actors and resources.
  • IncidentDatabase.ai. A database of incidents in which AI systems caused harm.
  • NavigatingAIRisks.ai. A blog with articles on AI risks.
  • PauseAI.info. Check out the rest of the PauseAI site for loads of related information and resources, useful actions, expert quotes, short one-page flyers, related FAQs, and more.

Podcasts

Podcasts featuring PauseAI members can be found in the media coverage list.

Articles

If you want to read what journalists have written about PauseAI, check out the list of media coverage.

Books

Courses

Organizations

If you are convinced and want to take action

There are many things you can do. Writing a letter, going to a protest, donating money, or joining a community is not that hard! And these actions have a real impact. Even when facing the end of the world, there can still be hope and very rewarding work to do.

Or if you still don’t feel quite sure about it

Learning about the psychology of x-risk could help you.