Existential Risk

The hypothesis that substantial progress in artificial general intelligence (AGI) could someday result in human extinction or some other unrecoverable global catastrophe.