Put simply, if they are right, we are talking about an extinction-level event for humanity. If they are wrong, the world merely develops AI a little more slowly. There is far more risk in the former than in the latter.
We don’t know how to craft AI systems; we grow them and hope they do what we want. Even the experts don’t understand how they work well enough to build reliable safety mechanisms.
Humanity is allowing corporations to play at bestowing godhood upon an ever-growing alien intelligence, a mind so different from our own that we cannot grasp how it thinks, what it wants, or how it would go about getting it.
