Few people realize how insane the current AI situation is: companies are racing to build minds smarter than any person alive (they are spending more than $100B a year on it, and more each year); experts and company leaders themselves *say* this effort has a greater than 10% chance of causing human extinction; and … the companies are still doing it, as fast as they can.

The authors argue (readably, and with deep intellectual integrity and attention to detail) that this technology is more like 100% likely to kill us, and that we therefore shouldn’t build it.

I don’t say this sort of thing often; I usually try to stay out of political advocacy and political conversation, lest it make all of us scared or crazy. But for this one issue: please read this book, if you care what happens to human life on Earth. And please bring your ordinary common sense to it, and talk about it with your friends. Don’t let companies hypnotized by science and power decide the whole future for all of us forever, at least not without the rest of us noticing first.

(I also liked the narrator; he made the content easy to understand, and kept me from being too jostled by it emotionally. I found the audiobook easier to follow than the written book, though I often prefer audio anyway.)