This so-called "AI" that we are talking about right now is a language model. I can't imagine how a bunch of computers trained to write text could lead to our extinction... PS: Of course, it is possible that advances in the field could lead to the emergence of a true AI, but I don't think it will happen as soon as we (want to) believe.
After seeing the AI in the new Photoshop beta, watching it generate extended portions of existing images (and these were private images, not public ones), I am excited and a little bit afraid. The headline here has a point: this already goes beyond a casual text conversation. Once they hook this stuff up to utilities and, worse yet, weapons systems, if it decides it doesn't like us, I believe it is a legitimate concern. We've seen the movie too many times: I, Robot, The Terminator, etc. If AI gets control of enough systems and develops a bias for, say, preserving the environment, guess who it gets rid of first?
Is this Step One on the route to... extinction? Notice that with AI, a lot of comments come in the form of a question. https://www.businessinsider.com/bill-gates-comments-3-day-work-week-possible-ai-2023-11