On each of your paragraphs:
“I think we completely agree – there can be both a 20% threat of extinction and also the threat of climate change”
No, I don’t agree with this; that’s like saying the threat of an asteroid supplants the threat of climate change. The X-risk from AI does not invalidate AI’s other threats, and I disagree with anyone who thinks it does. I have not seen anyone use AI’s X-risk to invalidate its other threats, and I implore you not to let such an argument sway you for or against either threat – both are real (!).
I do not blame the gun; I blame the manufacturer. I am calling for more oversight of AI companies and for AI researchers to take this threat more seriously. If an AI apocalypse happens, it will of course be the fault of the AI development companies that did not take the threat seriously because they were blinded by profit. What did I say that made you think I was blaming the AI itself?