I don’t really think so. I’m not entirely sure why, but my gut feeling is that being good at impersonating a human in a text conversation doesn’t mean you’re any closer to creating a real AI.

  • SubversivoB
    22 years ago

    The problem is that we probe consciousness through language, and we’ve created machines capable of damn good language. We tend to think of language as a byproduct of consciousness: if I feel like I’m a being separate from the world, I can use language to order that world. Since AI focuses heavily on NLP, we have machines capable of using language and describing the world the way conscious beings do, but we have no way to tell the difference between an emulated consciousness and a real one.