I would. It's all in how they're made. They're made specifically to _look like someone's thinking_ without any of the actual thought, which is why they frequently turn out to be wrong about things, often in ways that only the people who really know the subject will detect. It's the ultimate bullshitter.
Yep. It was mostly hype that got us to this point. The people who made the "AIs" (because they're not really intelligent) had a financial interest in them seeming powerful and spooky; it made them seem more valuable. The more examples I see, the more it strikes me that they're really not all that different from a simple Markov text generator in ability, just with a very large corpus and a very large context buffer. I'd been sure there had to be more to them than that, but geez, I keep being proven wrong.
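For anyone who hasn't seen one, here's a minimal sketch of the kind of Markov text generator the comparison is about: record which words follow which in a corpus, then walk that table at random. The tiny corpus and the order-1 (single-word-context) chain are illustrative choices, not anything from a real model.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    chain = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=10, seed=0):
    """Walk the chain from `start`, sampling a recorded successor each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat the cat ate the fish"
chain = build_chain(corpus)
print(generate(chain, "the", length=8))
```

Sampling frequencies come for free here, since repeated successors appear multiple times in each list. The "very large corpus and context buffer" version would condition on longer word histories instead of a single word, but the mechanism is the same.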