I appreciate all the responses to this and have been thinking about my negative emotional reaction to the article. It might be related to my need for trust. I'd like to think that a writer has thought about her text. Sure, dishing out statistics can be done quite accurately by an algorithm, but I'm somewhat horrified to think that I might be reading a novel or opinion piece generated by algorithms. I hang up on robocalls, and I delete robot-generated spam (which is most likely all spam). Irrationally perhaps, I want to know that a human consciousness, a sentient being, has written a personal essay or editorial. The possibility that everything I read might have no human author creates distrust between me and texts.

The whole question of robot-authored texts somehow reminds me of 42 -- the answer to the question of life, the universe, and everything. Because a computer came up with the answer, it made no sense to the humans.

And yes, wasoxygen, Barthes says we should judge a text on its own merits regardless of how it was authored -- but that's only if we are engaged with the text and want to judge it, read it, understand it, identify with it. We become engaged because we somehow trust that it has a message for us. To those who want messages from robots, that's fine. Robots might well have something to say.

I loved the SNL clip, pseydtonne.