Monday, June 16, 2014

Be wary of news bearing breakthroughs

Last week, we had the big news of the Turing test being beaten. From Wikipedia, "The Turing test is a test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human." It was reported that a supercomputer fooled 30% of people into thinking it was a 13-year-old Ukrainian child. A few things are wrong with that. First, it was not a supercomputer; it was a program known as a chatbot. Chatbots are used in many, many places, from legitimate uses such as answering basic questions on the phone to questionable ones like spam bots trying to get you to click on a link. So what's the big deal?

My first encounter with the Turing test, or with a chatbot, was the famous Eliza. She was a simple program with set responses that would keep leading you on for more information. I did not have a supercomputer, just an old LC II. Eliza is still running today and is just as strangely fun as I remember her, and she can be used here.

Secondly, this is not a landmark. Chatbots have been fooling people for years now. There have been countless articles about this, and about other strange tests, like having two chatbots talk to each other while each one tries to figure out if the other is a chatbot. Screwy, isn't it? Heck, if you want to talk to a chatbot, try chatbot4u, a site where people create bots with preset answers, like the Hans Christian Andersen bot. Yes, there really is such a thing.

Today, you can read articles about this "groundbreaking test" with titles ranging as wide as "A computer just passed the Turing Test in landmark trial" to "Why The Turing Test Is Bullshit." I write this to do two things: to inform you about what a Turing test is, and to argue that we should question all "breakthroughs" much more closely. Lastly, let me recommend listening to this interview, from On The Media (a great podcast, by the way), about the news media's failure to ask the right questions.
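To show just how simple an Eliza-style program can be, here is a minimal sketch in Python. The rules and phrasings below are hypothetical stand-ins, not Weizenbaum's original script: the bot matches a pattern, reflects a fragment of your input back as a question, and falls back to a stock prompt when nothing matches. That's the whole trick.

```python
import re

# Hypothetical Eliza-style rules: (pattern, response template).
# {0} is filled with the text captured from the user's input.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."  # stock reply that keeps the user talking

def respond(user_input):
    """Return a canned, reflective response for one line of input."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            # Echo the captured fragment back, minus trailing punctuation.
            return template.format(match.group(1).rstrip(".!?"))
    return DEFAULT

print(respond("I feel nervous about the news"))  # Why do you feel nervous about the news?
print(respond("Nothing much happened today"))    # Please go on.
```

No understanding is involved anywhere: the program never models meaning, it only pattern-matches and reflects, which is exactly why fooling a human for a few minutes says so little about intelligence.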
If you're feeling down after the interview, try out Eliza. She is very calming.
