A "Nao" humanoid robot, made by Aldebaran Robotics to offer basic service information, moves during a presentation at a branch of the Bank of Tokyo-Mitsubishi UFJ (MUFG) in Tokyo, April 13, 2015. REUTERS/Thomas Peter/File Photo
Massachusetts Institute of Technology researchers think future real-life robots can learn a thing or two from Steve Carell and other sitcom stars.

MIT says a computer that binge-watched YouTube videos and TV shows such as "The Office," "The Big Bang Theory" and "Desperate Housewives" learned how to predict whether the actors were about to hug, kiss, shake hands or slap high fives -- advances that eventually could help the next generation of artificial intelligence function less clumsily.

Cue what one of the researchers, Vondrick, acknowledges were "random videos off YouTube." The team downloaded the videos and converted them into visual representations -- a sort of numerical interpretation of the pixels on a screen that the algorithm could read and search for complex patterns.

Humans, by comparison, make the right call 71 percent of the time. Six hundred hours of video sounds like a lot, but it's not really that much.
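The article does not describe the researchers' actual method, but the basic idea of a "numerical interpretation of pixels" can be sketched in a few lines. The toy frame, the function name, and the normalization below are all illustrative assumptions, not MIT's pipeline: each video frame's pixel values are simply flattened into a vector of numbers that a learning algorithm can operate on.

```python
import numpy as np

def frame_to_features(frame):
    """Illustrative sketch only (not the researchers' actual method):
    flatten an H x W x 3 RGB frame into a 1-D numeric feature vector,
    scaling pixel values from 0-255 down to the range [0, 1]."""
    pixels = np.asarray(frame, dtype=np.float64)
    return pixels.ravel() / 255.0

# A tiny fake 2x2 RGB "frame" standing in for a real video frame.
frame = [[[255, 0, 0], [0, 255, 0]],
         [[0, 0, 255], [255, 255, 255]]]

features = frame_to_features(frame)
print(features.shape)  # (12,) -- 2 x 2 pixels x 3 color channels
```

A real system would extract such representations from many frames per second across hundreds of hours of video, which is why the scale of the training data matters.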