Robots learn to read your lips.


Two studies, both from researchers at the University of Oxford’s Department of Computer Science, take different approaches to guiding robots to “read” speech and discern meaning without hearing a word. One team has developed a new artificial-intelligence system called LipNet that lip-reads silent video clips more accurately than professional lip-readers can. The study uses a data set known as GRID, which is made up of well-lit, face-forward clips of people reading three-second sentences. A second team, working with Google DeepMind, used a set of roughly 100,000 video clips taken from BBC television; these videos cover a much broader range of language, with far more variation in lighting and head position. Both experiments show AI vastly outperforming humans at lip-reading.
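For readers curious about what such a system might look like under the hood, below is a minimal sketch in PyTorch of a sentence-level lip-reading model in the spirit of LipNet: spatiotemporal convolutions over a stack of mouth-region frames, a recurrent layer over time, and per-frame character predictions that a CTC loss could align to the spoken sentence. The layer sizes, frame rate, and character set here are illustrative assumptions, not the published architecture.

```python
# Hypothetical sketch of a LipNet-style lip reader. Sizes are illustrative,
# not the architecture reported in the Oxford paper.
import torch
import torch.nn as nn

class LipReader(nn.Module):
    def __init__(self, num_chars=28):  # e.g. 26 letters + space + CTC blank (assumed)
        super().__init__()
        # 3D convolutions look at short windows of frames, capturing lip motion.
        self.conv = nn.Sequential(
            nn.Conv3d(3, 32, kernel_size=(3, 5, 5), padding=(1, 2, 2)),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),  # pool spatially, keep every frame
            nn.Conv3d(32, 64, kernel_size=(3, 5, 5), padding=(1, 2, 2)),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),
        )
        # A bidirectional GRU reads the per-frame features in order.
        self.gru = nn.GRU(input_size=64, hidden_size=128,
                          batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(256, num_chars)

    def forward(self, frames):
        # frames: (batch, channels, time, height, width)
        x = self.conv(frames)
        x = x.mean(dim=(3, 4))        # average over space -> (batch, 64, time)
        x = x.transpose(1, 2)         # -> (batch, time, 64)
        x, _ = self.gru(x)
        return self.classifier(x)     # per-frame character scores for CTC decoding

# A three-second clip at an assumed 25 fps: 75 frames of 64x128 mouth crops.
clip = torch.randn(1, 3, 75, 64, 128)
scores = LipReader()(clip)
print(scores.shape)  # torch.Size([1, 75, 28])
```

In a real training run, the per-frame scores would be paired with transcript text under a CTC loss, which is what lets the network learn sentence-level lip-reading without frame-by-frame labels.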