Facebook Taps Into Artificial Intelligence to Help Blind People See Photos

Facebook announced a new feature on April 5 that can describe aloud the contents of pictures to the blind and visually impaired. Facebook/Courtesy

In recent months, Silicon Valley companies like Microsoft and Twitter have unveiled new technology designed to accommodate the blind. On Tuesday, Facebook joined them with automatic alternative text, a new feature that uses artificial intelligence to describe aloud the contents of photos for visually impaired users. Built on Facebook's in-house object recognition technology, the feature can report how many people are in a photo, whether those people are smiling and how many people have liked the photo. Before automatic alternative text, a visually impaired person needed a sighted guide to describe timeline photos.
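The feature described above boils down to turning a list of recognized objects into a spoken sentence. The sketch below is a hypothetical illustration of that idea, not Facebook's actual code: the tag names, confidence scores and 0.8 threshold are all assumptions made for the example.

```python
# Hypothetical sketch: turn object-recognition tags into an alt-text
# sentence. Each tag is a (label, confidence) pair; only labels the
# model is reasonably sure about are included in the description.

def build_alt_text(tags, threshold=0.8):
    """Keep high-confidence labels and join them into one phrase."""
    confident = [label for label, score in tags if score >= threshold]
    if not confident:
        return "Image may contain: no description available."
    return "Image may contain: " + ", ".join(confident) + "."

detections = [("two people", 0.97), ("smiling", 0.92),
              ("outdoor", 0.85), ("dog", 0.40)]
print(build_alt_text(detections))
# prints: Image may contain: two people, smiling, outdoor.
```

A screen reader would then speak that string when the user's focus lands on the photo; the low-confidence "dog" tag is dropped rather than risk a wrong description.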

Much like the Google algorithms that beat the world's best Go player in South Korea, Facebook's technology is a self-learning neural network with billions of parameters. The computer learns to recognize objects by studying millions of examples.

Facebook estimates that more than 285 million people worldwide are blind or have some form of visual impairment. As the largest social media platform, Facebook aims to connect them with the rest of its network by making photos something they, too, can enjoy, according to a blog post.

"Every day, people share more than 2 billion photos across Facebook, Instagram, Messenger and WhatsApp," write Facebook software engineers Shaomei Wu and Hermes Pique and head of accessibility Jeffrey Wieland. "We want to build technology that helps the blind community experience Facebook the same way others enjoy it."

In addition to the introductory video on automatic alternative text, Facebook also released a video of blind users reacting to the technology. "I feel like I can fit in," says one user. "There's more I can do."

A week ago, Twitter unveiled a similar feature with which human users, not an AI program, can fill out descriptions for photos they tweet. Microsoft, which went all-in on chatbots during its Build 2016 conference last week, showed a video of one of its blind engineers navigating the streets of London and company meetings with assistance from an AI program connected to his smart glasses and smartphone.