Google program can automatically caption photos

A neural network-based system scores above average in writing natural captions

“Two pizzas sitting on top of a stove top oven,” is how a Google program described this image.

Next time you're stumped when trying to write a photo caption, try Google.

The search giant has developed a machine-learning system that can automatically and accurately write captions for photos, according to a Google Research Blog post.

The innovation could make it easier to search for images on Google, help visually impaired people understand image content and provide alternative text for images when Internet connections are slow.

In a paper posted on arXiv, Google researchers Oriol Vinyals, Alexander Toshev, Samy Bengio and Dumitru Erhan described how they developed a captioning system called Neural Image Caption (NIC).

NIC is based on techniques from the field of computer vision, which allows machines to see the world, and natural language processing, which tries to make human language meaningful to computers.

The researchers used two different kinds of artificial neural networks, which are biologically inspired computer models. One of the networks encoded the image into a compact representation, while the other network generated a sentence to describe it.
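To make that division of labor concrete, here is a minimal sketch of such an encoder-decoder pairing, written in PyTorch. Everything in it, from the ToyCaptioner name to the layer sizes, is an illustrative assumption rather than Google's actual code:

    import torch
    import torch.nn as nn

    class ToyCaptioner(nn.Module):
        # A toy version of the encoder-decoder idea behind NIC: one
        # network condenses the image into a compact vector, another
        # (an LSTM) emits the caption word by word. Sizes and names
        # are invented for illustration; this is not Google's model.
        def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
            super().__init__()
            # Stand-in "vision network": projects pre-extracted image
            # features (e.g., 2048-d from a pretrained CNN) down to
            # the word-embedding dimension.
            self.encoder = nn.Linear(2048, embed_dim)
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, image_features, captions):
            img = self.encoder(image_features).unsqueeze(1)  # (B, 1, E)
            words = self.embed(captions)                     # (B, T, E)
            seq = torch.cat([img, words], dim=1)             # image first, then words
            hidden, _ = self.lstm(seq)
            return self.out(hidden)                          # next-word scores

    model = ToyCaptioner(vocab_size=10000)
    feats = torch.randn(4, 2048)               # fake image features
    caps = torch.randint(0, 10000, (4, 12))    # fake caption token ids
    print(model(feats, caps).shape)            # torch.Size([4, 13, 10000])

Feeding the encoded image into the language network as if it were the first word is what lets a single recurrent model condition every generated word on the picture.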

The researchers' goal was to train the system to produce natural-sounding captions based on the objects it recognizes in the images.
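In practice, that training amounts to maximizing the probability of the human-written caption given the image, one word at a time. A toy sketch of the objective, again assuming PyTorch, with random tensors standing in for real model outputs and captions:

    import torch
    import torch.nn.functional as F

    # Fake stand-ins for the model's next-word scores and the human
    # caption; in a real system these come from networks like the
    # ones sketched above and from a labeled training set.
    vocab_size = 10000
    logits = torch.randn(4, 13, vocab_size, requires_grad=True)
    targets = torch.randint(0, vocab_size, (4, 13))

    # Maximizing the caption's likelihood is equivalent to minimizing
    # cross-entropy between predicted and actual words.
    loss = F.cross_entropy(logits.reshape(-1, vocab_size),
                           targets.reshape(-1))
    loss.backward()  # gradients flow back through both networks
    print(loss.item())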

NIC produced accurate captions such as "A group of people shopping at an outdoor market" for a photo of a market, but it also made minor mistakes, such as captioning an image of three dogs as showing two dogs, and major errors, such as describing a picture of a roadside sign as a refrigerator.

Still, the NIC model scored 59 on the Pascal dataset, where the previous state of the art was 25 and higher scores are better, according to the researchers, who added that humans score around 69. Performance was evaluated with BLEU, an algorithm that scores machine-generated text against human-written reference text.
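BLEU, roughly speaking, counts how many word sequences a machine caption shares with human-written references. A toy illustration, assuming the NLTK library is available; the sentences here are invented, not drawn from the paper:

    # Minimal BLEU-style comparison of a machine caption against a
    # human reference, using NLTK's implementation.
    from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

    reference = [["a", "group", "of", "people", "shopping", "at",
                  "an", "outdoor", "market"]]
    candidate = ["a", "group", "of", "people", "at", "an",
                 "outdoor", "market"]

    smooth = SmoothingFunction().method1  # avoids zero scores on short text
    score = sentence_bleu(reference, candidate, smoothing_function=smooth)
    print(round(score * 100))  # BLEU rescaled to the 0-100 range used above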

"It is clear from these experiments that, as the size of the available datasets for image description increases, so will the performance of approaches like NIC," the researchers wrote.


