Artificial Intelligence Could Soon Tell Stories Based on Images

A picture is said to be worth a thousand words, and for good reason: looking at a picture, we can describe it at length because of the memories and feelings it evokes. But have you ever imagined a computer looking at a picture and describing it as a human would, without losing any of the emotion and feeling? Few would have thought it possible, even in their dreams.

But that may soon happen, as researchers have started teaching specialised programs to describe images the way humans would. Microsoft Research scientists are developing a new artificial intelligence (AI) system to do exactly that: it would enable computers to describe images just by looking at them.

The researchers say their aim is not just to build a system that can list the items in a picture, but one that can also convey what appears to be happening and how it might make a person feel.


Margaret Mitchell, a computer researcher at Microsoft, stated: “The goal is to help give AIs more human-like intelligence, to help it understand things on a more abstract level – what it means to be fun or creepy or weird or interesting.”

The scientists also explained that they are building the AI much like the systems used for automated language translation; the only difference is that instead of translating one language into another, it translates images into words and sentences.
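The translation analogy can be pictured as an encoder-decoder pipeline: where a translation system encodes a source sentence into a vector and decodes a target sentence, an image-captioning system swaps in an image encoder. The toy sketch below only illustrates that shape; the function names, the trivial "encoder", and the vocabulary are all invented for illustration and have nothing to do with Microsoft's actual model, which uses deep neural networks at both ends.

```python
# Toy sketch of the encoder-decoder analogy described in the article.
# Everything here is illustrative: a real system would use a
# convolutional network as the encoder and a recurrent network
# as the decoder.

def encode_image(pixels):
    """Stand-in for an image encoder: reduce an image
    (a list of pixel rows) to a small feature vector."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [mean, max(flat), min(flat)]

def decode_caption(features, vocab, max_words=3):
    """Stand-in for a sequence decoder: emit one word per step
    conditioned on the feature vector (here, a trivial lookup)."""
    words = []
    for step in range(max_words):
        idx = int(features[step % len(features)]) % len(vocab)
        words.append(vocab[idx])
    return " ".join(words)

vocab = ["a", "cat", "sits", "on", "the", "mat"]
image = [[10, 200], [30, 40]]  # a 2x2 grayscale "image"
caption = decode_caption(encode_image(image), vocab)
print(caption)
```

The point of the shape, as the researchers describe it, is that only the encoder changes: the decoding side is the same word-by-word generation used in machine translation.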

The researchers are training the storytelling system using deep neural networks, computer systems that learn from examples. They show the system many examples of a particular object so that it can identify it accurately – for instance, many pictures of cats, so that it learns to detect and describe a cat.
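The learn-from-examples idea above can be sketched with a deliberately simple classifier: show it several labelled examples, then ask it about a new one. A real system would use deep neural networks; this nearest-centroid toy, with made-up two-number "features" standing in for images, only illustrates the principle.

```python
# Minimal sketch of learning from examples: after seeing several
# labelled feature vectors, the model assigns the closest label
# to a new one. All data below is invented for illustration.

def centroid(vectors):
    """Average a list of equal-length feature vectors."""
    return [sum(vals) / len(vals) for vals in zip(*vectors)]

def train(examples):
    """examples: {label: [feature vectors]} -> {label: centroid}"""
    return {label: centroid(vecs) for label, vecs in examples.items()}

def classify(model, vec):
    """Return the label whose centroid is nearest to vec."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], vec))

examples = {
    "cat": [[0.9, 0.1], [0.8, 0.2], [0.85, 0.15]],  # toy "cat" images
    "dog": [[0.2, 0.9], [0.1, 0.8], [0.15, 0.85]],  # toy "dog" images
}
model = train(examples)
print(classify(model, [0.82, 0.18]))  # near the cat examples
```

The more examples of each label the model sees, the more stable its centroids become, which is the same reason the researchers feed their network many pictures of each object.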


In this way, the researchers are fine-tuning the system by feeding it more than 8,100 new images and evaluating the stories it generates.

The upcoming system would come very close to humans in conveying feelings and emotions, further narrowing the gap between computers and humans.

Further details about the project will be presented at the annual conference of the North American Chapter of the Association for Computational Linguistics (NAACL), scheduled for June 12 to June 17 in San Diego, California.

