Thursday 25 August 2011

Artificial robots can assist humans, in particular disabled people

I just watched a TV show about ways to communicate with disabled people who have lost their hearing and speech. To be honest, sign language is really hard to understand for people who have never learned it. It would be great if an artificial robot could serve as a translator.


Nowadays, speech recognition can help blind people communicate with machines: for example, if a blind person wants to search for something on the internet, they just need to speak, and their words can be turned into search keywords. Object recognition can assist people with weak vision (see the LookTel product from IPPLEX), and it can even help sighted people learn about things they have never seen before. A similar question is how to help deaf and mute people whose vision is still fine. Since human gesture detection and tracking is an active research field, I hope that one day a robot will be able to understand sign language correctly and translate it into spoken language, and vice versa, so that the robot can serve as a bridge between hearing people and disabled people.
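To give a feel for the "speech to search keywords" step, here is a toy sketch: it assumes a speech recognizer has already produced a text transcript, and simply filters out common stop words to keep the content-bearing terms. The stop-word list and function name are my own illustrative choices, not any real product's API.

```python
# Toy example: turn a recognized speech transcript into search keywords
# by dropping common stop words. A real system would use a much larger
# stop-word list and smarter keyword extraction.
STOP_WORDS = {"i", "want", "to", "the", "a", "an", "for", "some",
              "please", "search", "find", "on", "internet"}

def transcript_to_keywords(transcript):
    """Keep only the content-bearing words of a spoken request."""
    keywords = []
    for word in transcript.lower().split():
        word = word.strip(".,!?")          # drop trailing punctuation
        if word and word not in STOP_WORDS:
            keywords.append(word)
    return keywords

print(transcript_to_keywords("I want to search for cheap flights to Paris"))
# keeps only "cheap", "flights", "paris"
```

The same pipeline shape (recognize, then extract keywords, then query) is what lets a blind user drive a search engine entirely by voice.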


One possible approach I can think of is for the robot to have a comprehensive database that stores the mapping between gesture symbols and natural language, pretty much like a dictionary; in addition, it should be able to learn new mappings to enrich its vocabulary. Image recognition and speech parsing technology would need to improve as well.
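The dictionary idea above can be sketched in a few lines. This is a minimal illustration, not a real system: the gesture symbols here are just placeholder string labels that a gesture-recognition front end would be assumed to produce, and the class and method names are my own.

```python
# A minimal sketch of a two-way gesture/language dictionary that can
# learn new mappings over time, as described in the text.
class GestureDictionary:
    def __init__(self):
        self._gesture_to_text = {}   # gesture symbol -> phrase
        self._text_to_gesture = {}   # phrase -> gesture symbol

    def learn(self, gesture_symbol, phrase):
        """Add a new mapping, enriching the robot's vocabulary."""
        self._gesture_to_text[gesture_symbol] = phrase
        self._text_to_gesture[phrase] = gesture_symbol

    def translate_gesture(self, gesture_symbol):
        """Gesture -> natural language (for the hearing person)."""
        return self._gesture_to_text.get(gesture_symbol, "<unknown gesture>")

    def translate_phrase(self, phrase):
        """Natural language -> gesture (for the deaf person)."""
        return self._text_to_gesture.get(phrase, "<no known gesture>")

d = GestureDictionary()
d.learn("SIGN:THANK-YOU", "thank you")
print(d.translate_gesture("SIGN:THANK-YOU"))  # -> thank you
print(d.translate_phrase("thank you"))        # -> SIGN:THANK-YOU
```

The two-way lookup is what would let the robot act as a bridge in both directions, and the `learn` method is the "enrich its capability" part: new signs can be added without changing the code.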


Hopefully, I will see this kind of robot in my lifetime.
