Editor’s Note: The following story is a guest contribution by robotics instructor Aaron Miller @aaromiller.
As a contribution to the robot revolution, in 1999 Japanese IT corporation NEC announced that it had been developing a prototype mobile personal robot, code-named R100, intended to be a companion for the home. The unit had two onboard cameras as 'eyes', giving it the ability to recognize people, and three microphones allowing the robot to understand voice commands. R100's button-less interface and round-shaped body made interactions more natural for the user. NEC saw such a platform as the future of computing for the home.
R100 was built on custom hardware and used four motors for movement on flat surfaces. With Internet connectivity built into the unit, R100 enabled users to send video messages via e-mail by prompting the robot with voice commands. R100 could also recognize people approaching it and talk to them, and if called, it would respond and turn to face the direction the voice came from. A user could prompt the robot to control various appliances, such as their TV and lighting.
In addition to custom hardware, R100 featured a range of software-based technologies, including image and voice recognition, movement decision-making, emotion modeling, and motor control. With all of these technical abilities, the robot could serve a wide range of human-robot interaction (HRI) applications, such as remote tutoring for children or providing care for sick or disabled people. R100 could also be used for emergency and home-security solutions.
NEC developed R100, with all of its features and its adorable appearance, to provide a "people-friendly" user experience. R100 was designed to be a character in its own right, offered in multiple color schemes and able to communicate and keep people company, anticipating the interest that would grow over the years in home robots that could socialize with people.