This Arduino Toy Robot Acts As A Robotic 'Parent'

Special thanks to Shan Shen for writing this guest post on her remote parenting robot.



Two problems come up in an age of ubiquitous screens. First, children tend to experience remote parenting as supervisory control. Second, traditional toys are being replaced by digital devices as children spend more time playing video games and checking Facebook. This raises a question: does it make a difference when remote parenting is implemented indirectly, through some other remote interaction? For instance, instead of parenting through video chat, how about parenting through a toy robot? Inspired by this, I made a robot that can be used as a mediator, helping parents outsource parenting control and mask their otherwise ever-present authority.


The robot has two distinct behavior modes. First, it is motion sensitive and reacts to its surroundings: when a child approaches, it responds immediately and starts to rotate in place. Second, once the parent takes control of the robot remotely through the control panel, the robot is further activated. In this case, the second behavior mode overrides the first and becomes dominant, and the robot moves forward at varying speeds to entertain the child.
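Conceptually, the arbitration between the two modes boils down to a simple priority check in the main loop. The sketch below is only an illustration of that logic, not the project's actual code: the pin numbers, speed values, and two-wheel drive are assumptions.

```cpp
#include <Servo.h>

const int PIR_PIN = 2;           // HIGH while motion is detected
Servo leftWheel, rightWheel;     // continuous-rotation servos (90 = stop)

int remoteSpeed = 0;             // 0 = parent inactive; >0 = requested forward speed
                                 // (in the real project this arrives via the cloud feed)

void setup() {
  pinMode(PIR_PIN, INPUT);
  leftWheel.attach(9);
  rightWheel.attach(10);
}

void loop() {
  if (remoteSpeed > 0) {
    // Remote mode overrides the ambient mode: drive forward at the requested speed.
    leftWheel.write(90 + remoteSpeed);
    rightWheel.write(90 - remoteSpeed);
  } else if (digitalRead(PIR_PIN) == HIGH) {
    // Ambient mode: a child approached, so rotate in place.
    leftWheel.write(120);
    rightWheel.write(120);
  } else {
    // Neither mode active: stop both wheels.
    leftWheel.write(90);
    rightWheel.write(90);
  }
}
```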


The mechanism of the robot is basically a combination of motion sensors and servo motors. Two PIR sensors detect motion in the surroundings, and four servo motors respond to the PIR sensors while also listening remotely to the control panel through Cloud Services. The LEDs serve as a time indicator that communicates which mode the robot is currently in.
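As a rough sketch of that sensing/actuation layer (the pin assignments and the choice to treat either PIR going HIGH as "motion" are assumptions, not the original wiring):

```cpp
#include <Servo.h>

const int PIR_LEFT  = 2;
const int PIR_RIGHT = 3;
const int MODE_LED  = 13;                // shows that the robot is activated

Servo wheels[4];                         // four continuous-rotation servos
const int SERVO_PINS[4] = {6, 9, 10, 11};

void setup() {
  pinMode(PIR_LEFT, INPUT);
  pinMode(PIR_RIGHT, INPUT);
  pinMode(MODE_LED, OUTPUT);
  for (int i = 0; i < 4; i++) {
    wheels[i].attach(SERVO_PINS[i]);
    wheels[i].write(90);                 // neutral = stopped
  }
}

void loop() {
  // Either PIR going HIGH counts as motion in the surroundings.
  bool motion = digitalRead(PIR_LEFT) == HIGH || digitalRead(PIR_RIGHT) == HIGH;

  digitalWrite(MODE_LED, motion ? HIGH : LOW);

  for (int i = 0; i < 4; i++) {
    wheels[i].write(motion ? 120 : 90);  // rotate while motion is detected, else stop
  }
  delay(50);
}
```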


The control panel, meanwhile, is designed for the parent. It exposes three parameters: switch, speed, and time. In particular, the linearly aligned LEDs show how much time remains in the parent's control session. When the LEDs on the control panel light up, the LEDs on the robot's body light up in sync.
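A hypothetical version of the panel logic, assuming a five-LED bar, a maximum session of five minutes, and arbitrary pin assignments, could look like this:

```cpp
const int SWITCH_PIN = 2;                      // on/off switch for parent control
const int SPEED_POT  = A0;                     // speed knob
const int TIME_POT   = A1;                     // session-length knob
const int LED_PINS[5] = {8, 9, 10, 11, 12};    // linearly aligned time-indicator LEDs

unsigned long sessionStart = 0;
unsigned long sessionLength = 0;               // in milliseconds
bool active = false;

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);           // LOW = switched on
  for (int i = 0; i < 5; i++) pinMode(LED_PINS[i], OUTPUT);
  Serial.begin(9600);
}

void loop() {
  if (digitalRead(SWITCH_PIN) == LOW && !active) {
    active = true;
    sessionStart = millis();
    sessionLength = map(analogRead(TIME_POT), 0, 1023, 0, 300000L);  // up to 5 minutes
  }

  unsigned long elapsed = millis() - sessionStart;
  if (active && elapsed >= sessionLength) active = false;

  // Light one LED per remaining fifth of the session; the robot mirrors this pattern.
  int lit = active ? map(sessionLength - elapsed, 0, sessionLength, 0, 5) : 0;
  for (int i = 0; i < 5; i++) digitalWrite(LED_PINS[i], i < lit ? HIGH : LOW);

  int speedValue = analogRead(SPEED_POT);      // 0-1023, destined for the cloud feed
  Serial.println(active ? speedValue : 0);     // placeholder for the actual cloud upload
  delay(200);
}
```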



I used two Arduino boards as the microcontrollers for the bot and the control panel. To make the two boards talk to each other, I combined the Netlab Toolkit (developed by Phil Van Allen, core faculty at Art Center College of Design) with Cloud Services. The analog values from the knob embedded in the control panel are sent out to the cloud service (Xively) and used as the input value that activates the robot's second behavior mode.
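On the robot side, the incoming feed value only needs to be mapped into a servo speed. The sketch below replaces the Netlab Toolkit / Xively link with a Serial read as a stand-in, and the 0-1023 to 0-90 mapping is an assumption rather than the project's actual scaling:

```cpp
#include <Servo.h>

Servo leftWheel, rightWheel;   // two of the drive servos, as a minimal example

void setup() {
  leftWheel.attach(9);
  rightWheel.attach(10);
  Serial.begin(9600);          // stand-in for the value fetched from the Xively feed
}

void loop() {
  if (Serial.available()) {
    int remoteValue = Serial.parseInt();                  // raw knob value, 0-1023
    int forwardSpeed = map(remoteValue, 0, 1023, 0, 90);  // assumed mapping to a servo offset

    // Any non-zero value activates the second behavior mode: drive forward.
    leftWheel.write(90 + forwardSpeed);   // continuous-rotation servos: 90 = stop
    rightWheel.write(90 - forwardSpeed);
  }
}
```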