
PUPPET SHOW

Gim the Giraffe

Meet Gim!


Overview:

Our assignment was to make a puppet that emotes and interacts with a person in front of it. The first mode of interaction is the high-five sensor. It is built from a Grove ultrasonic range sensor and detects when a hand (or other object) comes close. When the measured distance drops below a threshold, Gim hugs by swinging his arms inward, driven by two servo motors. Here is a picture of his skeleton without the puppet.
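
Here is a minimal sketch of that trigger logic in MicroPython on the pyboard. The signal pin, servo channels, arm angles, and the 10 cm threshold are illustrative assumptions, not our exact values:

import time
from machine import Pin, time_pulse_us
from pyb import Servo

SIG = 'X1'           # assumed signal pin for the Grove ultrasonic ranger
THRESHOLD_CM = 10    # assumed high-five trigger distance

left_arm = Servo(1)  # assumed pyboard servo headers
right_arm = Servo(2)

def read_distance_cm():
    # The Grove ranger shares one line for trigger and echo.
    pin = Pin(SIG, Pin.OUT)
    pin.value(0)
    time.sleep_us(2)
    pin.value(1)
    time.sleep_us(10)
    pin.value(0)
    pin = Pin(SIG, Pin.IN)
    pulse = time_pulse_us(pin, 1, 30000)  # echo width in microseconds
    return pulse / 58                     # roughly 58 us per cm round trip

def hug():
    left_arm.angle(60)   # swing both arms inward (placeholder angles)
    right_arm.angle(-60)
    time.sleep(2)
    left_arm.angle(0)    # return to rest
    right_arm.angle(0)

while True:
    if 0 < read_distance_cm() < THRESHOLD_CM:
        hug()
    time.sleep_ms(100)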


The second mode of interaction is Gim's ability to dance! This behavior runs on the EV3 brick, because it was the only system in our setup that could play an MP3 file. We installed a LEGO button in Gim's stomach to trigger the dance commands: when it is squeezed, Gim's entire platform moves (dances) while a Baby Shark MP3 plays on the EV3 brick.
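
As a rough sketch, the trigger loop could look like this in Pybricks MicroPython on the EV3. The ports, speeds, and file name are assumptions, and Pybricks' play_file expects a WAV file, so the MP3 would need converting first:

#!/usr/bin/env pybricks-micropython
from pybricks.hubs import EV3Brick
from pybricks.ev3devices import Motor, TouchSensor
from pybricks.parameters import Port
from pybricks.tools import wait

ev3 = EV3Brick()
belly_button = TouchSensor(Port.S1)  # assumed sensor port
left_wheel = Motor(Port.B)           # assumed motor ports
right_wheel = Motor(Port.C)

def dance():
    # play_file blocks until the clip finishes, so the song plays first here.
    ev3.speaker.play_file('baby_shark.wav')  # assumed file name
    for _ in range(4):
        left_wheel.run(200)    # wiggle the platform back and forth
        right_wheel.run(-200)
        wait(500)
        left_wheel.run(-200)
        right_wheel.run(200)
        wait(500)
    left_wheel.stop()
    right_wheel.stop()

while True:
    if belly_button.pressed():
        dance()
    wait(10)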


The following video shows Gim receiving a squeeze and dancing:


All of this movement is driven by the pyboard, which reads the Grove sensor and commands the servo motors.
The following video shows Gim getting a high five and hugging:


Aspects Not Included:

An additional feature we planned but did not finish was Gim's ability to drive around in response to someone steering through SystemLink. We did get the wheels fully responsive to SystemLink speed commands. We had a lot of issues getting the LEGO motors to spin at the same speed while driven by the pyboard, which we eventually traced to our code: it initially used the wrong Timer and channel names. After those were fixed the motors spun, but one of them still did not follow the SystemLink input, so we adjusted that motor's speed in our code to compensate.
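
For illustration, the PWM setup on the pyboard looks roughly like this, including the kind of per-motor speed adjustment described above. The timer, channels, pins, and trim factor are assumptions:

from pyb import Pin, Timer

# Timer 2 drives PWM on pins X1 and X2 (assumed wiring through a motor driver).
pwm_timer = Timer(2, freq=1000)
left_pwm = pwm_timer.channel(1, Timer.PWM, pin=Pin('X1'))
right_pwm = pwm_timer.channel(2, Timer.PWM, pin=Pin('X2'))

RIGHT_TRIM = 0.9  # assumed factor compensating for the slower motor

def set_speeds(left_pct, right_pct):
    # Duty cycle in percent; trim one side so both wheels match.
    left_pwm.pulse_width_percent(left_pct)
    right_pwm.pulse_width_percent(right_pct * RIGHT_TRIM)

set_speeds(50, 50)  # e.g. a forward command from SystemLink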

In the first couple of iterations of Gim, we used an OpenMV camera instead of the Grove range sensor. Our idea was to have the OpenMV detect when a face was in view and have Gim hug once it saw someone. This worked when we held the OpenMV by hand, but mounting the camera on Gim made the face detection less reliable. We decided to replace it with the Grove range sensor to make the whole system more dependable.
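
For reference, that first version can be sketched with OpenMV's built-in Haar cascade face detector; the hand-off to the pyboard is just a print here, since the exact link between the two boards is not shown above:

import sensor, image

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)  # Haar detection runs on grayscale
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

face_cascade = image.HaarCascade("frontalface", stages=25)

while True:
    img = sensor.snapshot()
    faces = img.find_features(face_cascade, threshold=0.75, scale_factor=1.25)
    if faces:
        # Here the pyboard would be signaled to hug (e.g. over UART).
        print("face at", faces[0])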

The last aspect not included is an introductory talk from Gim in a baby voice. We created the voice using an app on our phones. The idea was to have Gim describe his features whenever a face was in view, but that turned out to be very hard to do. We then tried making him talk when the button in his tummy was squeezed, but the sound quality was so low that people would not have been able to understand what he was saying.

Lessons Learned:

The first lesson we learned was that combining all of our code never works as planned. Each piece of code worked well on its own, but when we combined everything we ran into unexpected errors. Also, our OpenMV code worked well when it was powered from our laptop; after we connected everything properly and powered the pyboard and camera from a battery, they still worked, but not as well as when plugged into the computer. To have a more reliable system, we decided to go with the Grove distance sensor.

The second lesson was that we wanted to include too many features, and we knew they might not all work well together. So we made a prioritized list of the features we wanted and moved on to the next item only once all of the previous ones worked the way we wanted.


Our last lesson was learning to give up on a feature that works, but not reliably. Our remote-control feature worked, but once we made the EV3 car more compact and added the box on top, the weight and friction became a problem: keeping the wheels set up for remote control prevented Gim from dancing. To keep the dancing feature, we gave up remote control and reduced the friction on the wheels that would have supported it.

Overall, although we could not include every feature we wanted, we realized we were capable of using everything we had learned since the beginning of the semester.

Programs used:

  • Python

  • nScope

  • SystemLink

  • LEGO Mindstorms EV3

The code we used can be found at this GitHub link:
