As Arthur C. Clarke’s third law states, “Any sufficiently advanced technology is indistinguishable from magic.” Well, even though I wrote the software for this, I can’t shake the feeling it is magic.
Ok, so the setup is as follows:
You put a white piece of paper on your desk. You take a black marker and draw a shape. It can be anything, as long as it encloses an area. You can then press the shape, and it acts like a button. I made the button make a sound, but it could do anything.
Magic, right? This is what happens in cartoons – you draw an eject button, then you hit it and you eject. Can this really work in real life?!?
Or, if you would like a more musical version, and are a fan of young Eddie Murphy movies:
Anyway, I did this by:
- detecting the table using surface normals
- isolating the white area on the table
- detecting dark areas on the table and removing points near any dark pixel
- segmenting the remaining points using Euclidean distance
- the resulting clusters represent the buttons
- tracking the buttons between point cloud messages, so that if the paper moves by a small amount the associations are not lost
- detecting the presence of a hand by searching for points in the area above the table
- if a hand is detected, finding which button has a point over it that is closest to the table
- counting it as a button hit if that point is within a centimeter of the table
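The segmentation step is standard Euclidean cluster extraction (in the actual demo this is done with PCL on the ROS point cloud). Here is a minimal Python sketch of the idea; the function name and the 2 cm tolerance are my own choices, not from the demo code:

```python
import numpy as np

def euclidean_cluster(points, tolerance=0.02):
    """Group points into clusters where each point is within
    `tolerance` meters of some other point in its cluster.
    Simple flood fill; fine for a few hundred points."""
    points = np.asarray(points, dtype=float)
    labels = -np.ones(len(points), dtype=int)  # -1 = unassigned
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        stack = [seed]
        labels[seed] = current
        while stack:
            i = stack.pop()
            dists = np.linalg.norm(points - points[i], axis=1)
            for j in np.nonzero((dists < tolerance) & (labels == -1))[0]:
                labels[j] = current
                stack.append(j)
        current += 1
    return labels
```

Each label that comes out of this corresponds to one drawn button.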
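The tracking step only needs to survive small shifts of the paper, so a simple nearest-centroid association between frames is enough. A sketch of that idea (names and the 5 cm threshold are assumptions, not the demo's actual values):

```python
import numpy as np

def associate_buttons(prev_centroids, new_centroids, max_shift=0.05):
    """Match each new cluster centroid to the nearest previous one,
    so a slight shift of the paper keeps button identities stable.
    Returns a dict: new index -> previous index (-1 if unmatched)."""
    matches = {}
    for i, c in enumerate(new_centroids):
        dists = [np.linalg.norm(np.asarray(c) - np.asarray(p))
                 for p in prev_centroids]
        j = int(np.argmin(dists)) if dists else -1
        matches[i] = j if (j >= 0 and dists[j] < max_shift) else -1
    return matches
```

A cluster that matches nothing from the previous frame is treated as a new button.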
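The hit test in the last two steps boils down to measuring the signed height of hand points above the table plane and thresholding at one centimeter. A minimal sketch, assuming the table is described by a point on it and its normal (the function names are mine):

```python
import numpy as np

def height_above_table(point, table_point, table_normal):
    """Signed distance of a point from the table plane, in meters."""
    n = np.asarray(table_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return float(np.dot(np.asarray(point, dtype=float) - table_point, n))

def button_hit(hand_points, table_point, table_normal, threshold=0.01):
    """True if the lowest hand point over the button is within
    `threshold` meters (1 cm) of the table."""
    heights = [height_above_table(p, table_point, table_normal)
               for p in hand_points]
    return min(heights) < threshold
```

With the table at z = 0, a fingertip point at z = 5 mm registers as a hit, while a hand hovering at 5 cm does not.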
One thing I recommend for doing demos like this and the piano demo is SFML, the Simple and Fast Multimedia Library. It allows you to play .wav files with an imperceptibly small delay, and the sounds can overlap however much you want. The interface is very simple as well.
Check out the code for this demo at http://www.ros.org/wiki/mit-ros-pkg/KinectDemos/ImpromptuButton