A lot of times we wait around for technology to just “appear” like we see in films. But if we continue to do that, nothing will get built. Thankfully, people are out there filling in the gaps.
Zigelbaum showed me what he was working on during the first night of MIT’s Futures of Entertainment 3. When he told me about it, I knew it was exactly the thing I had hoped to see when the opportunity to visit MIT presented itself.
All of this was apparently built off of the original prototype system used in Minority Report. What made me happy was that the UI was exceedingly smooth and enjoyable to use. It was a relief. A complete and total relief.
A tremendous thanks to Jamie for letting me take pictures and showing me the lab. He is a very interesting and awesome person and I highly recommend his existence to you.
These are the new gloves (he said that eventually they want the gloves to be unnecessary, with the device itself able to recognize gestural movement).
G-stalt – the rules for interacting with the system: a new sign language of sorts for controlling movement from a distance. It was great to have these up on the wall when I tried on the gloves to interact with the system. Very intuitive and simple to learn, especially with the reward of being able to move things across the room without touching them.
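To give a flavor of how a gestural vocabulary like this works, here is a toy sketch of mapping recognized gestures to system commands. Every gesture and command name below is hypothetical – the real g-stalt vocabulary was posted on the lab wall, not reproduced here.

```python
# Toy gesture-to-command table, loosely in the spirit of a gestural
# "sign language." All names are made up for illustration.
GESTURE_COMMANDS = {
    "pinch": "grab_object",
    "open_palm": "release_object",
    "swipe_left": "move_object_left",
    "swipe_right": "move_object_right",
    "two_hand_spread": "zoom_in",
    "two_hand_squeeze": "zoom_out",
}

def interpret(gesture: str) -> str:
    """Map a recognized gesture to a command; unknown gestures do nothing."""
    return GESTURE_COMMANDS.get(gesture, "no_op")

print(interpret("pinch"))  # -> grab_object
print(interpret("wave"))   # -> no_op
```

The appeal of a fixed vocabulary like this is exactly what made the wall chart useful: a small, learnable set of signs rather than free-form motion.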
Sensor/Cam: part of the set of twelve needed for full control of the system. Fewer can be used, but the resolution of movement tracking suffers with each lost sensor.
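As a toy illustration of why tracking degrades with fewer sensors (this is not the lab's actual algorithm): if each sensor contributes an independent noisy estimate of a position, averaging N of them shrinks the error roughly by a factor of 1/√N, so dropping sensors makes the tracked position jitter more.

```python
import random

def tracking_error(num_sensors: int, noise: float = 1.0,
                   trials: int = 20000) -> float:
    """Mean absolute error of a position estimate averaged over
    `num_sensors` independent noisy readings (true position = 0)."""
    random.seed(42)  # deterministic for illustration
    total = 0.0
    for _ in range(trials):
        estimate = sum(random.gauss(0.0, noise)
                       for _ in range(num_sensors)) / num_sensors
        total += abs(estimate)
    return total / trials

for n in (12, 6, 3, 1):
    print(f"{n:2d} sensors -> mean error {tracking_error(n):.3f}")
```

Running this shows the twelve-sensor estimate is noticeably steadier than the one-sensor estimate, which matches the intuition that each lost camera costs you resolution.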
Here are Some Moving Pictures For You
And here’s the tiny amount of video footage I was able to get (shot in chunks, in between furiously deleting old videos from my camera to make room for new ones). Hopefully it conveys the excellence of the system.
Zigelbaum also showed me something secret that I can’t say anything about (yet). Let us just say that it was very cool, and that it will be public soon enough. You can read more about Gesture Recognition on Wikipedia if you want. Or you could run into me somewhere and hear a lot of hot air on what I’ve been calling “8 bit haptics”.
Anywho, this stuff rocked. The Media Lab was ultimate. I can’t wait to go back.
Jamie Zigelbaum is a Ph.D. student in the Tangible Media Group at the MIT Media Lab. His research interests include the social implications of physical interface media, frameworks for next-generation interfaces, and tangible interfaces for abstract digital information. He received a B.A. from Tufts University, working with Professor Rob Jacob. At Tufts he created a multidisciplinary major in HCI, drawing from neuroscience, psychology, computer science, and human factors engineering.
Amber Case is a Cyborg Anthropologist who also posts over at the Makerlab Blog, which is something you might enjoy reading if you enjoyed reading the post above. It’s about more experimental tech and activities related to pushing the limits of art and technology. If not, you can always follow her on Twitter @caseorganic.