(From the SIFTABLES site @ MIT, by David Merrill):

Imagine overturning a container of nuts and bolts, then looking through the resulting pile for a particular item. Or spreading photographs out on a tabletop and beginning to sort them into piles. During these activities we interact with large numbers of small objects at the same time, using all of our fingers and both hands together. We humans are skilled at using our hands in these ways, and can effortlessly sift and sort, focusing on our higher-level goals rather than on the items themselves.

Siftables aims to enable people to interact with information and media in physical, natural ways that approach our everyday interactions with physical objects. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media, and can be used to implement any number of gestural interaction languages and HCI applications. The Siftables interaction platform is a collaboration with Jeevan Kalanithi.

Videos: (available at the link)

Selected images (others are available at the link above):
Siftables can sense their neighbors, allowing applications to utilize topological arrangement.
No special sensing surface or cameras are needed.
Exploded view of a siftable module.
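To make the neighbor-sensing idea concrete, here is a minimal sketch of how an application might merge per-module neighbor reports into a topology it can query. This is purely illustrative: the function names, the report format, and the four-sided sensing model are assumptions, not the actual Siftables API.

```python
# Hypothetical model of Siftables neighbor sensing: each module reports
# which module IDs it detects on its sides; the application merges these
# reports into an adjacency graph and queries the arrangement.
# All names here are illustrative, not the real Siftables interface.

from collections import defaultdict

SIDES = ("left", "right", "top", "bottom")

def build_topology(reports):
    """Merge per-module side reports into an undirected adjacency map.

    reports: {module_id: {side: neighbor_id}}
    """
    graph = defaultdict(set)
    for module, sides in reports.items():
        graph[module]  # register even isolated modules
        for side in SIDES:
            neighbor = sides.get(side)
            if neighbor is not None:
                graph[module].add(neighbor)
                graph[neighbor].add(module)  # sensing assumed symmetric
    return dict(graph)

def is_chain(graph, order):
    """Check whether the modules in `order` sit adjacent in sequence."""
    return all(b in graph.get(a, ()) for a, b in zip(order, order[1:]))

# Example: three modules placed side by side as A-B-C.
reports = {
    "A": {"right": "B"},
    "B": {"left": "A", "right": "C"},
    "C": {"left": "B"},
}
topology = build_topology(reports)
print(is_chain(topology, ["A", "B", "C"]))  # adjacent in this order
print(is_chain(topology, ["A", "C", "B"]))  # A and C are not adjacent
```

An application built on such a graph could, for instance, trigger a sorting gesture when two piles of photo modules are pushed together, reacting to the arrangement rather than to any single device.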