While robots that are easier to program and integrate alongside humans are spreading rapidly, they are still missing some sensors that could let them handle the randomness associated with real-world applications. Many robots have sight, but they are unable to actually feel the things they are touching. There are a number of promising technologies attempting to add touch, and this invention looks like an inexpensive and easy-to-implement solution. In the video, a Baxter robot is equipped by researchers at MIT and Northeastern University with a novel tactile sensor that lets it grasp a USB cable and insert it into a USB port. Their sensor is called GelSight, which is basically a clear plastic cube with a layer of rubber on one side, coated with metallic paint.

The four walls of the cube adjacent to the sensor face conduct light of different colors toward the rubber. When the rubber is deformed, light bounces off the metallic paint and is captured by a camera mounted on the cube. From the different intensities of the colored light, algorithms can infer the three-dimensional structure of the ridges or depressions of the surface being touched. Below, the sensor can be seen resolving the intricate ridges of a human fingertip. That is pretty good resolution, and more than enough for most tasks a robot would be doing.
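The idea of recovering shape from differently colored lights is classic photometric stereo: under a simple Lambertian reflectance assumption, each color channel's intensity is the dot product of a known light direction with the unknown surface normal, so three or more channels let you solve for the normal at every pixel. The sketch below is not the researchers' actual pipeline, just a minimal illustration of that principle; the light directions and intensities are made up for the example.

```python
import numpy as np

# Hypothetical unit vectors pointing from the gel surface toward each
# colored light source (three lights, roughly 120 degrees apart).
L = np.array([
    [ 1.0,  0.00, 1.0],
    [-0.5,  0.87, 1.0],
    [-0.5, -0.87, 1.0],
])
L /= np.linalg.norm(L, axis=1, keepdims=True)

def estimate_normal(intensities):
    """Recover a surface normal from per-color intensities at one pixel.

    Lambertian model: I = L @ n, so n is the (least-squares) solution
    of the linear system; more than three lights just overdetermines it.
    """
    n, *_ = np.linalg.lstsq(L, np.asarray(intensities, dtype=float),
                            rcond=None)
    return n / np.linalg.norm(n)

# A flat patch of gel lit equally by all three lights should produce a
# normal pointing almost straight back at the camera (the +z axis).
flat = estimate_normal([1.0, 1.0, 1.0])
```

Once a normal is estimated at every pixel, the depth map of the touched surface can be reconstructed by integrating the normals, which is how fine structure like fingerprint ridges becomes visible.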


This sensor is cool, but is it actionable? The answer here appears to be yes. The example with the Baxter robot shows how it is able to function in an environment with changing variables. This makes it a lot easier to manipulate items that are not in a uniform position or that require fine adjustments. The ability to feel means that robots can function in more varied and fluid situations.


So most modern robots have eyes (cameras) and ears (microphones), but are generally missing touch, taste, and smell. There are machines that can identify smells, and there is actually one mandated to be in every residence in the USA: a smoke detector. While smell is important for specific chemicals, it's not very useful for most robots, and neither is taste. Arguably more important than taste and smell, and even sight and hearing, is touch, so adding this sense to robots could be game changing. The more robots can model human senses, the more they can model human abilities, and the more they can directly replace humans without changing systems and workflows. I hope we see this sensor or other versions come to market that let us build robots that can not just manipulate the world, but feel it.