New MIT robot can identify objects by sight and touch


  • The project
  • The robot

It is easy for humans to predict how an object will feel simply by looking at it, or to tell what an object looks like simply by touching it, but for machines this remains a significant challenge. Now, a new robot developed at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) is attempting to do just that.

The project

The team took a KUKA robot arm and added a tactile sensor called GelSight, created by Ted Adelson's group at CSAIL. The data collected by GelSight was then fed to an AI so that it could learn the relationship between visual and tactile information.

To teach the AI to identify objects by touch, the team recorded 12,000 videos of 200 objects such as textiles, tools and household items. The videos were divided into still images, and the AI used this dataset to connect the tactile and visual data.
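For readers curious what "connecting tactile and visual data" might look like in practice, here is a minimal, illustrative sketch in PyTorch. It is not the CSAIL team's code: the network, tensor shapes, and training loop are placeholder assumptions standing in for their far more sophisticated cross-modal model. The idea is simply that paired (camera frame, tactile frame) examples, like the stills cut from the 12,000 videos, let a network learn to predict one signal from the other.

```python
# Illustrative sketch only: all layer sizes, names, and data are assumptions,
# not the CSAIL system. It shows the basic shape of cross-modal prediction:
# given a visual frame of an object, predict the GelSight-style tactile frame.
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    """Maps an RGB camera frame to a predicted tactile image."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(16, 3, kernel_size=4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),  # tactile frame normalized to [0, 1]
        )

    def forward(self, frame):
        return self.decoder(self.encoder(frame))

# Stand-in batch: 8 paired (visual, tactile) samples as random 64x64 tensors.
# In the real project these pairs come from stills extracted from the videos.
visual = torch.rand(8, 3, 64, 64)
tactile = torch.rand(8, 3, 64, 64)

model = VisionToTouch()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(5):
    predicted = model(visual)
    loss = nn.functional.mse_loss(predicted, tactile)  # per-pixel error
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.4f}")
```

Swapping the direction of the pairing (tactile frame in, visual frame out) gives the complementary task the researchers describe: imagining what an object looks like from touch alone.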

“By looking at the scene, our model can imagine the sensation of touching a flat surface or a sharp edge,” says Yunzhu Li, a doctoral student at CSAIL and lead author of a new paper on the system. “By walking around blindly, our model can predict interaction with the environment based on tactile sensations alone. Bringing these two senses together could strengthen the robot and reduce the data we might need for tasks involving manipulating and grasping objects.”

The robot

For now, the robot can only identify objects in a controlled environment. The next step is to build a larger dataset so that the robot can work in a wider range of settings.


“Methods like this have the potential to be very useful for robotics, where questions like ‘is this object hard or soft?’ or ‘if I lift this cup by its handle, what will my grip be?’ arise,” says Andrew Owens, a postdoctoral researcher at the University of California, Berkeley. “This is a very difficult problem, because the signals are very different, and this model has shown great capability.”
