
Massachusetts Institute of Technology's new robot can identify things with sight and touch



The team took a KUKA robot arm and added a tactile sensor called GelSight, created by Ted Adelson's group at CSAIL. The information gathered by GelSight was then fed to an AI so it could learn the relationship between visual and tactile information.
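The article does not describe the model itself, but a minimal sketch of one way to learn such a visual-tactile link is to train an encoder-decoder that predicts a GelSight-style tactile image from a camera image. The network layout, 64x64 resolution, and L1 loss below are illustrative assumptions, not the published architecture.

```python
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    """Toy cross-modal model: camera image in, predicted tactile image out."""
    def __init__(self):
        super().__init__()
        # Encode the 64x64 RGB camera frame into a compact feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
        )
        # Decode the features into a predicted 64x64 tactile image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, visual):
        return self.decoder(self.encoder(visual))

model = VisionToTouch()
visual = torch.rand(8, 3, 64, 64)   # batch of camera frames
tactile = torch.rand(8, 3, 64, 64)  # matching GelSight frames
loss = nn.functional.l1_loss(model(visual), tactile)
loss.backward()  # one training step in the vision-to-touch direction
```

The reverse direction (predicting appearance from touch) could be sketched the same way with the inputs and outputs swapped.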

To teach the AI how to identify objects by touch, the team recorded 12,000 videos of 200 objects, such as fabrics, tools, and household items, being touched. The videos were broken down into still images, and the AI used this dataset to connect tactile and visual data.
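As a rough illustration of that preprocessing step, the sketch below splits a recording into still frames and pairs camera frames with GelSight frames by index. The file names, directory layout, and pairing rule are assumptions for the example, not details from the project.

```python
import os
import cv2  # pip install opencv-python

def extract_frames(video_path, out_dir, every_n=5):
    """Split a video into still images, keeping every n-th frame."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved, idx = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            cv2.imwrite(os.path.join(out_dir, f"frame_{saved:05d}.png"), frame)
            saved += 1
        idx += 1
    cap.release()
    return saved

# Hypothetical recording of one touch, captured by both the camera and GelSight.
n_visual = extract_frames("touch_0001_camera.mp4", "frames/visual/0001")
n_tactile = extract_frames("touch_0001_gelsight.mp4", "frames/tactile/0001")

# Frames with the same index form a (visual, tactile) training pair.
pairs = [(f"frames/visual/0001/frame_{i:05d}.png",
          f"frames/tactile/0001/frame_{i:05d}.png")
         for i in range(min(n_visual, n_tactile))]
```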

"Looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge," says Yunzhu Li, a CSAIL PhD student and lead author of a new book on the system. "By blindly touching, our model can predict the interaction with the environment only by tactile feelings. The alignment of these two senses can strengthen the robot and reduce the data that may be needed for tasks related to manipulation and capture objects. "

For now, the robot can only identify objects in a controlled environment. The next step is to build a larger dataset so the robot can work in more diverse settings.

"Methods like this have the potential to be very useful for robotics, where you have to answer questions such as" whether this object is heavy or soft? "Or" if I picked up this cup with my handle, how good would my grip be? "Says Andrew Owens, a PhD student at the University of California at Berkeley. "This is a very difficult issue because the signals are so different and this model demonstrates great capability."
