Researchers Build AI System to Connect Vision, Touch

A team of researchers at the Massachusetts Institute of Technology (MIT) has developed a predictive artificial intelligence (AI) system that can learn to see by touching and to feel by seeing.

While our sense of touch lets us feel the physical world, our eyes help us understand the full picture of these tactile signals.

Robots that have been programmed to see or feel, however, can't use these signals quite as interchangeably.

The new AI-based system can create realistic tactile signals from visual inputs, and predict which object, and what part of it, is being touched directly from those tactile inputs.
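
The article doesn't detail the model itself, but this kind of cross-modal prediction can be illustrated with a minimal encoder-decoder sketch in PyTorch. Everything below (the layer sizes, the 256x256 resolution, the `VisionToTouchNet` name) is an illustrative assumption, not the MIT team's actual architecture:

```python
import torch
import torch.nn as nn

class VisionToTouchNet(nn.Module):
    """Hypothetical encoder-decoder that maps an RGB camera frame
    to a predicted GelSight-style tactile image (both 3x256x256)."""
    def __init__(self):
        super().__init__()
        # Encoder: downsample the visual frame into a compact feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample the features back into a tactile image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, rgb_frame):
        return self.decoder(self.encoder(rgb_frame))

# One training step on a dummy batch of (visual, tactile) frame pairs.
model = VisionToTouchNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

visual = torch.rand(8, 3, 256, 256)   # batch of camera frames
tactile = torch.rand(8, 3, 256, 256)  # matching GelSight frames
optimizer.zero_grad()
loss = loss_fn(model(visual), tactile)
loss.backward()
optimizer.step()
```

The reverse direction (touch to vision) would be a second network of the same shape with its inputs and targets swapped.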

In the future, this could help create a more harmonious relationship between vision and robotics, especially for object recognition, grasping, and better scene understanding, and could enable seamless human-robot integration in assistive or manufacturing settings.

"By looking at the scene, our model can imagine the feeling of affecting a flat surface or a sharp edge", same Yunzhu Li, PhD student and lead author from MIT's Computing and Artificial Intelligence Science laborator (CSAIL).

"By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings," 51 added.

The team used a KUKA robot arm with a special tactile sensor called GelSight, designed by another group at MIT.

Using a simple web camera, the team recorded nearly 200 objects, such as tools, household products, fabrics, and more, being touched more than 12,000 times.

Breaking those 12,000 video clips down into static frames, the team compiled "VisGel," a dataset of more than three million visual/tactile-paired images.
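
As a rough illustration, synchronized camera and GelSight recordings could be broken into paired still frames with a few lines of OpenCV. The function and file layout below are assumptions for the sketch, not the actual VisGel tooling:

```python
import os
import cv2

def extract_paired_frames(vision_path, touch_path, out_dir, clip_id):
    """Step through two synchronized videos (webcam + GelSight) and
    save each corresponding frame as a visual/tactile image pair."""
    os.makedirs(out_dir, exist_ok=True)
    cam = cv2.VideoCapture(vision_path)
    gel = cv2.VideoCapture(touch_path)
    idx = 0
    while True:
        ok_v, vision_frame = cam.read()
        ok_t, touch_frame = gel.read()
        if not (ok_v and ok_t):  # stop at the end of the shorter video
            break
        cv2.imwrite(os.path.join(out_dir, f"{clip_id}_{idx:05d}_vision.png"), vision_frame)
        cv2.imwrite(os.path.join(out_dir, f"{clip_id}_{idx:05d}_touch.png"), touch_frame)
        idx += 1
    cam.release()
    gel.release()
    return idx  # number of pairs written
```

Run over all 12,000 clips, a pipeline like this would yield the static visual/tactile pairs the article describes, averaging a few hundred frames per clip.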

"Delivery these two senses (vision and touch) together could empower the robot and cut back the data we might call for for tasks involving manipulating and grasping objects," said Li.

The current dataset only has examples of interactions in a controlled environment.

The team hopes to improve on this by collecting data in more unstructured areas, or by using a new MIT-designed tactile glove, to increase the size and variety of the dataset.

"This is the first method that can convincingly translate between visual and touch signals", same Saint Andrew the Apostle Owens, a station-Department of Commerce at the University of California at Berkeley.

The team is set to present the findings next week at the Conference on Computer Vision and Pattern Recognition (CVPR) in Long Beach, California.

Source: https://beebom.com/researchers-build-ai-system-to-connect-vision-touch/
