The content describes a teleoperation system that integrates visuo-tactile sensing and haptic feedback to enhance the user's ability to manipulate objects remotely.
The key highlights are:
The system uses a GelSight Mini sensor mounted on the robot's end-effector to provide high-resolution tactile information about the points of contact with objects.
Two methods are proposed for estimating the forces acting on the GelSight sensor: one based on optical-flow analysis and the other using a deep learning approach.
The force information is then converted into vibrotactile feedback and transmitted to the user through MANUS haptic gloves, providing them with a sense of the forces being applied during manipulation.
The system is integrated into a virtual reality teleoperation pipeline, where the user controls a dual-arm Tiago robot and receives both visual and haptic feedback.
A preliminary user study shows that the addition of haptic feedback can reduce the deformation of a plasticine ball by 48% compared to using only visual feedback, indicating improved dexterity and precision in manipulation tasks.
Future work includes utilizing the rich data from the visuo-tactile sensors to provide additional haptic feedback, such as shear forces, slip, and texture, further reducing the reliance on visual cues.
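The summary does not detail how the optical-flow force estimate works. A minimal sketch of the general idea, assuming phase correlation between two successive gel images and a hypothetical calibration constant `k_shear` (neither the method details nor the constant come from the paper):

```python
import numpy as np

def estimate_shift(ref, cur):
    """Estimate the integer-pixel translation from ref to cur via phase correlation."""
    R = np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))
    R /= np.abs(R) + 1e-12  # normalize to keep only phase information
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    # unwrap circular shifts into signed displacements
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def shear_force_proxy(ref, cur, k_shear=0.05):
    """Map tangential gel displacement (pixels) to a shear-force estimate.

    k_shear is a hypothetical calibration constant (N per pixel of gel
    displacement), NOT a value from the paper; a real system would fit it
    against a force/torque sensor.
    """
    dy, dx = estimate_shift(ref, cur)
    return k_shear * float(np.hypot(dy, dx))

# Demo on a synthetic "gel image" shifted by a known amount:
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
cur = np.roll(ref, shift=(3, 5), axis=(0, 1))
print(estimate_shift(ref, cur))  # (3, 5)
```

Phase correlation recovers only a single rigid translation of the gel surface; the paper's estimators likely use richer per-marker flow fields, so this is just the simplest force-from-displacement proxy.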
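The conversion from estimated force to vibrotactile feedback is also not specified here; one common scheme is a dead-banded linear mapping to a normalized vibration amplitude. The thresholds `f_min` and `f_max` below are illustrative assumptions, not values from the paper:

```python
def force_to_vibration(force_n, f_min=0.2, f_max=5.0):
    """Map an estimated contact force (newtons) to a vibration amplitude in [0, 1].

    f_min is a hypothetical dead-band threshold (suppresses sensor noise at
    light contact) and f_max a hypothetical saturation force; both would be
    tuned for the actual glove hardware.
    """
    if force_n <= f_min:
        return 0.0  # below the dead band: no vibration
    return min(1.0, (force_n - f_min) / (f_max - f_min))

print(force_to_vibration(0.1))  # 0.0
print(force_to_vibration(2.6))  # 0.5
print(force_to_vibration(9.9))  # 1.0
```

A linear ramp like this keeps the mapping monotonic and easy to tune, which matters when the goal (as in the user study) is helping operators modulate grip force on deformable objects.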
Key insights distilled from:
by Noah Becker, ... on arxiv.org, 05-01-2024
https://arxiv.org/pdf/2404.19585.pdf