
UKF-Based Sensor Fusion for Joint-Torque Sensorless Humanoid Robots Analysis


Key Concepts
The authors propose a novel sensor fusion approach based on Unscented Kalman Filtering to estimate the joint torques of humanoid robots without torque sensors, aiming to enhance control architectures for human-robot interaction.
Summary

The content discusses a novel sensor fusion method using Unscented Kalman Filtering to estimate joint torques in humanoid robots without torque sensors. The proposed approach integrates various measurements and non-directly measurable effects to improve control architecture. Extensive testing on the ergoCub robot demonstrates the effectiveness of the method, showcasing low root mean square errors in torque tracking even in the presence of external contacts. The paper compares the proposed strategy with the existing state-of-the-art approach based on the recursive Newton-Euler algorithm, highlighting improvements in estimation accuracy. The study also presents experiments validating the joint torque estimation method and controller architecture used on the ergoCub robot.
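As a rough, self-contained illustration of the filtering machinery involved (not the paper's actual multi-joint, multi-sensor filter), the sketch below runs a scalar unscented Kalman filter that treats a single joint torque as a random-walk state observed through a noisy current-based measurement. The process model, measurement model, noise levels, and the 1.5 Nm "true" torque are all assumptions made up for this example.

```python
import numpy as np

def sigma_points(mean, var, kappa=2.0):
    """Generate the 2n+1 sigma points and weights for a scalar state (n = 1)."""
    n = 1
    spread = np.sqrt((n + kappa) * var)
    pts = np.array([mean, mean + spread, mean - spread])
    w0 = kappa / (n + kappa)
    wi = 1.0 / (2.0 * (n + kappa))
    return pts, np.array([w0, wi, wi])

def ukf_step(mean, var, z, q=0.01, r=0.04):
    """One predict/update cycle; q = process noise, r = measurement noise."""
    # Predict: random-walk process model f(x) = x, so only the variance grows.
    pts, w = sigma_points(mean, var)
    pred_mean = np.dot(w, pts)
    pred_var = np.dot(w, (pts - pred_mean) ** 2) + q
    # Update: measurement model h(x) = x (a current-derived torque reading).
    pts, w = sigma_points(pred_mean, pred_var)
    z_mean = np.dot(w, pts)
    s = np.dot(w, (pts - z_mean) ** 2) + r                  # innovation variance
    c = np.dot(w, (pts - pred_mean) * (pts - z_mean))       # cross covariance
    k = c / s                                               # Kalman gain
    return pred_mean + k * (z - z_mean), pred_var - k * s * k

rng = np.random.default_rng(0)
true_torque = 1.5  # Nm, held constant in this toy example
mean, var = 0.0, 1.0
for _ in range(200):
    z = true_torque + rng.normal(0.0, 0.2)  # noisy current-based reading
    mean, var = ukf_step(mean, var, z)
print(mean, var)
```

After 200 steps the estimate settles near the assumed 1.5 Nm torque, with the residual variance set by the chosen noise levels.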


Statistics
Results demonstrate low root mean square errors in torque tracking, ranging from 0.05 Nm to 2.5 Nm. Friction parameters for leg joints are provided: hip pitch (4.9 Nm), hip roll (4.0 Nm), hip yaw (2.5 Nm), knee (2.3 Nm), ankle pitch (2.3 Nm), ankle roll (1.3 Nm).
Quotes
"Our approach uses motor current measurements to accurately estimate the joint torque with small error."
"The validation results show that the novel method ensures effective tracking of desired torques and high-level tasks."

Key insights drawn from

by Ines Sorrent... at arxiv.org, 02-29-2024

https://arxiv.org/pdf/2402.18380.pdf
UKF-Based Sensor Fusion for Joint-Torque Sensorless Humanoid Robots

Deeper questions

How can incorporating tactile sensors enhance external contact point estimation?

Incorporating tactile sensors can enhance external contact point estimation by providing additional information about the forces and pressures exerted on the robot's surface during interactions. Tactile sensors can detect subtle changes in pressure distribution, allowing for a more accurate determination of the location and intensity of external contacts. By integrating tactile sensors into the sensor fusion framework, it becomes possible to improve the estimation of contact points, especially when dealing with complex surfaces or varying levels of force applied to different parts of the robot's body.

What are potential challenges when integrating more advanced friction models into the identification process?

Integrating more advanced friction models into the identification process may pose several challenges. One challenge is model complexity: advanced friction models often involve intricate mathematical formulations that require precise parameter tuning and calibration. Ensuring that these models accurately capture friction behavior without introducing unnecessary computational overhead or inaccuracies is crucial.

Another challenge lies in data availability and quality. Advanced friction models may rely on extensive datasets for training and validation, which can be difficult to obtain in real-world scenarios. In addition, noise in sensor measurements or uncertainties in environmental conditions can degrade the accuracy of these models, leading to suboptimal performance during identification.

Furthermore, implementing sophisticated friction models requires a deep understanding of mechanical dynamics and control theory. Integrating them effectively into existing control architectures without destabilizing or complicating system behavior is a significant technical challenge that needs careful consideration.
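To make the discussion of friction models concrete, here is a hedged sketch of one commonly used static friction model combining Coulomb, viscous, and Stribeck terms. The function name and all parameter values are illustrative placeholders, not the identified parameters reported for ergoCub.

```python
import numpy as np

def friction_torque(velocity, tau_c=2.3, b=0.5, tau_s=3.0, v_s=0.1):
    """Static friction model sketch (all parameters are assumed, not identified).

    velocity : joint velocity [rad/s]
    tau_c    : Coulomb (kinetic) friction level [Nm]
    b        : viscous friction coefficient [Nm*s/rad]
    tau_s    : static (breakaway) friction [Nm]
    v_s      : Stribeck velocity [rad/s]
    """
    # Stribeck effect: friction decays from tau_s toward tau_c as speed grows.
    stribeck = (tau_s - tau_c) * np.exp(-((velocity / v_s) ** 2))
    return np.sign(velocity) * (tau_c + stribeck) + b * velocity
```

Identifying the four parameters from data is where the challenges above bite: the Stribeck term is only observable near zero velocity, exactly where sensor noise dominates.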

How does this research impact advancements in whole-body tasks like balancing or walking for humanoid robots?

This research significantly impacts advancements in whole-body tasks like balancing or walking for humanoid robots by addressing critical issues in joint torque estimation without relying on traditional joint-torque sensors. By developing a novel sensor fusion approach based on Unscented Kalman Filtering (UKF), it enables accurate online estimation of joint torques even where direct measurement is not feasible due to space constraints or mechanical complexity.

The integration of distributed sensors such as accelerometers, gyroscopes, and motor current sensors, along with force-torque (FT) sensors, allows for comprehensive data fusion that exploits multi-modal measurements, which is essential for the robust torque control strategies required by dynamic tasks like balancing and walking. The proposed UKF-based sensor fusion algorithm improves robustness against external disturbances while improving torque tracking accuracy across the robot's joints.

Overall, this research paves the way for enhanced performance in whole-body tasks by providing the reliable joint torque estimates needed to maintain stability during complex movements, such as balancing on uneven terrain or executing coordinated walking patterns.
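The quoted idea of inferring joint torque from motor current can be sketched as a simple algebraic relation: motor-side torque scaled through the gearbox, minus a friction term. The torque constant, gear ratio, and friction parameters below are assumed placeholders for illustration, not ergoCub values, and this naive formula omits the uncertainty handling that motivates the UKF in the first place.

```python
import math

def estimate_joint_torque(motor_current, joint_velocity,
                          k_tau=0.06,        # torque constant [Nm/A] (assumed)
                          gear_ratio=100.0,  # transmission ratio (assumed)
                          coulomb=2.3,       # Coulomb friction [Nm] (assumed)
                          viscous=0.5):      # viscous friction [Nm*s/rad] (assumed)
    """Naive current-based joint torque estimate (illustrative sketch only)."""
    motor_side = k_tau * gear_ratio * motor_current
    if joint_velocity == 0.0:
        friction = 0.0  # this simple model applies kinetic friction only in motion
    else:
        friction = (coulomb * math.copysign(1.0, joint_velocity)
                    + viscous * joint_velocity)
    return motor_side - friction
```

In practice the friction term is velocity-dependent and noisy, which is why fusing the current reading with IMU and FT measurements inside a filter outperforms this open-loop computation.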