
Efficient 6-D Trajectory Generation for Omnidirectional Multirotor Aerial Vehicles in Cluttered Environments


Core Concepts
This paper proposes an efficient 3-stage optimization-based framework to generate safe and dynamically feasible 6-D trajectories for omnidirectional multirotor aerial vehicles (OMAVs) in cluttered environments.
Abstract
The paper presents a 3-stage optimization-based framework for generating 6-D trajectories for OMAVs in cluttered environments:
1. Initial Path Search: An initial feasible path from the start point to the end point is obtained using RRT.
2. SFC Generation: A 3-D safe flight corridor (SFC) representing the collision-free regions is generated around the initial path using RILS.
3. 6-D Trajectory Optimization: An efficient optimization-based method generates a smooth, safe, and dynamically feasible 6-D trajectory within the SFC.
Key aspects include:
- Representing the vehicle's attitude as a free 3-D vector using stereographic projection to eliminate the constraints inherent in the SO(3) manifold.
- Formulating trajectory generation as a constrained optimization problem and transforming it into an unconstrained one that can be solved efficiently using quasi-Newton methods.
- Considering whole-body safety constraints by modeling the vehicle's shape as a cuboid and confining the trajectory to the SFC.
Simulations in cluttered environments and real-world experiments on a tilt-rotor hexarotor aerial vehicle demonstrate the effectiveness and efficiency of the proposed framework, allowing OMAVs to navigate safely in complex environments.
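To make the attitude-representation idea concrete, here is a minimal sketch (not the paper's code) of mapping a unit quaternion to a free 3-D vector via stereographic projection, assuming the common choice of projecting from (-1, 0, 0, 0); the optimizer can then treat the 3-D vector as an unconstrained decision variable.

```python
# A minimal sketch (assumed projection point, not the paper's implementation) of
# representing attitude as a free 3-D vector: a unit quaternion q = (w, x, y, z)
# on S^3 is mapped to p in R^3 by stereographic projection and back.
import numpy as np

def quat_to_free_vector(q):
    """Stereographic projection of a unit quaternion (w, x, y, z) from the
    point (-1, 0, 0, 0) onto R^3. Assumes q is normalized."""
    w, v = q[0], q[1:]
    return v / (1.0 + w)          # singular only at w = -1

def free_vector_to_quat(p):
    """Inverse map: recover a unit quaternion from the free 3-D vector."""
    n2 = np.dot(p, p)
    return np.concatenate(([1.0 - n2], 2.0 * p)) / (1.0 + n2)

# Round trip on a random attitude (sign chosen so that w >= 0).
q = np.random.randn(4)
q /= np.linalg.norm(q)
if q[0] < 0:
    q = -q                        # exploit the double cover: q and -q are the same rotation
p = quat_to_free_vector(q)
print(np.allclose(free_vector_to_quat(p), q))   # True
```

Because q and -q describe the same rotation, the quaternion sign can be chosen (here w >= 0) to keep the mapped vector well away from the projection's singular point.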
Stats
The maximum velocity, acceleration, and angular velocity limits are set to v_max = 0.6 m/s, a_max = 2.0 m/s², and ω_max = 0.5 rad/s, respectively.
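As an illustration only (not taken from the paper), these limits could be verified as a post-hoc feasibility check on a sampled trajectory:

```python
# Illustrative feasibility check against the quoted limits:
# v_max = 0.6 m/s, a_max = 2.0 m/s^2, omega_max = 0.5 rad/s.
import numpy as np

V_MAX, A_MAX, OMEGA_MAX = 0.6, 2.0, 0.5

def is_dynamically_feasible(vel, acc, omega):
    """vel, acc, omega: (N, 3) arrays of sampled linear velocity, linear
    acceleration, and body angular velocity along the trajectory."""
    return (np.all(np.linalg.norm(vel, axis=1) <= V_MAX)
            and np.all(np.linalg.norm(acc, axis=1) <= A_MAX)
            and np.all(np.linalg.norm(omega, axis=1) <= OMEGA_MAX))

vel = np.full((10, 3), 0.3); acc = np.zeros((10, 3)); omega = np.zeros((10, 3))
print(is_dynamically_feasible(vel, acc, omega))  # True: |(0.3, 0.3, 0.3)| ≈ 0.52 <= 0.6
```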
Quotes
"As fully-actuated systems, omnidirectional multi-rotor aerial vehicles (OMAVs) have more flexible maneuverability and advantages in aggressive flight in cluttered environments than traditional underactuated MAVs." "In some extreme scenarios, such as a narrow straight passage, traditional MAVs coupling acceleration with attitude will be most likely unable to pass through it without collision, while OMAVs can tilt themselves to adapt to the narrow space by controlling the attitude and simultaneously, control its position to achieve smooth and collision-free passing."

Deeper Inquiries

How can the proposed framework be extended to handle dynamic obstacles in the environment?

To extend the proposed framework to handle dynamic obstacles in the environment, we can incorporate real-time perception and tracking systems. By integrating sensors such as LiDAR, cameras, or radar, the OMAV can detect and track moving obstacles in its surroundings. The trajectory optimization algorithm can then be updated dynamically based on the changing obstacle positions. This would involve continuously updating the obstacle information in the optimization problem and adjusting the generated trajectory to avoid collisions with dynamic obstacles. Additionally, predictive modeling techniques can be employed to anticipate the future positions of dynamic obstacles and plan trajectories accordingly to ensure safe navigation.
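A hedged sketch of one possible ingredient of such an extension is shown below: each tracked obstacle is predicted forward with a simple constant-velocity model, and the current trajectory is invalidated (triggering replanning) if any sample comes within a safety radius of a predicted obstacle position. The predictor, the safety radius, and the interfaces are illustrative assumptions, not part of the paper's framework.

```python
# Sketch (not part of the paper's framework) of folding dynamic obstacles into
# replanning via constant-velocity prediction and a sampled collision check.
import numpy as np

def predict_obstacle(pos, vel, t):
    """Constant-velocity prediction of an obstacle position at time t
    (an assumption; a learned or model-based predictor could be substituted)."""
    return pos + vel * t

def trajectory_invalidated(traj_pos, traj_t, obstacles, safety_radius=0.5):
    """traj_pos: (N, 3) sampled positions, traj_t: (N,) sample times,
    obstacles: list of (position, velocity) pairs for tracked moving obstacles.
    Returns True if the trajectory should be re-optimized."""
    for obs_pos, obs_vel in obstacles:
        predicted = np.array([predict_obstacle(obs_pos, obs_vel, t) for t in traj_t])
        if np.any(np.linalg.norm(traj_pos - predicted, axis=1) < safety_radius):
            return True
    return False
```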

What are the potential limitations of the stereographic projection-based attitude representation, and how can they be addressed?

The stereographic projection-based attitude representation has a known limitation: the projection from the unit-quaternion sphere S³ to a free 3-D vector is singular at the projection point, so attitudes near that single orientation map to vectors of unbounded magnitude and can cause numerical ill-conditioning during optimization (a projection singularity, distinct from the gimbal lock associated with Euler angles). Representation accuracy can also degrade as the attitude approaches this singular orientation. One way to address this is to exploit the double cover of SO(3) by unit quaternions and choose the quaternion sign, or the projection point itself, so that planned attitudes stay away from the singularity. Alternatively, optimizing directly over unit quaternions with a normalization constraint, or adding regularization terms that penalize trajectories approaching the singular region, can improve the robustness of the framework.
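The numerical sketch below illustrates this singularity, again assuming a projection from (-1, 0, 0, 0): as the quaternion scalar part approaches -1 the free vector blows up, while flipping the quaternion sign (the same physical rotation under the double cover) keeps it bounded.

```python
# Numerical sketch of the projection singularity (assumed projection point (-1, 0, 0, 0)).
import numpy as np

def quat_to_free_vector(q):
    w, v = q[0], q[1:]
    return v / (1.0 + w)

half_angle = np.pi - 1e-3            # rotation angle of almost 2*pi about the x-axis
q = np.array([np.cos(half_angle), np.sin(half_angle), 0.0, 0.0])  # w close to -1
print(np.linalg.norm(quat_to_free_vector(q)))    # ~2000: near the singularity
print(np.linalg.norm(quat_to_free_vector(-q)))   # ~5e-4: same attitude, safe chart
```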

How can the framework be integrated with onboard sensing and perception systems to enable fully autonomous navigation of OMAVs in unknown environments?

Integrating the framework with onboard sensing and perception systems is crucial for enabling fully autonomous navigation of OMAVs in unknown environments. By incorporating sensors for environment perception, such as LiDAR for obstacle detection and mapping, cameras for visual navigation, and inertial measurement units (IMUs) for localization, the OMAV can gather real-time data about its surroundings. This information can then be fed into the trajectory generation framework to adapt the planned trajectory based on the perceived environment. Machine learning algorithms can be utilized for object detection, tracking, and scene understanding to enhance the OMAV's ability to navigate autonomously. By fusing data from multiple sensors and leveraging advanced algorithms for perception and decision-making, the OMAV can navigate complex and dynamic environments with autonomy and efficiency.
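As a rough illustration of this perception-to-planning hand-off (not the paper's implementation), a simple voxel occupancy map built from range-sensor point clouds could flag when the current trajectory is blocked and the 3-stage planner should be re-run on the updated map:

```python
# Sketch of a minimal voxel map linking perception to replanning; the mapping,
# state estimation, and planner interfaces are assumptions for illustration.
import numpy as np

class VoxelMap:
    def __init__(self, resolution=0.1):
        self.resolution = resolution
        self.occupied = set()                     # set of integer voxel indices

    def insert_point_cloud(self, points_world):
        """points_world: (N, 3) obstacle points already transformed into the world
        frame using the onboard state estimate (e.g., LiDAR + IMU odometry)."""
        for idx in np.floor(np.asarray(points_world) / self.resolution).astype(int):
            self.occupied.add(tuple(idx))

    def trajectory_blocked(self, traj_pos):
        """Return True if any sampled trajectory position falls in an occupied voxel,
        signalling that the planner should be re-run on the updated map."""
        for idx in np.floor(np.asarray(traj_pos) / self.resolution).astype(int):
            if tuple(idx) in self.occupied:
                return True
        return False
```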