research:robotvision:replanning [2019/06/04 14:17] usenko
<html><center><iframe width="720" height="360" src="https://www.youtube.com/embed/jh6tMHjxHSY" frameborder="0" allowfullscreen></iframe></center></html>

Contact: [[members:usenko|Vladyslav Usenko]].
<html><br><br><h1 class="sectionedit1">Abstract</h1></html>
In this work, we present a real-time approach to local trajectory replanning for micro aerial vehicles (MAVs). Current trajectory generation methods for multicopters achieve high success rates in cluttered environments, but they assume that the environment is static and require prior knowledge of the map. We use the results of such planners and extend them with a local replanning algorithm that can handle unmodeled (possibly dynamic) obstacles while keeping the MAV close to the global trajectory. To make the approach real-time capable, we maintain information about the environment around the MAV in an occupancy grid stored in a three-dimensional circular buffer that moves together with the drone, and we represent the trajectories as uniform B-splines. This representation ensures that the trajectory is sufficiently smooth and at the same time allows for efficient optimization.
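The robot-centred map described in the abstract can be sketched as follows. This is a minimal, hypothetical Python illustration of the 3D circular-buffer idea only: the class and method names are invented here, and the actual ewok implementation is in C++ with a different API. The point of the structure is that "moving" the mapped volume with the drone is an O(1) offset update, with stale cells simply reused.

```python
import math

class CircularGrid3D:
    """Toy occupancy grid stored in a fixed-size 3D circular buffer.

    The buffer "moves" with the robot by shifting an integer offset,
    so no memory is copied or reallocated as the MAV flies on.
    """

    def __init__(self, size=64, resolution=0.1):
        self.size = size                 # cells per axis
        self.res = resolution            # metres per cell
        self.cells = [0] * (size ** 3)   # flat occupancy array
        self.offset = [0, 0, 0]          # world cell index of the window origin

    def _index(self, point):
        # World point -> wrapped buffer index. The modulo implements the
        # circular behaviour: cells leaving the window are simply reused.
        i = [(math.floor(c / self.res) - o) % self.size
             for c, o in zip(point, self.offset)]
        return (i[0] * self.size + i[1]) * self.size + i[2]

    def mark_occupied(self, point):
        self.cells[self._index(point)] = 1

    def is_occupied(self, point):
        return self.cells[self._index(point)] == 1

    def move_volume(self, delta_cells):
        # O(1) window shift; a full implementation would also clear the
        # cells that newly entered the window (omitted for brevity).
        self.offset = [o + d for o, d in zip(self.offset, delta_cells)]
```

Negative world coordinates wrap around via the modulo, so the window can sit anywhere in space without special cases.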
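The trajectory representation mentioned above, uniform cubic B-splines, can be illustrated with the standard segment-evaluation formula. This is a sketch in Python rather than the paper's C++, and the function name is ours, not from the ewok codebase:

```python
def cubic_bspline_point(p0, p1, p2, p3, u):
    """Evaluate one segment of a uniform cubic B-spline at u in [0, 1].

    p0..p3 are four consecutive control points (one coordinate axis at a
    time). The resulting curve is C2-continuous, so velocity and
    acceleration are smooth, and it stays inside the convex hull of its
    control points, which simplifies collision checking.
    """
    b0 = (1 - u) ** 3
    b1 = 3 * u ** 3 - 6 * u ** 2 + 4
    b2 = -3 * u ** 3 + 3 * u ** 2 + 3 * u + 1
    b3 = u ** 3
    return (b0 * p0 + b1 * p1 + b2 * p2 + b3 * p3) / 6.0
```

With equally spaced collinear control points, e.g. 0, 1, 2, 3, the segment is a straight line traversed at constant speed, which is a quick sanity check of the basis weights (they always sum to 6 before the division). Because the spline only depends locally on four control points per segment, the replanner can optimize a few nearby control points without touching the rest of the trajectory.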
<html><br><br><h1 class="sectionedit1">Open-Source Code</h1></html>
The full source code is available on GitHub under the LGPLv3 license: [[https://github.com/vsu91/ewok|https://github.com/vsu91/ewok]].
Some examples use the forest_gen dataset, available at [[https://github.com/ethz-asl/forest_gen|https://github.com/ethz-asl/forest_gen]].