AI News
- On Wednesday, June 6, 2018
Successful field experiments with autonomous drones
'An autonomous drone can perceive the surrounding environment in great detail and acquire much more reliable results from analyses and scans,' says George Nikolakopoulos.
'An autonomous aerial robot is programmed to analyze its surroundings and perform different tasks either towards inspection or interaction with the environment.
Advances in computational power allow more complicated control algorithms to run on board the drones, so they can perceive and process the surrounding environment much faster; this is one reason why this research leap is happening right now.
'The main reason why we succeed is the use of the localization system that our group has developed, based on fusion of Ultra Wide Band nodes and other onboard sensors.'
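The article does not detail how the group fuses the Ultra Wide Band nodes with the other onboard sensors. As a rough illustration only, the sketch below multilaterates a position from ranges to UWB anchors at known locations, then blends it with a dead-reckoned estimate; the function names and the blend weight `alpha` are assumptions, not the group's actual pipeline.

```python
import numpy as np

def uwb_multilaterate(anchors, ranges):
    """Least-squares position from ranges to known UWB anchor nodes.

    Linearizes |x - a_i|^2 = r_i^2 by subtracting the first anchor's
    equation, which yields a linear system A x = b.
    """
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

def fuse(uwb_pos, dead_reckoned, alpha=0.8):
    """Complementary-filter blend: trust the UWB fix with weight alpha."""
    return alpha * np.asarray(uwb_pos) + (1.0 - alpha) * np.asarray(dead_reckoned)
```

With four non-coplanar anchors and noise-free ranges the multilateration is exact; with noisy ranges the least-squares solution spreads the error across anchors, and the blend damps UWB jitter with the smoother inertial estimate.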
- On Wednesday, June 20, 2018
Navigation and Self-Semantic Location of Drones in Indoor Environments by Combining the Visual Bug Algorithm and Entropy-Based Vision
The UAV starts from a given point (unknown state) in the experimental indoor environment, at a given distance from a specified landmark in the topological map.
Through the entropy-based controller, a process of image entropy maximization (Search Mode) is executed, aimed at converging to a high-entropy state that hopefully contains landmarks from the visual topological map.
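A minimal sketch of the quantity that Search Mode maximizes, assuming 8-bit grayscale frames and Shannon entropy over the intensity histogram (the paper's exact formulation may differ):

```python
import numpy as np

def image_entropy(gray):
    """Shannon entropy (bits) of an 8-bit grayscale image's intensity histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
    return float(-np.sum(p * np.log2(p)))
```

In Search Mode the controller would rotate the vehicle, evaluate this measure on each incoming frame, and keep the heading along which it increases: a flat wall scores near zero, while a cluttered, landmark-rich view scores high.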
The signals ufb are calculated by the feedback controller (reactive behavior) from the images captured by the UAV's onboard camera, while the signals uff are provided by the feedforward controller (anticipatory behavior).
Finally, the UAV reaches the target point, the L1 landmark goal, at iteration k = 19 and executes the maneuver specified for this arc: go forward to the goal, indicating the exit door for an emergency situation.
The entropy-based controller generates the values of the control signals through image entropy maximization (entropic vision), performing a maneuver that guides the robot toward the higher-entropy state at each iteration.
When the left zone of the image has the higher entropy (HL), the robot turns left (yaw = –1.0); when the right zone has the higher entropy (HR), it turns right (yaw = 0.1).
Experimentally, the following weights were established for the calculation of the combined signal ut: wfb = 0.7 for the feedback controller and wff = 0.3 for the feedforward controller (Maravall et al., 2015b).
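Putting the zone-entropy rule and the weighted combination together, here is a hedged sketch; only the weights wfb = 0.7 and wff = 0.3 come from the text, while the yaw magnitudes, the half-image split, and the helper names are illustrative assumptions.

```python
import numpy as np

def entropy_bits(gray):
    """Shannon entropy (bits) of an 8-bit grayscale image region."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def yaw_command(gray, yaw_left=-1.0, yaw_right=1.0):
    """Turn toward the image half with the higher entropy (HL vs HR)."""
    w = gray.shape[1]
    hl = entropy_bits(gray[:, : w // 2])   # HL: left-zone entropy
    hr = entropy_bits(gray[:, w // 2 :])   # HR: right-zone entropy
    return yaw_left if hl > hr else yaw_right

def combined_signal(u_fb, u_ff, w_fb=0.7, w_ff=0.3):
    """ut = wfb * ufb + wff * uff: blend of reactive and anticipatory commands."""
    return w_fb * np.asarray(u_fb) + w_ff * np.asarray(u_ff)
```

The weighting favors the reactive feedback term, so the camera-driven behavior dominates while the feedforward term only biases the command toward the anticipated path.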
The experimental results obtained in our laboratory show that the UAV can perform the fundamental skills of the visual bug algorithm in real time, guiding the robot toward a goal landmark (in this case, the exit door) using self-semantic location at each landmark defined in the visual topological map.
- On Wednesday, September 18, 2019
Meet the dazzling flying machines of the future | Raffaello D'Andrea
When you hear the word "drone," you probably think of something either very useful or very scary. But could they have aesthetic value? Autonomous systems ...
Collaborative Navigation for Flying and Walking Robots
The results of this video have been published in: P. Fankhauser, M. Bloesch, P. Krüsi, R. Diethelm, M. Wermelinger, …
Active Monocular Localization: Towards Autonomous Monocular Exploration for Multirotor MAVs
Christian Mostegel, Andreas Wendel, Horst Bischof, "Active Monocular Localization: Towards Autonomous Monocular Exploration for Multirotor MAVs", ICRA ...
Incremental Segment-Based Localization in 3D Point Clouds
Accompanying video for our RA-L 2018 publication titled "Incremental Segment-Based Localization in 3D Point Clouds": ...
Rapyuta: A Cloud Robotics Framework
Computational power is a key enabler for intelligent and efficient robot task performance. However, on-board computation entails additional power requirements, ...
Full Quadrotor Autonomous Mission Simulation (Gazebo+ SITL)
In this video a fully autonomous simulation is provided. The Software-In-The-Loop is used. The gazebo ground truth data are used to provide the full quadrotor ...
Droyd - particle filters localization
Project DROYD (python@droid) Chassis: TAMIYA 4WD, mbed LPC1768. Odometry: None. Sensors: SRF08 ultrasonic (5x). Motion model: Probabilistic (Magic ...
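The listing mentions five ultrasonic sensors and no odometry. As a much-simplified, one-dimensional illustration of the particle-filter localization such a setup performs, the sketch below runs one predict/weight/resample cycle per step against a single range-to-wall measurement; the world size, noise levels, and motion model are all assumptions, not the project's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

def pf_step(particles, control, z, world=100.0, move_sd=1.0, sense_sd=3.0):
    """One particle-filter cycle against an ultrasonic range z to the wall
    at x = world: predict with noisy motion, weight, resample."""
    # Predict: every particle moves by the commanded step plus motion noise
    particles = particles + control + rng.normal(0.0, move_sd, particles.size)
    # Update: Gaussian likelihood of the measured range given each particle
    expected = world - particles
    w = np.exp(-0.5 * ((expected - z) / sense_sd) ** 2)
    w /= w.sum()
    # Resample: draw particles in proportion to their weights
    idx = rng.choice(particles.size, size=particles.size, p=w)
    return particles[idx]

# Usage: robot starts at x = 20, moves +5 per step, senses the wall at x = 100
particles = rng.uniform(0.0, 100.0, 2000)
true_x = 20.0
for _ in range(10):
    true_x += 5.0
    z = (100.0 - true_x) + rng.normal(0.0, 3.0)
    particles = pf_step(particles, 5.0, z)
# particles.mean() should now track true_x closely
```

Without odometry the "control" input here is just the commanded step; the motion-noise term is what lets the filter absorb wheel slip, which is exactly the situation the chassis above is in.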
Probabilistic Active Perception Planning For Autonomous Robots In Everyday Environments
Autonomous object localization on agricultural areas using air/ground cooperation
The video shows the autonomous localization of objects on agricultural areas using our autonomous team of cooperating ground and air robots. The aerial robot ...
Moral Math of Robots: Can Life and Death Decisions Be Coded?
A self-driving car has a split second to decide whether to turn into oncoming traffic or hit a child who has lost control of her bicycle. An autonomous drone needs ...