Video Friday: RoboCup Finals, Crowdsourced Robotics, and Growing Drones in Vats
- On Sunday, February 18, 2018
Reporting in Science today, researchers from the Georgia Institute of Technology, Carnegie Mellon University, Clemson University, and the National Institute for Mathematical and Biological Synthesis described the results of a groundbreaking study of how the earliest land animals might have moved, using amphibious fish, a custom-built robot, and mathematical models of movement.
The researchers developed a simplified, mudskipper-like robot, which they call “MuddyBot,” on which they could systematically vary the angle and movements of its flipper-like limbs and tail.
“Insight from these experiments led us to hypothesize that propulsive use of the tail, an appendage that has received relatively little attention in previous studies of the invasion of land, may have been the critical adaptation that allowed these early walkers to gain ground on challenging substrates,” says Benjamin McInroe, a co-author on the paper and then a Georgia Tech undergraduate (now a Ph.D. student).
At a low level, we use motion capture to measure the position of the robot and the canvas, and a robust control algorithm to command the robot to fly to different stipple positions to make contact with the canvas using an ink-soaked sponge.
We describe a collection of important details and challenges that must be addressed for successful control in our implementation, including robot model estimation, Kalman filtering for state estimation, latency between motion capture and control, radio communication interference, and control parameter tuning.
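As a concrete illustration of the state-estimation piece, here is a minimal constant-velocity Kalman filter of the kind one might run on noisy motion-capture position measurements. The sample rate, matrices, and noise covariances are assumptions for illustration, not values from the paper.

```python
import numpy as np

dt = 0.01                                  # 100 Hz motion-capture rate (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
H = np.array([[1.0, 0.0]])                 # we only measure position
Q = np.diag([1e-6, 1e-4])                  # process noise covariance (assumed)
R = np.array([[1e-4]])                     # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle: state x = (position, velocity), covariance P."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with position measurement z
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Filter a noisy ramp: the true position moves at 1 m/s.
rng = np.random.default_rng(0)
x, P = np.zeros(2), np.eye(2)
for k in range(200):
    z = np.array([k * dt * 1.0 + rng.normal(0, 0.01)])
    x, P = kalman_step(x, P, z)
# x[1] now holds a smoothed velocity estimate close to 1.0 m/s
```

In a real pipeline the predicted state, rather than the raw measurement, would feed the controller, which also helps mask the motion-capture-to-control latency the authors mention.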
We use a centroidal Voronoi diagram to generate stipple drawings, and compute a greedy approximation of the traveling salesman problem to draw as many stipples per flight as possible, while accounting for desired stipple size and dynamically adjusting future stipples based on past errors.
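The stipple-ordering step can be sketched with a nearest-neighbour greedy tour, one standard greedy approximation to the travelling salesman problem; the stipple coordinates below are invented for illustration, and the real system additionally weighs stipple size and past placement errors.

```python
import math

def greedy_tour(points):
    """Order points by repeatedly flying to the nearest unvisited stipple."""
    unvisited = list(range(1, len(points)))
    tour = [0]                             # start at the first stipple
    while unvisited:
        cx, cy = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.hypot(points[i][0] - cx,
                                                      points[i][1] - cy))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

stipples = [(0, 0), (5, 5), (1, 0), (1, 1), (4, 5)]
order = greedy_tour(stipples)              # [0, 2, 3, 4, 1]
```

Greedy ordering is quadratic in the number of stipples but good enough here, since each flight only has battery for a limited number of stipples anyway.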
I am very, very sad that Harvard’s amorphous construction project seems to be over, but this video, which was just published for some reason, shows “the final autonomous robot design” doing its foamy thing:
In Version 2, the camera looks ahead at the line in front, determines the angle of those lines with respect to the robot’s centre, and then moves faster if the line looks straight and slows down if the line turns.
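A minimal sketch of that speed rule, assuming a proportional slowdown with made-up speed limits and a 45-degree cutoff (the actual controller details are not given):

```python
V_MAX, V_MIN = 1.0, 0.2   # m/s, illustrative speed limits (assumed)

def speed_for_angle(angle_deg):
    """Map the detected line angle (degrees from straight ahead) to a speed."""
    bend = min(abs(angle_deg), 45.0) / 45.0   # 0 = straight, 1 = sharp turn
    return V_MAX - bend * (V_MAX - V_MIN)

speed_for_angle(0)    # straight line -> full speed, 1.0 m/s
speed_for_angle(45)   # sharp turn -> minimum speed, about 0.2 m/s
```

Slowing into corners like this gives the vision loop more frames per metre of travel exactly where tracking the line is hardest.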
We taught our robot a 5 km network of interconnected paths and then carried out 120 km of autonomous repeats on these paths using only stereo vision for feedback.
Our technique is called Visual Teach and Repeat 2, and is a significant advance over our earlier work in that (i) it uses a Multi-Experience Localization (MEL) technique to match live images to several previous experiences of a path, and (ii) it is able to do place-dependent terrain assessment to safeguard the robot and people around it, even in rough terrain with vegetation.
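The MEL idea of matching live images against several stored experiences can be caricatured as localizing against whichever prior traverse the current view matches best; the experience names and feature-match counts below are invented, and the real system matches visual features from stereo imagery rather than precomputed counts.

```python
def best_experience(match_counts):
    """Pick the stored experience with the most live-image feature matches."""
    return max(match_counts, key=match_counts.get)

# Hypothetical match counts of the live image against three prior traverses
matches = {"sunny": 120, "dusk": 45, "rainy": 210}
chosen = best_experience(matches)   # 'rainy' matches best right now
```

Keeping several experiences per place is what lets the system keep localizing as lighting and vegetation change between traverses.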
Stippling with aerial robots
This is the supplementary video for our Computational Aesthetics publication presented at Expressive 2016. Stippling with aerial robots B. Galea, E. Kia, N. Aird, P. G. Kry Computer Science,...
His Hand Doesn't Even Move
A compilation of Professor Walter Lewin and some of his lectures at the Massachusetts Institute of Technology (MIT). He draws some of the best lines, especially dashed lines - so fast that his...
Dot-drawing with drones
You may have heard of plans to use drones for delivering packages, monitoring wildlife, or tracking storms. But painting murals? That's the idea behind a project in Paul Kry's laboratory...
James Cheshire & Oliver Uberti: "Where the Animals Go: Tracking Wildlife [...]" | Talks at Google
For thousands of years, tracking animals meant following footprints. Now satellites, drones, camera traps, cellphone networks, and accelerometers reveal the natural world as never before. Where...
Numerical Tools for Non-Experts
Developing reliable numerical software has traditionally been a tedious process which requires significant expertise. Recently, our team at the University of Washington has been investigating...
Aaron Koblin: Artfully visualizing our humanity
Artist Aaron Koblin takes vast amounts of data -- and at times vast numbers of people -- and weaves them into stunning visualizations. From elegant lines tracing airline..
Villain for Pen shader Blender
This is a character for a game I'm making for my honours year at Abertay University. Made in Blender 2.49b. For more info, here's the blog about the project: