AI News

Mercedes Tries to Conquer the Last Mile With Cute Delivery Drones and Bots

Mercedes-Benz is experimenting with small bots to carry cargo over that last, pesky mile to the customer, using a human-driven van to ferry them over the previous stretch.

The two kinds of bots, aerial and terrestrial, are being supplied by companies we’ve already written about. Matternet is providing its slick M2 quadcopter, several of which would perch on the van’s roof. Starship Technologies is providing its six-wheel robot, eight of which can fit inside a van.

“It can drop and reload a payload and battery without human interaction,” the company says, “and features a smart payload box that can transmit data about its contents and destination. It also has precision landing capabilities and captures proof of delivery.” However, the M2 follows a predetermined route and apparently does not have the ability to sense and avoid unexpected obstacles, like a passing bird or a recently fallen branch.

Forget Drones: Meet The Robot That Could Be The Future of Deliveries

Right now on Mars, NASA’s Curiosity Rover is busy surveying the planet’s surface, capturing photos of another world and ultimately trying to help us understand our place in the universe.

Launched in 2014 by Skype co-founders Ahti Heinla and Janus Friis, Starship Technologies hopes its robot will solve the “last mile” problem: how to get whatever you’ve ordered - be it take-away, groceries, or parcels - to your front door as cheaply as possible.

On top of this, autonomous cars have a lot of data and intelligence they can plug into: Roads have been extensively mapped by the likes of Google and Uber - meaning that all of the important obstacles and road markings are already digitally coded.

The company tells me that the intention is to eventually make 99% of deliveries fully automated - with the last 1% of navigation handled by remote teams of operators, who make the tricky decisions about unexpected obstacles or difficult situations.

It sounds as though it will work a bit like a call centre - with bots that find themselves stuck or stationary for an extended period of time automatically being flagged up to people stationed in London, San Francisco or Thailand who can then figure things out.
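Starship hasn't published how that flagging works, but the behaviour described, a bot that has been stuck or stationary for too long getting escalated to a remote operator, can be illustrated with a minimal sketch. The Bot fields, the two-minute threshold and the flag_to_operator helper below are all hypothetical, chosen only to mirror the description above.

```python
from dataclasses import dataclass
import time

# Hypothetical threshold: how long a bot may sit still before a human is asked to help.
STUCK_THRESHOLD_S = 120

@dataclass
class Bot:
    bot_id: str
    last_moved_at: float   # unix timestamp of the last recorded movement
    flagged: bool = False

def flag_to_operator(bot: Bot) -> None:
    # Placeholder for handing the bot over to a remote operations centre
    # (London, San Francisco or Thailand in the article's description).
    print(f"bot {bot.bot_id} flagged for remote assistance")
    bot.flagged = True

def check_fleet(bots, now=None):
    """Flag any bot that has been stationary longer than the threshold."""
    now = time.time() if now is None else now
    for bot in bots:
        if not bot.flagged and now - bot.last_moved_at > STUCK_THRESHOLD_S:
            flag_to_operator(bot)

# Example: a bot that last moved ten minutes ago gets flagged.
check_fleet([Bot("starship-01", last_moved_at=time.time() - 600)])
```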

In an almost suspicious piece of good timing, a group of students from nearby Ravensbourne College spotted the robot and subsequently said things like “cool dude”, as young people do.

So far, there have been no attempts by anyone to steal either the robots or their payloads, though Starship does reckon it will inevitably happen as the company grows - which is why, as well as the tracking gear, the robot also has a built-in alarm.

The company came out of what they call “stealth mode” last month, when they first unveiled the robot, and is now engaged in a second phase of testing with a number of commercial partners around the world.

Interestingly, much like Uber has discovered, Starship has found different countries and cities have radically different rules on whether you’re allowed to fire up your autonomous robot and send it racing down the pavement.

“Anyone can build a 100-grand robot in a lab,” the spokesperson told me, and while he wouldn’t give me an exact figure, he said the aim is for each robot to cost roughly the same as a high-end mobile phone.

The reason appears to be that the robot takes advantage of the boom in mobile components - such as miniaturised, high-quality cameras and chips - that has also powered the drone and Internet of Things booms.

The relatively low unit price makes sense for the business model: if there are eventually going to be thousands of these roaming our streets, they will need maintenance and replacement, and no commercial partner is going to take a risk on something that would be very expensive to lose if it were, say, hit by a car.

And though these batteries should be plenty for short, sub-two-mile deliveries, they could conceivably be swapped out for something with a larger capacity - there is no technical need to keep weight to a minimum.

Given that the robots will still be partially human operated, and given the maintenance needs, if the company scales as it hopes, it will require hundreds or thousands of new employees at ground level and in operations centres to keep the fleet moving.

Friendly Neighborhood Delivery Drones Target Iceland

Delivery drones are carrying customer orders for burgers and smartphones across a bay of water straddled by the Icelandic capital of Reykjavik—and that’s just the start of a much more ambitious plan.

No company can yet claim to have solved all the problems facing delivery drones, which may have to navigate around trees or power lines while delivering packages close to a home or business.

Flytrex drones are currently delivering certain customer orders on behalf of AHA, an Icelandic e-commerce company that handles online marketplace and delivery transactions for restaurants, retailers and grocery stores.

For example, the uncomplicated flight route and lack of obstacles mean that Flytrex can automate the drones and allow them to deliver packages without direct human piloting or control.

The company began the Reykjavik operation last month with one or two drones making up to 20 deliveries per day, but planned to ramp up to as many as 40 or 50 flights per day.

To ensure a quick turnaround, people on the ground stand ready to swap in fresh batteries for the delivery drones and keep multiple packs of batteries always charging.

But the company seems to be betting that sending drones to a set number of street corner locations will still be fairly straightforward. “We’re going to start with a few tens or hundreds of fixed locations, such as street corners, so that you can choose the nearest street corner to get your delivery.”
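Flytrex hasn't said how customers would be matched to those fixed locations, but "choose the nearest street corner" is simple to sketch: compute the great-circle distance from the customer to each fixed drop-off point and pick the smallest. The coordinates and function names below are purely illustrative, not anything Flytrex has published.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearest_dropoff(customer, dropoffs):
    """Return the fixed drop-off point closest to the customer's location."""
    return min(dropoffs, key=lambda d: haversine_km(*customer, *d))

# Illustrative coordinates around central Reykjavik (not real Flytrex locations).
corners = [(64.1466, -21.9426), (64.1355, -21.8954), (64.1283, -21.8270)]
print(nearest_dropoff((64.1400, -21.9000), corners))
```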

A human pilot might be required to check each delivery drone’s drop-off point and take over in piloting if trees or power lines present more complicated obstacles.

Furthermore, many delivery drones that rely on multi-rotor designs for vertical takeoff and landing still have somewhat limited delivery range and payload weight because of battery life.

Drones Going Postal – A Summary of Postal Service Delivery Drone Trials

Earlier this year in the Bavarian mountains, DHL completed a three-month test of its automated drone delivery system, dubbed the Parcelcopter.

A total of 130 autonomous loading/unloading cycles were completed during the trial, with the drone covering 8 km (5 miles) of mountain terrain within 8 minutes of take-off.
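As a quick sanity check on those trial figures, covering 8 km within 8 minutes of take-off implies an average ground speed of roughly 60 km/h; the snippet below just restates that arithmetic.

```python
# DHL reported the Parcelcopter covering 8 km within 8 minutes of take-off.
distance_km = 8.0
flight_time_h = 8.0 / 60.0
print(f"average ground speed ≈ {distance_km / flight_time_h:.0f} km/h")  # ≈ 60 km/h
```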

The Skyport, which weighs 14.5 tons, measures 5 x 5 x 3.5 meters (16.4 x 16.4 x 11.5 feet) and can handle up to two drones which automatically align to a 3 x 3 meter (9.8 x 9.8 feet) landing pad.

Curiosity (rover)

Curiosity is a car-sized rover designed to explore Gale Crater on Mars as part of NASA's Mars Science Laboratory mission (MSL).[3] Curiosity was launched from Cape Canaveral on November 26, 2011, at 15:02 UTC aboard the MSL spacecraft and landed on Aeolis Palus in Gale Crater on Mars on August 6, 2012, 05:17 UTC.[7][8][13] The Bradbury Landing site was less than 2.4 km (1.5 mi) from the center of the rover's touchdown target after a 560 million km (350 million mi) journey.[9][14] The rover's goals include an investigation of the Martian climate and geology;

If the specimen warrants further analysis, Curiosity can drill into the boulder and deliver a powdered sample to either the SAM or the CheMin analytical laboratories inside the rover.[57][58][59] The MastCam, Mars Hand Lens Imager (MAHLI), and Mars Descent Imager (MARDI) cameras were developed by Malin Space Science Systems and they all share common design components, such as on-board electronic imaging processing boxes, 1600×1200 CCDs, and an RGB Bayer pattern filter.[60][61][62][63][64][65] It has 17 cameras: HazCams (8), NavCams (4), MastCams (2), MAHLI (1), MARDI (1), and ChemCam (1).[66] The MastCam system provides multiple spectra and true-color imaging with two cameras.[61] The cameras can take true-color images at 1600×1200 pixels and up to 10 frames per second hardware-compressed video at 720p (1280×720).[67] One MastCam camera is the Medium Angle Camera (MAC), which has a 34 mm (1.3 in) focal length, a 15° field of view, and can yield 22 cm/pixel (8.7 in/pixel) scale at 1 km (0.62 mi).

The other camera in the MastCam is the Narrow Angle Camera (NAC), which has a 100 mm (3.9 in) focal length, a 5.1° field of view, and can yield 7.4 cm/pixel (2.9 in/pixel) scale at 1 km (0.62 mi).[61] Malin also developed a pair of MastCams with zoom lenses,[68] but these were not included in the rover because of the time required to test the new hardware and the looming November 2011 launch date.[69] However, the improved zoom version was selected to be incorporated on the upcoming Mars 2020 mission as Mastcam-Z.[70] Each camera has eight gigabytes of flash memory, which is capable of storing over 5,500 raw images, and can apply real time lossless data compression.[61] The cameras have an autofocus capability that allows them to focus on objects from 2.1 m (6 ft 11 in) to infinity.[64] In addition to the fixed RGBG Bayer pattern filter, each camera has an eight-position filter wheel.
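The pixel scales quoted for the two MastCams follow from simple geometry: ground sample distance ≈ (detector pixel pitch ÷ focal length) × range. The sketch below reproduces the 22 cm/pixel and 7.4 cm/pixel figures at 1 km, assuming a 7.4 µm pixel pitch for the CCDs; that pitch is an assumption not stated in the excerpt above.

```python
PIXEL_PITCH_M = 7.4e-6  # assumed CCD pixel pitch; not given in the text above

def ground_sample_cm(focal_length_mm: float, range_m: float = 1000.0) -> float:
    """Approximate ground scale (cm/pixel) at the given range."""
    return PIXEL_PITCH_M / (focal_length_mm / 1000.0) * range_m * 100.0

print(f"MAC (34 mm):  {ground_sample_cm(34):.0f} cm/pixel at 1 km")   # ≈ 22 cm/pixel
print(f"NAC (100 mm): {ground_sample_cm(100):.1f} cm/pixel at 1 km")  # ≈ 7.4 cm/pixel
```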

The ChemCam instrument suite was developed by the French CESR laboratory and the Los Alamos National Laboratory.[71][72][73] The flight model of the mast unit was delivered from the French CNES to Los Alamos National Laboratory.[74] The purpose of the LIBS instrument is to provide elemental compositions of rock and soil, while the RMI will give ChemCam scientists high-resolution images of the sampling areas of the rocks and soil that LIBS targets.[71][75] The LIBS instrument can target a rock or soil sample up to 7 m (23 ft) away, vaporizing a small amount of it with about 50 to 75 five-nanosecond pulses from a 1067 nm infrared laser and then observing the spectrum of the light emitted by the vaporized rock.[76] ChemCam has the ability to record up to 6,144 different wavelengths of ultraviolet, visible, and infrared light.[77] Detection of the ball of luminous plasma will be done in the visible, near-UV and near-infrared ranges, between 240 nm and 800 nm.[71] The initial laser testing of the ChemCam by Curiosity on Mars was performed on a rock, N165 ('Coronation' rock), near Bradbury Landing on August 19, 2012.[78][79][80] The ChemCam team expects to take approximately one dozen compositional measurements of rocks per day.[81] Using the same collection optics, the RMI provides context images of the LIBS analysis spots.
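One way to put those ChemCam numbers in perspective: 6,144 recorded wavelengths spanning 240 nm to 800 nm works out to an average spacing of roughly 0.09 nm per channel. The snippet below is just that arithmetic; the instrument's actual optical resolution varies across its spectral ranges.

```python
channels = 6144
span_nm = 800 - 240
print(f"average channel spacing ≈ {span_nm / channels:.3f} nm")  # ≈ 0.091 nm
```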

REMS will provide new clues about the Martian general circulation, microscale weather systems, local hydrological cycle, destructive potential of UV radiation, and subsurface habitability based on ground-atmosphere interaction.[86] The rover has four pairs of black-and-white hazard avoidance cameras called hazcams, two pairs in the front and two pairs in the back.[82][88] They are used for autonomous hazard avoidance during rover drives and for safe positioning of the robotic arm on rocks and soils.[88] Each camera in a pair is hardlinked to one of two identical main computers for redundancy;

and mechanisms for scooping, sieving, and portioning samples of powdered rock and soil.[116][118] The diameter of the hole in a rock after drilling is 1.6 cm (0.63 in) and up to 5 cm (2.0 in) deep.[117][120] The drill carries two spare bits.[120][121] The rover's arm and turret system can place the APXS and MAHLI on their respective targets, and also obtain powdered sample from rock interiors, and deliver them to the SAM and CheMin analyzers inside the rover.[117] Since early 2015 the percussive mechanism in the drill that helps chisel into rock has had an intermittent electrical short.[122] On December 1, 2016, the motor inside the drill caused a malfunction that prevented the rover from moving its robotic arm and driving to another location.[123] The fault was isolated to the drill feed brake,[124] and internal debris is suspected of causing the problem.[122] By December 9, driving and robotic arm operations were cleared to continue, but drilling remained suspended indefinitely.[125] The Curiosity team continued to perform diagnostics and testing on the drill mechanism throughout 2017.[126] Curiosity has an advanced payload of scientific equipment on Mars.[53] It is the fourth NASA unmanned surface rover sent to Mars since 1996.

Curiosity is 2.9 m (9.5 ft) long by 2.7 m (8.9 ft) wide by 2.2 m (7.2 ft) in height,[24] larger than Mars Exploration Rovers, which are 1.5 m (4.9 ft) long and have a mass of 174 kg (384 lb) including 6.8 kg (15 lb) of scientific instruments.[23][127][128] In comparison to Pancam on the Mars Exploration Rovers, the MastCam-34 has 1.25× higher spatial resolution and the MastCam-100 has 3.67× higher spatial resolution.[64] The region the rover is set to explore has been compared to the Four Corners region of the North American west.[129] Gale Crater has an area similar to Connecticut and Rhode Island combined.[130] Colin Pillinger, leader of the Beagle 2 project, reacted emotionally to the large number of technicians monitoring Curiosity's descent, because Beagle 2 had only four people monitoring it.[131] The Beagle 2 team made a virtue out of necessity;

Curiosity, on the other hand, was active when it touched down on the surface of Mars, employing the rover suspension system for the final set-down.[142] Curiosity transformed from its stowed flight configuration to a landing configuration while the MSL spacecraft simultaneously lowered it beneath the spacecraft descent stage with a 20 m (66 ft) tether from the 'sky crane' system to a soft landing—wheels down—on the surface of Mars.[143][144][145][146] After the rover touched down it waited 2 seconds to confirm that it was on solid ground then fired several pyros (small explosive devices) activating cable cutters on the bridle to free itself from the spacecraft descent stage.

The NASA website momentarily became unavailable from the overwhelming number of people visiting it,[149] and a 13-minute NASA excerpt of the landings on its YouTube channel was halted an hour after the landing by an automated DMCA takedown notice from Scripps Local News, which prevented access for several hours.[150] Around 1,000 people gathered in New York City's Times Square, to watch NASA's live broadcast of Curiosity's landing, as footage was being shown on the giant screen.[151] Bobak Ferdowsi, Flight Director for the landing, became an Internet meme and attained Twitter celebrity status, with 45,000 new followers subscribing to his Twitter account, due to his Mohawk hairstyle with yellow stars that he wore during the televised broadcast.[152][153] On August 13, 2012, U.S. President Barack Obama, calling from aboard Air Force One to congratulate the Curiosity team, said, 'You guys are examples of American know-how and ingenuity.