Adept Introduces Lynx Autonomous Mobile Platform

This is Lynx, a brand new (and very slick) little mobile robot from Adept.

Here is what Lynx can do for you, without any complicated or expensive deployment plan or infrastructure: just set it loose, load it up with 60 kilos of whatever you want, tell it where to go, and forget about it.

We did learn, however, that those circular panels on the sides are backed by LEDs that can be programmed to indicate the behavior of the robot (or what the robot is 'thinking'), and that you'll be able to add 'skins' to it if you want to change its appearance.

This may seem a bit weird to those of us who like robots and are comfortable with them, but in the markets that Lynx is designed to break into, robots come with a lot of perceptual baggage.

Robots and Healthcare Saving Lives Together

From radiation treatment to eye surgery, rehabilitation to hair transplantation, robot therapists to robotic pharmacists, and even a robot phlebotomist, healthcare robots are transforming the field of medicine across the globe.

The teleoperated robot-assisted surgical system has been used successfully on millions of patients since it was cleared by the U.S. Food and Drug Administration (FDA) in 2000.

And as a new age dawns for robots designed to work collaboratively with humans, medical applications will only gain momentum.

It’s certainly not a replacement for its human operator, and most industry insiders agree that we’re a long way off from true autonomy.

The device uses the KUKA LBR iiwa robot to guide the laser beam to the precise location for the bone ablation procedure.

Compared to the conventional method of using an oscillating saw to cut through the bone, robot-guided laser osteotomy is reported to provide more precise cutting geometries, thereby minimizing the amount of ablated bone and thermal damage.

It also reduces soft tissue damage, promotes faster healing, and allows for complex 3D reconstruction geometries currently achievable only with robots. It’s expected to be used in all forms of osteotomies, starting with craniomaxillofacial surgery.

The KUKA LBR iiwa (LBR stands for Leichtbauroboter, which is German for lightweight robot) represents a new breed of robots designed to be inherently safe out of the box, without the need for elaborate safety fencing common to many industrial robots.

Ryan says they have research projects at various universities around the world studying the use of KUKA’s lightweight, collaborative robot for various medical procedures.

“There’s lots of research and we have four or five OEMs trying to bring products to market using the lightweight robot.

“The nice thing about the ultrasound scanning is that the robot is force controlled, so the robot will compensate for the patient’s breathing and still keep consistent force,” says Ryan.
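The control idea behind that compensation is simple to sketch: regulate the measured contact force around a setpoint by nudging the probe along its contact normal. Below is a minimal illustration of such a force-hold loop; the sensing and motion calls are hypothetical stand-ins, not KUKA’s actual API.

```python
# Minimal sketch of a constant-force contact loop, as used conceptually in
# robotic ultrasound scanning. read_contact_force() and
# move_probe_toward_patient() are hypothetical stand-ins for a real robot API.

TARGET_FORCE_N = 5.0   # desired probe contact force in newtons (assumed value)
GAIN_M_PER_N = 0.0005  # proportional gain: meters of correction per newton

def force_hold_step(read_contact_force, move_probe_toward_patient):
    """Run one iteration of a proportional force-hold loop.

    If the patient exhales and contact lightens, the error is positive
    and the probe advances; if an inhale presses the chest into the
    probe, the error goes negative and the probe backs away.
    """
    error = TARGET_FORCE_N - read_contact_force()
    move_probe_toward_patient(GAIN_M_PER_N * error)
```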

He says researchers in Europe are also using the lightweight robot for needle biopsy and minimally invasive surgical procedures.

So we are fully redundant in our sensing and in our logic, which is what the RIA Standard for safety requires to meet the minimum specification for human-robot collaboration.

This design gives the robot exceptional flexibility, especially in tight spaces or when working with its human collaborators.

At 2:10 into the clip, you see a researcher demonstrate the advantages of kinematic redundancy with hand-guided null space movement.
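Null space movement has a compact mathematical expression: a 7-axis arm has one more joint than a 6-D tool task requires, so joint motion can be projected into the Jacobian’s null space, where it leaves the tool undisturbed. A small NumPy sketch of the standard resolution, assuming the 6 x 7 Jacobian J comes from the robot’s kinematic model:

```python
import numpy as np

def redundant_joint_velocity(J, tool_velocity, preferred_joint_motion):
    """Resolve joint velocities for a kinematically redundant arm.

    J:                      6 x 7 Jacobian from the robot model (assumed given)
    tool_velocity:          desired 6-D velocity of the tool
    preferred_joint_motion: a secondary joint motion, e.g. from hand guiding

    The pseudoinverse term realizes the tool motion; the projector
    (I - J+ J) confines the secondary motion to the null space, so the
    elbow can swing freely while the tool stays put, which is exactly
    the hand-guided null space behavior shown in the clip.
    """
    J_pinv = np.linalg.pinv(J)
    null_space_projector = np.eye(J.shape[1]) - J_pinv @ J
    return J_pinv @ tool_velocity + null_space_projector @ preferred_joint_motion
```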

Shorter Development, Faster to Market

Other features of the KUKA LBR robot also lend themselves to healthcare applications, and provide advantages for medical technology OEMs and startups in particular.

“We can do gravity compensation, where you put the payload onto the end of the robot and the robot treats it as if it’s part of the robot and only looks for small external forces,” says Ryan.
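Conceptually, gravity compensation subtracts the payload’s predicted weight from the wrist sensor reading so that only small external contact forces remain visible. A simplified sketch of that bookkeeping (torque about the payload’s center of mass is omitted, and all names are invented for the example):

```python
import numpy as np

GRAVITY = 9.81  # m/s^2

def external_force(measured_force_tool, payload_mass_kg, tool_rotation_world):
    """Remove a known payload's weight from a wrist force reading.

    measured_force_tool: (Fx, Fy, Fz) measured in the tool frame
    payload_mass_kg:     mass entered when the payload is mounted
    tool_rotation_world: 3x3 rotation of the tool frame in the world frame

    Whatever is left after the subtraction is the small external force
    the robot actually reacts to; the payload itself has been absorbed
    into the robot's own model, as the quote describes.
    """
    weight_world = np.array([0.0, 0.0, -payload_mass_kg * GRAVITY])
    weight_tool = tool_rotation_world.T @ weight_world  # weight in tool frame
    return np.asarray(measured_force_tool, dtype=float) - weight_tool
```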

He says these numerous options and features are the reason technology developers buy the KUKA lightweight robot for medical applications: it can save significant development time.

This FDA-cleared system turns a once painful, drawn-out process for people suffering from pattern baldness or other kinds of hair loss into a more precise and comfortable procedure with no scarring and better outcomes.

In late 2006 the hair transplant system’s developer, Restoration Robotics, contacted Stäubli looking for an industrial arm that could handle the high performance levels of this unique application.

“They had their patents to do hair transplants with a six-axis industrial robotic arm, but they couldn’t find a robot that complied with the safety regulations of the time or achieved the sheer performance the application demanded.”

“It uses stereovision mounted on the end-of-arm tooling to locate a single hair follicle on the back of your head, fire a needle, and auger that hair out.”

He says the old way, called the strip method, was to take a large blade and remove a long strip of scalp from the back of the head, dissect the hair follicles, and implant them.

Woods says Point Grey provides the stereovision camera and Restoration Robotics developed all the algorithms that help determine the best follicles for harvesting.

This is crucial when you’re working with submillimeter accuracy, extracting thousands of single hair follicles one after another.

“I’ve seen videos of the best surgeons in the world working inside of the eyeball and then I’ve seen video of our robot going inside a human eyeball, rotating and removing the cataract.”

“It is shocking the difference in how precise that robot holds its path and how steady it holds the apparatus compared to some of the best surgeons in the world.

“We found that we couldn’t hold that level of precision when we were fully extended at high speed, but if we work in a small envelope when the robot is in a comfortable position, we can hold those kinds of numbers.”

Large-Scale Robotics in the Hybrid OR

Robotics is transforming our hospital ORs into highly flexible environments where diagnostics, interventional therapies, and surgical procedures can be accommodated in one room, making patient care more efficient.

“But it does require external sensors, like vision systems, to make sure the robot is running at the speed it’s forecasting and everything is operating correctly.”

The zeego is the apex of automation for that, so they can image from here to here and they don’t have to align it to the table or figure out the reach.

Healthcare professionals are not robot programmers, so medical technology manufacturers like Siemens provide the intuitive interface that makes the robotic technology easy to use.

In large cities there is no place to put horses, so it came down to: can I put a person at the end of a robot and simulate that motion?

“In medical, it’s generally someone on the research side who has discovered that some procedure would be beneficial to a patient, or they want to test a theory.”

Rehab Robots with Measurable Results

From equine-inspired robotic therapists to robot arms for physical therapy and neuroscience research, rehabilitation robotics is growing rapidly.

Researchers at SRI International in Silicon Valley are working on the softer side of wearable robotics with hopes that someday children with muscular dystrophy may walk taller and stronger with the aid of Superflex, a soft robotics bodysuit.

Coupled with immersive 3D video gaming, the robotic arms turn rehabilitation into a new, engaging experience.

We have a partner in Germany that just developed a system for doing knees, where the patient can push on the robot with their leg and the robot will adjust the angle of applied force to minimize the forces on the injured area.

It allows athletes and patients to maintain their muscle tone after knee surgery or a knee injury.

In a typical week, a 200-bed hospital has to move 10,000 medication orders, 4,500 meals, 83,000 pounds of linens, and 70,000 pounds of trash.

Aethon’s TUG is an autonomous mobile robot designed to transport the tons of goods and supplies that keep a hospital running.

“Hospitals are starting to realize that they’ve done a lot to manage the logistics from outside of their building to their dock, but they’ve done very little to manage the logistics inside of their organization.”

“So when a nurse is waiting for something, or they’re trying to get a room turned over, or a patient is waiting for a meal, or they’re trying to get medication to a patient, you’re in this indoor city and you’re trying to move all this stuff around.”

Multiple sensors, including sonar, infrared and a SICK laser scanner, help it navigate autonomously and safely among its biped coworkers and hospital visitors.

It freely navigates hospital corridors and service areas without needing to follow magnets or painted stripes on the floor, or requiring any special infrastructure.

“It’s another thing entirely for the robot to be able to control elevator systems, or to respond to alarm conditions, or to have two-way communication with the users.

“Or to have a fleet of robots that are interchangeable among departments, sharing duties and managing when those robots are dispatched, what their next job is, and whether they are dedicated to a certain department at certain times of the day.”

“One of the things that has allowed us to manage that type of customer base and to be successful is our cloud command center, which allows us to stay connected to all the robots that are installed in the customer base.

“So if a robot encounters a problem, an algorithm detects the problem and our support technicians here can respond to that condition.

“The result is that over 97 percent of all alerts that the robots make to the command center can be resolved remotely without any involvement by the customer on site.”

“We get a signal and we’re able to either address the problem remotely or in the three percent of situations where we can’t deal with it remotely, we can dispatch one of the customer employees to move the bed or move the robot.”

“It’s really a very important part of our technology platform, because to make an autonomous robot commercially viable, meaning it is affordable enough to purchase because it has return on investment, you can’t necessarily build every single recovery mode into the robot that would be necessary in the field.”

At initial deployment, Aethon creates a shared, common map of the hospital and a route definition for all the robots in the hospital’s fleet.

Robot Rx

President and CEO Aldo Zini, who joined Aethon shortly after it was established, has roots at Automated Healthcare, which developed ROBOT-Rx, the first robotic medication dispensing system for hospitals.

So we inserted hardware and software in that black hole, so that every single step along the way the medication has a chain of custody.

This draws an interesting parallel with the industrial sector where there’s a strong motivation to robotize the dull, dirty and dangerous jobs.

“Part of this is due to running around the hospital with these very large carts pulling very large loads, and sometimes pulling two of them at a time.”

Aethon is testing an additional platform with an omnidirectional base and four independently driven wheels that will allow it to pivot on its own axis and move sideways and diagonally, providing more maneuverability when needed.

The InTouch Vita telemedicine robot (pictured), developed by iRobot and InTouch Health, connects physicians with their patients no matter where they are, thanks to mobile telepresence technology.

From mobile patient care and robotic surgical suites, to rehab arms and blood-drawing bots, healthcare robots come in many forms and form factors.

Robot Phlebotomist

Stäubli’s TX40 robot, a smaller version of the TX60 used for hair transplant, is the primary component in the Veebot automated venipuncture system.

This robotic phlebotomist uses infrared sensors to locate an appropriate vein in a patient’s arm, then ultrasound to detect blood flow, and the smooth precision of the robot arm to insert the needle in the right location.
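That sense-confirm-insert sequence maps naturally onto a staged pipeline. The sketch below is purely illustrative: every callable is a hypothetical stand-in for the corresponding subsystem, not Veebot’s actual software.

```python
# Illustrative staging of an automated venipuncture sequence. The three
# callables are hypothetical stand-ins for the real subsystems.

def draw_blood(find_veins_infrared, ultrasound_confirms_flow, insert_needle,
               arm_image):
    """Select a vein and insert the needle, best candidate first."""
    # Stage 1: infrared imaging proposes candidate veins, ranked by score.
    candidates = sorted(find_veins_infrared(arm_image),
                        key=lambda v: v["score"], reverse=True)
    for vein in candidates:
        # Stage 2: ultrasound must confirm blood flow before committing.
        if ultrasound_confirms_flow(vein):
            # Stage 3: the arm places the needle at the located site.
            insert_needle(vein["position"], vein["angle"])
            return vein
    raise RuntimeError("no suitable vein confirmed; defer to a human")
```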

With billions of blood draws performed every year, a robotic solution could help reduce human errors that can lead to patient discomfort, significant time and cost inefficiencies from re-draws, and even the spread of infection.

One hospital has a mobile robot fleet transporting supplies through its corridors, a Stäubli robot dispensing chemotherapy drugs, and a suite of radiology robots.

For a glimpse into the crystal ball, watch this internationally renowned surgeon and researcher discuss the future of surgery, where advanced imaging, simulation, robotics, and artificial intelligence will revolutionize the art and science of medicine for better patient care.

Intelligent Robots: A Feast for the Senses

Advances in vision systems, force and tactile sensors, speech recognition, and even olfactory receptors are creating high-achieving robots able to do things and go places that their predecessors could only dream about.

Machine vision technology, laser scanners, structured-light 3D scanners, and the imaging and mapping software to support them are making their way into more applications, which is opening doors to robots in more industries.

The machine vision market is coming off two consecutive years of double-digit growth and is expected to continue that upward trend.

We have enough control of that area that we don’t have to have sensors on the robots and that’s allowed us to penetrate 10 percent of the manufacturing industry.

“If we look at sensor-based robotics, we have the technology in many cases to be able to engineer solutions for picking particular objects and to do it very well,” he says, referring to the e-commerce giant’s recent competition to spur technological advances in automated picking for unstructured environments.

He credits lower costs and higher computing power as the primary drivers of the upsurge in sensor adoption.

It’s coming out of cheap cameras for cell phones, where today you can buy a camera for a cell phone for $8 to $10.

Hoping to bypass the need for tedious camera calibration, researchers use visual servoing to control the motion of a robot manipulator using visual feedback signals from a vision system.
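In the classic image-based formulation, the controller commands a camera velocity proportional to the error measured directly in the image, never reconstructing the object’s 3-D pose, which is what lets it sidestep careful calibration. A minimal sketch, assuming the interaction matrix (image Jacobian) is supplied by the vision model:

```python
import numpy as np

LAMBDA = 0.5  # convergence gain (assumed value)

def ibvs_camera_velocity(features, desired_features, interaction_matrix):
    """Classic image-based visual servoing control law.

    features / desired_features: current and goal image measurements,
        e.g. stacked pixel coordinates of tracked points
    interaction_matrix: maps camera velocity to feature velocity

    The command v = -lambda * pinv(L) @ (s - s*) drives the image error
    to zero, steering the manipulator from visual feedback alone.
    """
    error = np.asarray(features, float) - np.asarray(desired_features, float)
    return -LAMBDA * np.linalg.pinv(interaction_matrix) @ error
```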

“One of the things we’ve been working on extensively is the ability to take a vision system and a CAD model, and with those two things, basically build a vision system that will do reliable tracking.”

“Right now, we have a project where we’re using the vision system, in this case calibrated, to get to an accuracy that’s better than 0.1 millimeter, even for very large robots, either for automotive or for aerospace.

“Getting those sorts of trays or kits built automatically rather than having it done using manual labor is something that has a lot of interest in electronics, automotive and aerospace.”

“Nobody could believe that we can find objects in three dimensional space in six degrees of freedom just based on an image or a picture of the object.”

The key is in the Cortex Recognition software, a collection of patented visual recognition algorithms developed by the company’s founder and based on the human cognitive ability to recognize objects.

“It determines what the object is and where it is in space, in six degrees of freedom (x, y, z offset, as well as the rotation about those axes, Rx, Ry and Rz).”

As demonstrated in this video, this partial visibility recognition makes the system very good at seeing deformable objects, such as chip bags.

The new Cortex Random Picking is an advanced version of Robeye with software and algorithm enhancements to allow for the recognition of randomly placed parts.

“Because we don’t have all the horsepower we need, today it is only used for 2D, 2-1/2D and what we’re calling 4 degrees of freedom guidance, so that’s x, y and z, plus Rz (not full 6 degrees of freedom like Robeye).

“In the future, once we refine our algorithm and figure out how to take advantage of the quad core processing power inside of RAIO, we feel we’ll be able to offer six degrees of freedom guidance inside that small package.”

Guidance is needed because the build tolerances of the vehicle allow the position of the nut to float greater than the compliance value of the bolt installer.
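Put another way, vision only has to step in when the measured misalignment exceeds what the installer’s mechanical float can absorb. A toy illustration of that decision, with the compliance figure assumed for the example:

```python
def nut_runner_correction(measured_offset_mm, tool_compliance_mm=1.0):
    """Decide whether vision guidance must reposition the bolt installer.

    measured_offset_mm: (x, y) deviation of the nut from nominal, from vision
    tool_compliance_mm: misalignment the installer absorbs mechanically
                        (the 1.0 mm default is an assumed figure)

    Within compliance, the tool's own float handles the error; beyond it,
    the robot is offset by the measured amount before driving the bolt.
    """
    x, y = measured_offset_mm
    if (x * x + y * y) ** 0.5 <= tool_compliance_mm:
        return (0.0, 0.0)  # float alone can absorb the misalignment
    return (-x, -y)        # command the robot to cancel the offset
```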

Applications range from picking parts out of totes to assemble valves and components for air bags, to a snack food company that wants to randomly pick bags of product from a bin with a mobile robot and load them into distribution boxes.

Meanwhile, KUKA Robotics Corporation’s LBR iiwa is turning heads as it appears to float across the factory floor on its new mobile platform.

SLAM addresses the challenges of a mobile robot building a map of an unknown environment while at the same time navigating that environment using the map.
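The defining trick is that the map lives inside the estimator’s state, so every measurement refines both the map and the robot’s place within it. A deliberately tiny one-dimensional EKF-SLAM cycle, one robot and one landmark, shows that structure:

```python
import numpy as np

def slam_1d_step(x, P, u, z, Q=0.01, R=0.04):
    """Run one predict/update cycle of a toy 1-D EKF-SLAM.

    x: state estimate [robot_position, landmark_position]
    P: 2x2 covariance of that estimate
    u: odometry (commanded forward motion), noise variance Q
    z: measured range from robot to landmark, noise variance R

    Because the landmark is part of the state vector, the update below
    corrects the map and the robot pose simultaneously.
    """
    # Predict: only the robot moves; odometry noise inflates its variance.
    x = x + np.array([u, 0.0])
    P = P + np.diag([Q, 0.0])

    # Update: the measurement model is z = landmark - robot.
    H = np.array([[-1.0, 1.0]])
    innovation = z - (x[1] - x[0])
    S = H @ P @ H.T + R            # innovation variance
    K = P @ H.T / S                # Kalman gain, shape (2, 1)
    x = x + (K * innovation).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```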

SLAM has emerged from the research labs and found practical use, most notably in self-driving cars and, more recently, a line of robot vacuums by Neato Robotics.

One of those companies is Fetch Robotics, which just secured $20 million in new funding to ramp up the launch and development of its Fetch and Freight system for warehouse logistics automation.

“So if you look at the front of one of the Fetch robots, you will see a little black inverted cone near the bottom (above the gray-colored base).”

This video shows the Fetch and Freight system in action with the SICK TiM laser scanner on board guiding each mobile robot independently.

According to this IEEE Spectrum article, the Fetch mobile manipulator is also using a PrimeSense 3D sensor in its head to locate product on shelves and place it in the bin of its faithful sidekick, Freight.

You may have noticed TiM’s heavyweight predecessors from the early days of the DARPA Urban Challenge when 90 percent of the self-driving cars, including the winning teams, had SICK laser scanners on board.

“There are a lot of technologies like stereovision and structured-light cameras that do a good job in ideal conditions, in lab conditions, and I’m really looking forward to seeing what those types of technologies can do in the future.”

“But as they are right now, as soon as you go away from ideal, into low light or even no light, or precipitation or fog, all of a sudden those sensors really start to drop off.

“So when a customer designs a robot and they need it to work in all conditions, they find that LIDAR or laser scanning doesn’t have the same issues with environment as some of these other technologies do.”

As the mirror spins, the laser scans a viewing angle between 70 and 360 degrees, effectively creating a “fan” of laser light across its field of view.

SICK laser scanners go up to 100 hertz, meaning the mirror makes up to 100 full revolutions per second.
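Turning one revolution of that fan into usable geometry is a polar-to-Cartesian conversion. A minimal sketch, with a 270-degree field of view assumed for the example and the readings taken as evenly spaced in angle:

```python
import math

FIELD_OF_VIEW_DEG = 270  # assumed aperture within the 70-360 degree range

def scan_to_points(ranges_m):
    """Convert one mirror revolution's range readings to 2-D points.

    At 100 hertz, a fresh fan of readings like this arrives every
    10 milliseconds, which is what lets the robot react to obstacles
    in real time.
    """
    start = math.radians(-FIELD_OF_VIEW_DEG / 2)
    step = math.radians(FIELD_OF_VIEW_DEG) / (len(ranges_m) - 1)
    return [(r * math.cos(start + i * step), r * math.sin(start + i * step))
            for i, r in enumerate(ranges_m)]
```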

“We also do volumetric measurement, so if you’re looking down at a conveyor belt, we can measure the volume of the product traveling underneath it.

“Say you have a bin of plastic pellets being filled for an extrusion molder; you can have a laser scanner above the bin to get a complete picture of how full the bin is.

“We’ve also seen them used on automotive painting lines, so right before the car goes into the paint booth they can make sure a door, hood, or trunk isn’t open.”
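For the bin-filling case, the volume computation itself is straightforward: each range reading becomes a column of material standing above the bin floor. A minimal sketch, with the parameter names invented for the example:

```python
def bin_fill_volume(height_readings_m, cell_area_m2, bin_depth_m):
    """Estimate fill volume from a scanner looking down into a bin.

    height_readings_m: grid of distances from the scanner down to the
                       material surface, one reading per grid cell
    cell_area_m2:      ground area each reading covers
    bin_depth_m:       distance from the scanner to the empty bin floor

    Each cell contributes a column whose height is how far the surface
    sits above the floor; summing the columns gives the complete picture
    of how full the bin is.
    """
    return sum(max(bin_depth_m - d, 0.0) * cell_area_m2
               for row in height_readings_m for d in row)
```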

“We can monitor hours of operation, so we know when we’re reaching the sensor’s service life and can put out an alert.”

Equipped with sight, robots can pick and place objects in less-structured environments, but touch allows them to manipulate objects with greater precision and sensitivity.

“Without a force sensor or vision, robots are totally dependent on everything being located somewhere in space in a predictable and repeatable manner.”

Robot manufacturers and system integrators covet high-performance force sensors for their material removal, part fitting and assembly applications.

Force sensing, or tactile feedback, covers a wide spectrum of applications, even in space, where developments in six-axis force/torque sensing are helping robotic explorers on Mars.

This video takes you inside the ATI Six-Axis Force/Torque Sensor where silicon strain gages, a proprietary bonding method, and low-noise electronics ensure reliability.

Force Feedback in the Foundry

In this video an operator uses a force-feedback haptic controller in conjunction with a force/torque sensor to operate a high-payload industrial robot as a telemanipulator.

Other markets are being explored where allowing the operator to use the functionality of a robot while running it manually in real time will improve productivity and safety.

The haptic interface, along with the ability to access the functionality of a robot to move in a plane, allows the operator in the cab to “feel” the workpiece as the robot cuts and grinds.

The company is based in Helena, Alabama, and its system uses tactile and force feedback to allow manual control of industrial robots to cut and grind at virtually any angle, according to Chris Cooper, Vice President of Sales and Marketing.

He says other force sensors rely on significant movement of the transducer to perform a measurement, which means the sensor actually flexes.

“There’s a company that uses our sensor to manipulate all of these things inside the cabin of the car and make sure they operate to the manufacturer’s specs.”

This video shows sensor-equipped robots testing the force needed to move the louvers on the vents for a car’s heat and air conditioning system.

The automobile industry also uses robots and force sensors for seat testing, where wear and tear is a major concern.

Gore describes the nuclear fusion target as an assembly that “has several different components that are being put together in this automated system using two of our sensors.”

In the aerospace industry, demands for greater fuel efficiency and extended service life are requiring ever-tighter component tolerances.

In the quest to reduce emissions and increase fuel savings for airlines, OEMs are racing to improve aerodynamics on gas turbine engine blades by requiring tighter tolerances on leading and trailing edges, making manual blade profiling a thing of the past.

Developed by AVR Aerospace in Montreal, Canada, the automated system is geared to the Maintenance, Repair and Overhaul (MRO) market and uses a FANUC LR Mate robot coupled with an integrated force sensor to re-profile blades and vanes according to original part design.

The process achieves tolerances of plus or minus 37.5 microns, or 1.5 thousandths of an inch, a feat only possible with robotic automation.

The blades are supplied to AVR Aerospace’s robotic system with weld repair already complete to build up the surfaces of the leading and trailing edges.

Then the process uses a belt sander to blend the weld smooth with the parent material and re-profile the blade edges within tolerances.

“Not only is the shape random, but the curve gives it a complex geometry, so we need the force sensor to compensate in different directions depending on the wear and how we’re removing the material on the airfoil.

“The force sensor allows us to change the orientation and keep in contact with the material removal tool (belt sander).”

Armand notes that the system’s intelligence comes from its adaptive, closed-loop capabilities, which he attributes to software, designed by the company’s team of engineers, that combines data from the force sensor and the inspection system, in this case a Keyence laser scanner.

The vision system takes measurements of the part to determine the geometry and the thickness of the weld, and to compute the parameters to get the blade from its original state to the desired finished result.

Even if we change the path of the robot to fit with the actual geometry of the blade, the robot doesn’t go exactly where it’s commanded.

“If that’s not the case, then we’ll go for a second run to remove additional material to bring it closer to the desired tolerances.”
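That re-run decision closes the loop around inspection rather than around the robot’s commanded path. A schematic sketch of the cycle, with both callables as hypothetical stand-ins for the laser-scanner measurement and the force-controlled sanding pass:

```python
def profile_blade(measure_deviation_mm, grind_pass, tolerance_mm=0.0375,
                  max_passes=3):
    """Iterate measure-grind-inspect until the edge profile is in tolerance.

    measure_deviation_mm(): worst remaining deviation from the target
        airfoil geometry (hypothetical stand-in for the inspection system)
    grind_pass(dev):        one sanding pass sized to that deviation
        (hypothetical stand-in for the force-controlled removal step)

    Because the robot never lands exactly where it is commanded, the loop
    trusts only the measurement: re-inspect after every pass and stop only
    once the +/-37.5 micron tolerance is met.
    """
    for _ in range(max_passes):
        deviation = measure_deviation_mm()
        if abs(deviation) <= tolerance_mm:
            return True        # within tolerance
        grind_pass(deviation)  # remove material sized to the error
    return False               # flag for manual review
```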

AVR Aerospace first cut its teeth on new OEM blades and vanes, where the profiling process is slightly easier given they start with a known geometry and less variation.

“We always knew it was going to be a big step to get from where we were with the new parts to the MRO sector, but it was definitely a worthwhile investment because the market for MRO is huge.”

With the experience the company’s engineers have amassed in the aerospace industry for over 20 years, the systems integrator is excited by the significant prospects of this new frontier for robotic automation.

Primarily designed for explosive ordnance disposal (EOD), the system enables operators to feel remote objects through end-of-arm force sensing and a haptic controller.

“Within the job they’re doing they can make decisions and change strategies based on the forces they are measuring and the direction and magnitude of those forces, just as humans can.

“Whether it’s a Baxter, Sawyer, Universal, KUKA, or YuMi, they’re all equipped with one form or another of force or torque sensing that allows us to build robots that are safe in the presence of humans.”

“We have a robot here at Georgia Tech in Andrea Thomaz’s lab where they basically give it a recipe for something like spaghetti Bolognese and then a human and the robot collaborate on cooking that meal.”

If the human starts chopping the tomatoes, the robot is smart enough to realize that it should pick up the carrots and bring them to him, so when he’s done with the tomatoes, the robot can give him the carrots.
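That anticipation boils down to reasoning over the recipe’s remaining steps: whatever the human is busy with, stage the ingredients for the next unclaimed prep task. A toy illustration, with the recipe structure and names invented for the example rather than taken from the lab’s code:

```python
# Toy anticipatory hand-off: fetch ingredients for the next prep step
# the human has not started. Recipe structure invented for the example.

RECIPE_PREP = {               # prep step -> ingredient it needs
    "chop tomatoes": "tomatoes",
    "chop carrots": "carrots",
    "grate cheese": "cheese",
}

def robot_next_action(human_current_step, completed_steps):
    """Pick the robot's most helpful action given what the human is doing."""
    for step, ingredient in RECIPE_PREP.items():
        if step == human_current_step or step in completed_steps:
            continue
        return f"fetch {ingredient}"  # stage the next step's ingredient
    return "wait"

# While the human chops tomatoes, the robot stages the carrots:
print(robot_next_action("chop tomatoes", completed_steps=set()))  # fetch carrots
```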

This video shows the Simon humanoid robot learning how to do a task via a combination of speech recognition and lead-through teaching, where a Georgia Tech researcher moves the arms to the desired positions and augments the instruction with verbal commands.

This robotic chef sports two arms from Universal Robots, the movements of which were mapped from 3D camera recordings of a real chef preparing one of his acclaimed recipes.

Robot-sensor combos to detect electrical and magnetic fields, pressure fluctuations, humidity, chemicals and other environmental conditions are forging new frontiers for robotics in mining, construction, agriculture, marine exploration and many other industries ripe for intelligent automation.