AI News, Stanford's 'JediBot' Tries to Kill You With a Foam Sword

If there was one bad thing about those lightsaber-wielding robots from Yaskawa that we saw at ICRA, it was that you couldn't bust out your own lightsaber and jump in the middle of the fight.

Now a student project at Stanford has put these two brilliant ideas together and come up with 'JediBot,' a robot arm that will actually try to kill you with a foam sword:


Most people thought that my project was pretty cool (pro tip: if you build something cool, put lots of hi-res photos or CAD screenshots on your poster) and I'm pretty proud of my project this summer.

With only a year left of undergrad, and a lot more schedule freedom than in past years, I want to take more in-depth, grad-level classes so I can master (or simply learn in greater detail) some of the concepts I've encountered in undergrad.

Even though I was a lot more autonomous this summer, and a lot of the grad students were gone traveling at various times, I enjoy being part of this community of friendly, brilliant people.

I love the friendly banter about sports, news, or dry-adhesion limits when I arrive in the morning, and I love hanging out at lab barbecues and sharing company over a burger.

Capella shared some cool progress on the electrostatic gecko adhesives, but the real treat was Hao showing his gripper modeling progress with some beautiful MATLAB wire mesh plots of adhesion limits.

The plots I took were a bit messy, and I had to discard a lot of bad trials, so I've decided not to put them on my poster unless I get some cleaner-looking data tomorrow morning.

The new wrist allowed the gripper to come in at angles of around 30 degrees and adjust to grab a glass slide I was holding.

I found it interesting but unsurprising that the time of flight sensor reads different values off of different surfaces, so for our experiments we'll have to use a consistent surface.

In the afternoon, I set up two of the small free-flying air bearing pads and successfully perched a bunch of times on the granite table in Durand.

Using a 3.7 V LiPo, I was able to run the microprocessor, but suffered a huge voltage drop (down to ~1.5 V) once I plugged in the protoboard connected to the servo and ToF sensor.

Eventually I tried running the servo off of the motor controller Tinyshield, which would allow me to run the motors off a second battery and on a higher-current line.

I had a great, productive morning implementing the state machine and tuning the servo positions and proximity threshold for robust grasping and release.

I took a good half hour shaking out the cobwebs to make sure I wrote clean, decomposed code that represents the structure of my state machine.
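A minimal sketch of that kind of grasp/release state machine, assuming hypothetical servo angles and a hypothetical proximity threshold (the names and values below are illustrative, not the actual firmware):

```python
# Hypothetical gripper state machine; servo positions and the
# proximity threshold are illustrative values, not the real firmware's.
OPEN_POS, CLOSED_POS = 30, 110   # servo angles in degrees (assumed)
PROX_THRESHOLD = 50              # ToF reading in mm (assumed)

def step(state, tof_mm, release_requested):
    """Advance the gripper state machine one tick.

    Returns (next_state, servo_command).
    """
    if state == "WAITING":
        if tof_mm < PROX_THRESHOLD:   # object close enough to grab
            return "GRASPING", CLOSED_POS
        return "WAITING", OPEN_POS
    if state == "GRASPING":
        return "HOLDING", CLOSED_POS  # one tick to close, then hold
    if state == "HOLDING":
        if release_requested:
            return "RELEASING", OPEN_POS
        return "HOLDING", CLOSED_POS
    # RELEASING: open the gripper, then go back to waiting
    return "WAITING", OPEN_POS
```

Keeping each state's transition in its own branch like this is what makes the tuning step easy: the servo positions and threshold live in three named constants at the top.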

The combination of being outside in the sun, getting my hands dirty and covered in grease, and playing around with some interesting mechanical components was a welcome break from a thoroughly frustrating debugging process.

I then spent a while re-stringing the load tendon pulley, since balancing the tensions so there is an 'active,' a 'neutral,' and a 'released' state is crucial for the desired use.

I ran some final pull tests towards the end of the day, and the gripper was consistently pulling between 13.5 and 14.5 N, which means that my load-sharing mechanism works well, since each pair of adhesive pads can only hold a little over 7 N.
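The numbers above can be sanity-checked directly: a total pull well above one pair's limit implies both pairs are carrying load.

```python
# Quick sanity check on the pull-test numbers quoted above.
single_pair_limit_N = 7.0              # approximate per-pair adhesive limit
measured_pull_N = (13.5 + 14.5) / 2.0  # midpoint of the measured range

# Total pull above one pair's limit -> both pairs must be sharing load.
load_shared = measured_pull_N > single_pair_limit_N
share_per_pair_N = measured_pull_N / 2.0  # ~7 N per pair if split evenly
```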

There are a few details I can work out to make it function a bit more cleanly, so I can spend this otherwise idle time cleaning up some of the bugs.

In the afternoon, I met with Alina, an undergrad working in Marco Pavone's lab who is designing and assembling both the host free flier and the target free flier.

I confirmed that the gripper and its mount would fit the mounting plate and align with the target, so I turned my focus to improving the load-sharing mechanism within the gripper.

I had been having some issues with the load tendons slipping off the pulleys, so I tried cutting some large diameter washers from fiberglass, but these proved ineffective.

I then switched out the pulleys for some larger ones that barely fit underneath the top plate, and they proved to be a major upgrade over the smaller ones.

The larger pulleys better aligned the load tendon with the center of the adhesive tile pair, and the tight fit between the pulley's edge and the top plate makes it nearly impossible for the string to fall off the pulley.

The larger diameter also meant more surface contact, and once I removed a few washers to account for the wider pulley, the whole shaft had significantly lower rolling friction.

Lab meeting this morning was incredibly interesting, with French visitor Idriss Aberkane giving a fascinating talk about the micro- and macroeconomic implications of the knowledge economy.

He also specifically focused on why bio-inspired design is so valuable, because nature has had 'a four billion year R&D cycle' to develop things mankind still cannot.

I had a pretty difficult time tensioning all the necessary tendons so that it would work, but my preliminary tests at the end of the day showed clean grasping, releasing, and what felt like good load sharing.

I will have to devise a more complex test to actually prove load sharing is occurring, but the pulley system appears to be working smoothly and the gripper felt good in the various pull-off tests I ran.

They were using a pipe that was 8 cm in diameter, which is a bit small, so a larger ~10-12 cm pipe will provide an easier perching target.

Since this lower module also holds the quadrotor attachment point, I might also incorporate a compliant wrist to aid in perching on off-center targets.

I figured I could help Aaron take some more hexapedal gait data with my downtime, but they busted a motor early in the day so the robot was out of commission.

I realized that this year, we were fortunate to have a SURI with significant grilling experience (myself), and that cooking meat is somewhat difficult if you don't know what you're doing, so I added some basic grilling guidelines.

We marinated the chicken overnight in a combination of balsamic vinaigrette and barbecue sauce, which is the Kimes family go-to marinade because it's really cheap, requires little effort, and results in sweet, juicy chicken breast.

We were able to crank out another 15 burgers every 10 minutes or so by moving quickly on both grills, and the vast majority of people got served by 12:30, with a few people staying for seconds around 12:45 and a group who came late getting the last of the chicken around 12:55.

The last half hour of cooking was slow, since the coals were starting to cool down and we ran out of reinforcements, but I did a thorough job of making sure all meat that I served was cooked well.

I didn't even suffer any charred casualties from forgetting to flip a chicken breast on time, though I did singe a lot of the hair off my right hand attempting to 'encourage' some cold coals with lighter fluid.

Overall, we had exactly the right amount of meat, and we were able to serve everyone with only 5 cooked chicken breasts left over and about 10 frozen patties to donate to next week's group.

Although I was a bit concerned at the end when our coals started to cool down, I did a better job managing the cooking than we did last year, and the coal grills at Gibbons Grove are a definite upgrade over the PRL gas grills.

Since I was standing over a flaming grill for three hours straight, I have to give a major shout out to Bessie, Aaron, and Vanessa for setting up the tables, hauling supplies over from MERL kitchen, getting me extra lighter fluid when we ran out, and generally making sure the rest of the barbecue ran well.

We found that the robot was pretty skittish on the slick floor, but identified one particularly stable set of gait parameters and took a nice high-speed video of it.

We got 90 hamburger patties, 60 chicken breasts, 120 buns, and all the necessary burger toppings, as well as some drinks, chips, and cookies.

I plan on putting a comprehensive description in my blog post tomorrow after the BBQ, and updating the MERL BBQ page of our website with important takeaways.

It turns out the bluetooth dongle we were using didn't support Windows 10 (my laptop), so Aaron had to install Matlab on his Windows 8 partition, and Tyler on his Linux partition.

To help with this, I made quick, rough CAD models of the Tinyduino board and the time-of-flight rangefinder that Bessie was using, which I hope to employ as more robust sensing for when an object is grasped.

I struggled a bit with how to implement the actuated load sharing and still be able to pull a release tendon, but I think I found a way at the end of the day.

I will be helping Aaron and Tyler take some data for Alice's all-terrain gait experiments, but hopefully I will be able to finish CADing my new and improved load-sharing system tomorrow so I can order any necessary parts from McMaster and begin cutting/assembling.

Aaron started machining the updated brand later that night, and apparently the operation was still running when he came back early Saturday morning.

I talked with Matt about my light contact gripper prototypes, and we came to the conclusion that some of my geometric constraints are applying a peeling moment by not acting through the adhesive pads' center of pressure.

I'm going to use some simple load sharing between two tile pairs, and will be using the improved tile pair assemblies I made a week ago.

I read back through my blog and realized I hadn't documented my thought process over the past few days very well, so I will attempt to give a fairly complete description of my past three days' work on the low contact force gripper.

The Matlab program finds paired sets of four variables: Pad angle theta, string length L1, adhesive load angle alpha, and loaded string angle phi (all labeled above).

Because these constraints represent three different and interdependent inequalities, I've put all of these quantities in terms of the pad angle theta, which has fairly simple geometric constraints bounding it between 0 and 7.44 degrees (condition 1).

I chose to find the maximum allowable loading angle alpha for any geometry set, since these lie more in the adhesive's nominal angle range for normal gripping.

However, given my order of constraints, the maximum loading angle alpha for any pad angle theta will result in no depression angle phi (or a very small angle), which requires a very high tension, and thus a very high normal force to be applied.

This allows for a non-zero value of the depression angle phi, which is important for allowing the adhesives to align to the surface and also to reduce tension in the string L1.

The solution I have chosen to simulate in the WorkingModel screenshots above has the following values: theta = 7.0, alpha = 7.7, L1 = 0.485, phi = 5.87.
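The scan structure can be sketched as follows; the real geometric relations live in the original MATLAB script, so `alpha_max_for` and `phi_for` below are placeholder stand-ins that only illustrate the stacked-inequality search over theta:

```python
# Sketch of the stacked-inequality scan described above. The true
# geometric relations are in the original MATLAB script; the two
# helper functions below are PLACEHOLDERS that only show the structure.
THETA_MAX = 7.44                  # deg, condition 1 bound on pad angle

def alpha_max_for(theta):         # placeholder for condition 2
    return theta + 3.0

def phi_for(theta, alpha):        # placeholder for condition 3
    return max(0.0, alpha_max_for(theta) - alpha)

ALPHA_BACKOFF = 1.0               # back off from max alpha so phi > 0
solutions = []
for i in range(100):
    theta = THETA_MAX * i / 99    # scan theta over [0, THETA_MAX]
    alpha = alpha_max_for(theta) - ALPHA_BACKOFF
    phi = phi_for(theta, alpha)   # nonzero phi lowers tension in L1
    if phi > 0.0:
        solutions.append((theta, alpha, phi))
```

The backoff from the maximum alpha is the key structural point: taking the maximum alpha would drive phi to zero, which requires very high string tension, so the scan deliberately leaves room for a nonzero depression angle.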

We finally got most of the way through the stainless steel stock, but broke our extra reach tool on the last pass because we had to remove one of the parallels to avoid cutting it with the tool.

I had a much better day prototyping my new gripper design, and did some further analysis to try to identify a specific geometry that will meet my design requirements.

Because the gripper works almost entirely due to the subtle geometry of small angles, I will need a high level of precision for further prototyping.

My afternoon analysis found that the gripper has a maximum resting angle of 7.44 degrees for the adhesives, and a maximum load tendon angle of 10.5 degrees.

I was also able to confirm that a solution fitting all of my geometric constraints exists, and I will begin making a more precise prototype tomorrow.

The constraints are a stacking sequence of inequalities, with the range of possible angles or lengths dependent on the previously calculated ranges.

My first prototype was promising, although the string anchors on the ends extended too far, so it only successfully grabbed things that were smaller than its overall width.

I'm nearly certain that remounting the load tendon on the wood backing plate is the cause of this, since it needs to be closer to the adhesive contact plane to load the adhesives in shear.

My idea involves modifying the standard opposed grip adhesive design by adding a small spring or elastic element behind the two pads.

The spring causes the adhesive pads to be tilted at a slight angle in their 'open' state, and adding a normal load will make them come flush to the surface.

The small angles mean that moving from the 'open' to 'closed' positions should require very little force, while the spring will tension the adhesives inward once full contact is made.

He was pretty nervous about machining the 304 stainless, but the tools had no problem getting through the material and the only real issue was that the 1/4" endmill was a bit noisy.

I had a pretty tough time this morning because we ran out of coffee at the house, but I ordered a strong Starbucks drink after lab meeting that took care of that.

I also had an idea for a completely passive, self-locking opposed gripper that would function similar to the collapsing truss mechanism, but for a much lighter contact force range.

I read a whole chapter of Elliott's thesis today on design and performance of opposed grippers, and skimmed through some other related BDML papers from the last few years.

I decided that increasing the normal load held by the gripper would require using multiple pad pairs, so I will need to devise some sort of load-sharing mechanism to add adhesive pad pairs efficiently.

My afternoon was mostly spent doing load-sharing research, and I read the relevant sections of Hao's space gripper paper about four times, then discussed the matter with Arul to clarify some things.

Tomorrow morning I'm going to write a document for UPenn about how to use/maintain the curved surface gripper I sent them, so that should give me some momentum going into the weekend.

She spent about an hour introducing some of her lab's work in the field of soft robotics, particularly in using a liquid Gallium-Indium mixture to create stretchable, flexible strain sensors.

She had some really cool manufacturing techniques that involved inkjet printing a layer of liquid metal spheres, then breaking the hard oxide shell of spheres in the desired conductive pattern to make very intricate geometries.

After doing research, we settled on 304 stainless, which is food safe and will last longer than carbon steel, although it will be more difficult to machine.

I observed him and Arul discussing the feeds and speeds for the individual machining operations, and I picked up a good deal of machining theory, which I find fascinating.

I spent the afternoon playing around with some new gripper designs in an attempt to reduce the contact force required to engage the adhesives.

I was somewhat successful, and managed to create a gripper assembly that would grip under its own weight, but only under ideal conditions and immediately after cleaning.

I discussed the performance of opposed grippers with Arul for a while, and realized that to get more performance out of the gripper, I needed a better gripper pad assembly.

Some quick tests revealed that the demo gripper used for lab tours (which has very worn adhesive pads) performed better than the gripper pad assembly I was using.

They made pretty solid hamburgers, the new location at Gibbons Grove is an improvement over last year, and the charcoal grills are an upgrade over the old propane grills from the PRL courtyard.

Preliminary tests using the manual servo driver were very promising, so I turned my focus to integrating a Tinyduino and making the gripper actuate automatically.

I was able to upload and run one of the simple sample programs, and will begin testing and wiring the limit switches and motor tomorrow.

I was able to suspend three kilograms from the bottom of the gripper while it was attached to my Nalgene, so I am thoroughly convinced the gripper will have no problem with the 500 g quadrotor at UPenn.

I spent the afternoon designing and assembling the first wooden prototype of the Astrobee gripper, using a lasercut plywood construction that matches Matt's Astrobee mock up.

Eventually, I laser cut some acrylic pieces and attached them to the carbon structure using 3/4" standoffs and some 4-40 screws.

The whole assembly feels remarkably sturdy, and my initial pull-off tests after depressing the limit switches by hand were very promising, with no adhesive failures occurring if both switches were depressed when I first pulled the load tendon.

I spent a lot of time familiarizing myself with OnShape's mating system as I attempted to get the gripper to behave as close to real life as possible (without flexible parts).

I used limits on the rotational mates and gearing relations to give the gripper a fairly realistic range of motion.

I spent most of today (Friday) performing pull tests on the adhesive to find the minimum force necessary to create adhesion with one of the preexisting gripper pads.

The added rubber bands prevent the gripper from rotating relative to the base plate, and act as a mild rebound spring.

One unanticipated effect is that the long carbon rod flexes a small amount near the edge of its travel, which should alleviate strain on the servo.

I got very consistent triggers and releases with the final foam pieces, and with freshly cleaned adhesives I was able to grip a water bottle so firmly that I could not pull it apart with my bare hands without breaking the large carbon fiber components.

About ten minutes after coming to this conclusion, Justin replied to my email from Friday, and his design requirements confirmed that a rebound spring was unnecessary.

I wrote a long email to Justin at UPenn this morning asking for some updated/more detailed specs for the gripper, so when he gets back to me I can tune the gripper.

The new plates are far stronger, and allow for a small amount of bending along the bottom piece of fiberglass, which keeps the adhesive film taut.

I also think the 10 N spring is too stiff for the ~500 g quadrotor, so I'll also downgrade to a spring with about 5 N of force, which should allow for a nice gentle extension when loaded.
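A quick back-of-envelope check supports that choice (using standard gravity, 9.81 m/s^2):

```python
# Back-of-envelope check: quadrotor weight vs. spring preload.
g = 9.81                      # m/s^2, standard gravity
quad_mass_kg = 0.5            # ~500 g quadrotor
weight_N = quad_mass_kg * g   # ~4.9 N of hanging weight

# A 10 N spring exceeds the load, so it would barely extend;
# a ~5 N spring is matched to the load, giving gentle extension.
stiff_spring_extends = weight_N > 10.0   # False
soft_spring_matched = abs(weight_N - 5.0) < 0.2
```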

We went over basic requirements, such as quadrotor mass and payload capacity, and he agreed to send a CAD file of their quadrotor's attachment point.

We decided the curved surface gripper had to be sturdy, since in the past the UPenn team has had to ship flat surface grippers back to us for repairs.

The Adept was extremely heavy and had no hand hold spots, so after removing the control computers, it took us about 45 minutes to move it on to a forklift pallet outside.

After lunch, I talked with Hao about possible projects to work on, and it looks like I'll start by building a curved surface gripper for the UPenn guys to use in their VICOM system.

After pressing hard the last two weeks before the 9/15 ICRA deadline, Morgan decided to call off submitting, iterate on the robot, collect updated data, and polish the paper.

At the actual poster session, the BDML posters drew a fair amount of attention, and I got to present to and answer questions from four or five different people.

This means we cannot use these to avoid bad transfer, but if we gently press and slide a foot against the wall, we can detect whether transfer has occurred or not (and thus determine whether the pressure normal to the wall was sufficient for climbing).

After looking through a lot of accelerometer data taken during climbing, I found a characteristic shape in the x and z accelerations that could be used to predict good/bad step or load transfer.
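A hypothetical threshold detector for that kind of signature might look like the sketch below; the window format and limits are assumptions, since the real characteristic shape was identified by eye in the logged data:

```python
# Hypothetical classifier for the good/bad-step signature in x/z
# accelerations. Window format and thresholds are illustrative only.
def step_succeeded(ax_window, az_window, x_limit=2.0, z_drop=5.0):
    """Classify one load-transfer window of accelerometer samples (m/s^2).

    A failed step is assumed to show a large transient in x and a
    sharp drop in z (free fall) rather than the small, bounded wiggle
    of a good step.
    """
    if max(abs(a) for a in ax_window) > x_limit:
        return False              # large lateral transient -> bad step
    if min(az_window) < -z_drop:
        return False              # free-fall signature -> lost grip
    return True
```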

Without any on-site re-tuning, it scaled the wall remarkably well, and gave every indication that the clock tower stucco is a much friendlier surface than the bricks we've been climbing in the lab.

Morgan made an awesome modification to the firmware last night that allows Crazyflie to detect when climbing has failed and fire all of the rotors in an attempt to re-perch.

I pulled it together and got through the day by drinking an amount of caffeine that can only be described as 'medically alarming,' but having that project behind me is going to allow me to focus solely on perch/climbing stuff next week.

After making some minor mechanical tweaks and playing around with the servo speed limits today, Morgan and I spent most of the day trying to get past the twitching issue in the servos.

I had thought, since I programmed them with a 4.8 V supply, that Crazyflie's 3 V supply was not providing enough voltage to power them properly, but after reprogramming them at 3 V I confirmed both that a) the servos are fully functional at a low speed limit and b) 3 V is sufficient to power them at that speed limit.

During our perching meeting at lunch, Morgan and I discussed paper-izing the perch-climbing mechanism, and it looks like he'll be helping me do some experiments and such to make that happen before the approaching deadlines.

I had a very productive day of prototyping, and finished with a very mechanically complete prototype. In the morning, it became evident that my previous prototype was both too heavy and suffered from some range-of-motion issues.

I started a rebuild by moving Crazyflie's battery underneath the board. This meant I was able to fit a frame through the space normally occupied by the battery, which brought all of the mechanisms closer to the board.

This means that the robot's center of mass is closer to the wall while climbing, which reduces the peeling moment and is advantageous for maintaining a perch.

The burgers kept creating grease fires on the hot grill, and the cooler grill took forever to make pork.

We had been using code posted on the Crazyflie Wiki for controlling a piezoelectric buzzer to play music. That code alternated high and low voltage PWM signals (to create a 6 V peak-to-peak instead of 3 V), so there was essentially an extra pin outputting the inverse of the PWM signal used to drive the first servo.

By adding a second parameter to the function call that adjusts the duty cycle, I was able to very cleanly add a second, independent PWM output.

I will probably use the smaller 1.5g servo that David mentioned a week or so ago, since adding a second strong 4g servo would make it difficult to remain under the 15g weight budget.

I do believe that this problem can be solved using a single degree of freedom and a very clever mechanism, but since the summer is winding down, I want to accomplish robust climbing before my time here ends, so adding a DOF is my best bet.

It has an appropriate range of motion and comes pretty close to creating a relative motion profile similar to the ideal relative motion profile I discussed earlier.

However, I made the mechanism out of acrylic instead of delrin since the acrylic cuts and glues better, and the resulting mechanism is relatively 'sticky,' with high sliding friction and frequent jamming.

In addition, kapton flexures play an important part in the mechanism, and these flexures have already begun to wear out and jam.

I will probably experiment with creating a delrin version tomorrow, but since delrin melts slightly when cut, I am skeptical that I will be able to achieve the precision necessary to effectively manufacture the part I need.

My goal today was to finish modifying and improving the lever mechanism enough to convince myself that using a lever or cam was possible, and although what I had by the end of the day did not climb, I was able to get enough range of motion from it that I will continue pursuing this strategy.

My other choice would be to simply add a smaller, weaker servo as a second DOF, but this would require probably a full day or two of firmware hacking.

He suggested that I can actuate both feet with some sort of compound mechanism, and as long as the motion of one relative to the other fulfills the ideal, then the climbing mechanism should work.

It took me pretty much all day to figure out specifically how to assemble and attach the parts so that the lever would not interfere with the track sliders, but it eventually came together.

I finished assembling the slider track I had laser cut yesterday, and then realized I would almost certainly have to CAD and cut the track and the lever together in order to get the level of precision required.

I should finish assembly tomorrow morning, and from there I will be able to get some reading on whether using a lever for the bottom foot is a viable solution.

Towards the end of the day, I used the tiny 401 tube to assemble my laser cut parts, and added a thin kapton flexure to mitigate the 'snapping' effect I had noticed at the bottom of the track.

I also think I can save some weight on the foot track mechanism by slightly re-building, and depending on how tests this afternoon go, will possibly just rebuild the entire mechanism with weight saving in mind (but preserving the same foot profile).

The first issue resulted from snapping the foot between the two tracks, which caused a vibration similar to that caused by the inchworm gait earlier this summer.

When Crazyflie was able to hold on through this vibration (or catch itself), it would lose contact during the second load transfer, when the top foot is pushed diagonally up and away from the wall.

Since the direction of travel is similar to the hook's incident angle to the wall, the only force holding the hook against the wall is the small static friction between the hook tip and the wall material.

Also, since the lower foot is mounted rigidly to Crazyflie's chassis, moving it into the wall requires overcoming a much larger inertia than moving the top hook away from the wall.

The high speed footage of my tests showed that when the perch failed during this upstroke, Crazyflie had nearly pure downward vertical acceleration; the fact that the upstrokes failed to affect the main chassis' motion at all reminded me of the classic 'inertial tablecloth' demo used by most physics teachers.

Matt suggested a small flexure that could act as a one way gate or valve in the track to mitigate snapping, and I proposed a less passive mechanism to press the bottom foot into the wall when the top foot begins its upstroke.

He liked my presentation slide from last week's lab meeting, and suggested I use delrin instead of acrylic for the linear slides, noting that we had some white delrin sheets lying around the lab.

After investigating, I realized that I had actually mistakenly been using delrin, which explains why it was not cutting as cleanly as I expected in the laser cutter (I had been using the '3mm acrylic' settings).

I finished assembling my slider mechanism, and realized that adding the slight angled tip at the edges of each stroke actually made the transfer between tracks worse.

I also added a small rod so that only the end of the slider can deflect, which creates a much larger force than allowing the whole length to gently bend.

Hopefully, changing the angle at which the sliding pin changes tracks should allow me to have a narrower overall track since I will require less deflection to snap the pin from one track to the next.

I've chosen some 0.020"-thick carbon beams that should sit flush with the 0.020" fiberglass, so I've tried to integrate them into the design of the slider back plates.

I'm pushing the 15g payload limit, as this mechanism currently weighs over 14 grams, but I have a few areas in mind that I can definitely downsize.

I will more than likely create a new track with a slightly different profile and smaller overall stroke length, which should reduce the length of the entire mechanism.

I spent a while sanding them down and hacking at them with an X-acto knife and got them to slide a bit better, and eventually tried a light coat of teflon lubricant.

Also, as I predicted last week, the x-direction travel was not enough to cause the foot to 'spring' between the downstroke side and recycling side, so I will create another version with more x-travel.

This increases the chance that the pin will jump out of the track, and also increases the sliding friction (along with the fact that fiberglass doesn't laser cut that cleanly).

I will make more tracks this afternoon with a fiberglass back, and a single layer of thicker acrylic to create the track profile.

I did create the track pieces out of acrylic and they work very well, but midway through the afternoon my laptop crashed and wouldn't start up again after many attempts and many hours.

I made some cool figures that I felt demonstrated some of the problems with our old climbing method and clearly depicted how the new mechanism is supposed to work, so I've included them below.

Each track unit from yesterday weighed in at 1.21 g, whereas these two track units combined weigh 1.11 g, since I included a large hole in the middle and reduced the overall size.

I'm also concerned that they do not have enough x-direction travel to cause the foot to 'spring' back and forth between the climbing and recycling tracks, so I might have to remake them with more x-travel so the carbon rod deflects enough.

I had to sand the edges of the cut fiberglass down to remove the ash, but after that the small carbon pin slid pretty smoothly through.

This is actuated by thin linear slides, and the deflection of the linear slides causes the foot to spring between the 'engaged' track while pulling Crazyflie up the wall and the 'disengaged' track while recycling, while Crazyflie is perched on its static foot.

I might investigate to see if the gearbox is damaged and possibly hack a servo for continuous rotation using Crazyflie, which could be useful for this project or future research endeavors.

In the morning, I brought Crazyflie all around campus and tried to find a suitable perching/climbing surface to film a video for the MAST report that is due soon.

After I was unable to find any truly great surfaces on Stanford's buildings, I decided to use the roofing tile that Hao had reinforced with superglue that was lying around the lab.

The tile had an abundance of asperities which made it ideal for perching, but no deep ones that would catch on feet as they slid upwards, which made it ideal for climbing as well.

It utilizes a pin and slide channel to create the exact foot trajectory that Morgan and I want, and since I will cut the channel with the laser, the foot trajectory will be highly tunable.

Using the high speed camera on my iPhone was really helpful in diagnosing failure and making minor adjustments in the perching maneuver, but my phone's battery died before I could upload these videos.

We also noted the importance of having a very compliant mechanism for perching while maintaining rigidity for climbing, so the next version might feature a fully rigid climbing/perching mechanism that is attached with some sort of foam or other springy material to the Crazyflie frame.

While this method is terribly energy inefficient, the rotor thrust towards the brick was enough to counteract the small vibrations that were causing the feet to lose their hold on the wall.

On each successful climbing step I observed in the high speed footage, the top foot either held on through these vibrations or fell but successfully regrasped lower on the brick.

My new linkage design will likely be based on replicating or emulating the foot trajectory used by spinybot, which pushed the foot into the wall to engage, pulled down in the plane of the wall, then recycled by pulling the foot away from the wall (preventing the non-active foot from interacting with the wall).

Since the engagement is associated with pulling downward, Morgan suggested I use more flexible rods for the top foot, so you can see in the images that I've chosen some truly tiny carbon rods.

Since the sensor was logging forces vs time, I did not establish a Force-Displacement curve, although they felt as if they had a non-linear, decreasing spring constant profile, which could make them useful for load sharing.

Hao and I discussed ways to reduce the mass for this type of gripper, and a relatively simple way would be to replace the thin, vertical 'sheets' of SDM materials (hard plastic and polyurethane) with kapton sheets.

I also observed some of the more intricate foot prototypes created using SDM, which allowed for combining rigid parts with an elastic material to create compliance in each individual foot.

I attempted to attach hooks to the kapton feet, but having stayed up pretty late Monday night, I struggled with the fine motor skills necessary to place the hooks onto such a small surface.

When the top feet were pressed in by hand, the servo also had a bit of a hard time pulling the robot's weight up, so I might use a shorter lever arm and less overall range of motion so that it will be able to climb if a clean grasp is accomplished.

My top priority moving forward will be creating feet that have a high success rate for gripping and resist curling over when slid in the anti-preferred direction.

The center of mass is located almost perfectly in the center (as determined by balancing the Crazyflie on my finger), and the robot weighs in at 38.19 g, well shy of the total Crazyflie + payload limit of 43 g.
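A quick back-of-the-envelope check of the remaining payload headroom, using the figures above:

```python
# Payload-budget check using the measured figures above.
robot_mass_g = 38.19   # measured all-up mass of robot + mechanism
limit_g = 43.0         # Crazyflie + payload limit
margin_g = round(limit_g - robot_mass_g, 2)
print(margin_g)  # 4.81 g of headroom left for feet/tail tweaks
```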

Initial tests placing it on the wall showed reasonable rates of perching success (for rough surface grippers) and climbing under its own power also looked reasonable.

I will now experimentally find the servo control values that correspond to the fully extended and retracted states for the climbing mechanism, and then write a simple climbing script to test climbing (assuming a successful perch).
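A minimal climbing script along those lines might look like the sketch below. Here `set_servo`, the control constants, and the dwell time are all placeholders, not the real Crazyflie servo API; the extended/retracted values are exactly what remains to be found experimentally:

```python
import time

# Placeholder servo commands -- replace with the experimentally determined
# values for the fully extended and retracted climbing states.
EXTENDED, RETRACTED = 160, 40

def set_servo(value, log=None):
    """Stub for the real servo command; here it only records what was sent."""
    if log is not None:
        log.append(value)

def climb(steps, dwell_s=0.5, log=None):
    """Run a simple extend/retract climbing gait, assuming a successful perch."""
    for _ in range(steps):
        set_servo(EXTENDED, log)    # reach the upper foot up the wall
        time.sleep(dwell_s)         # let the spines engage
        set_servo(RETRACTED, log)   # pull the body up to the foot
        time.sleep(dwell_s)
```

The dwell time would be tuned from the high-speed footage of how long a grasp takes to settle.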

The feet featured a similar linear design to the previous iterations, but the tail was connected by a slack tendon that is tensioned when the upper foot is fully extended.

Compared to the kevlar thread tendons I was using before, the kapton tendons also create a spring-like forward force in compression in addition to holding tension well.

To accomplish this, I will keep the feet close to each other while executing the inchworm gait and utilize a long carbon tail to counter the peeling moment.

I spent today constructing and testing the latest prototype, which featured a very low gripper below the center of mass and a very high, actuated one above it.

Because of my problems with rotor space yesterday, I spent the morning creating a sturdy, minimally invasive frame on top of the Crazyflie that sat right above the rotor plane.

Near the end of the day, I talked with Elliot about his inchworm-gait climber that utilized the gecko adhesive, and he revealed that he used a small spring that caused each foot to 'activate' when appropriate and suggested I try something similar, with a long tail.

I also spoke with Matt about the transfer of gripper between feet, and he said that using a passive mechanism such as a ramp to press the foot into the wall for activation would probably work.

The more compact design featured a shorter gait length and used the servo arm as a tail to press the upper foot into the wall, as shown in the pictures below.

The grippers were also located closer to the robot's center of mass, so the center of mass would be closer to the wall to create a reduced peeling moment.

Moving the grasping mechanisms closer to the center of mass caused everything to be uncomfortably close to the rotors, and introduced the need for a tail that would press the grippers into the wall.

Today was lab cleaning, so I spent the morning organizing one of the lab's central workbenches and the stockpile of materials (acrylic, fiberglass, foam etc) underneath it.

After looking at a spine-based gripper Hao had made, I attached kevlar tendons to the backs of the kapton 'fingers' and added a foam layer between the gripper and chassis to create some compliance.

After looking through the source code together for about an hour and looking at pin outputs on the oscilloscope, Morgan and I figured out how to command the frequency and pulse width and established an appropriate range for operating the small servo.
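For reference, a hobby servo typically expects a pulse of roughly 1000-2000 µs repeated at about 50 Hz, so the duty cycle to command follows directly. This is generic PWM arithmetic, not the exact register values we established:

```python
def pulse_to_duty(pulse_us, freq_hz=50):
    """Convert a servo pulse width in microseconds to a PWM duty cycle (0-1)."""
    period_us = 1_000_000 / freq_hz
    return pulse_us / period_us

# A 1500 us pulse (typical servo center) at 50 Hz is a 7.5% duty cycle.
print(pulse_to_duty(1500))  # 0.075
```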

Currently, the latching mechanism uses the servo to 'peel back' the top gecko adhesive pad, so this peeling propagates until the Crazyflie falls off and rights itself in flight.

Video Friday: Robot Sword Fights, MIT Basement Racing, and RoboGames

We propose a sword-fighting robot system controlled by a stereo high-speed vision system as an example of human-robot dynamic interaction systems.

The purpose of this study was to achieve kendama motion by estimating the object to be grasped based on a high-speed vision system and CoP tactile sensors.

Each primitive uses visual and force information, a physical model of a paper sheet for analyzing its deformation, and a machine learning method for predicting its future state.

I don't know what passes for “education” at MIT these days, because “learning stuff” now involves teaching autonomous R/C cars to race around basement hallways:

To perceive its motion and the local environment, the robot was outfitted with a heterogeneous set of sensors, including a scanning laser range finder, camera, inertial measurement unit, and visual odometer.

Students integrated existing software modules, such as drivers for reading sensor data, alongside their custom algorithms to rapidly compose a complete autonomous system.

[ Amazon Picking Challenge ] In the Netherlands, it’s traditional to roast the first asparagus of the season underneath the flaming carcass of the delivery drone that was supposed to drop it off at a fancy restaurant:

With the introduction of YuMi, the world’s first truly collaborative dual-arm industrial robot, ABB Robotics is once again pushing the boundaries of what robotic automation will look like in the future and how it will fundamentally alter the types of industrial processes that can be automated with robots.

This groundbreaking solution is the result of years of research and development, heralding a new era of robotic coworkers that are able to work side-by-side on the same tasks as humans while still ensuring the safety of those around it.

While YuMi was specifically designed to meet the flexible and agile production needs required in the consumer electronics industry, it has equal application in any small parts assembly environment thanks to its dual arms, flexible hands, universal parts feeding system, camera-based part location, lead-through programming, and state-of-the-art precise motion control.

Tech United Eindhoven is preparing for a robotic soccer competition in Portugal. This (very short) vid is worth watching just for the last little bit at the end, where a few humans take on the robots.

[ Tech United Eindhoven ] To celebrate the Opportunity Mars rover having traveled an entire marathon on the red planet (26.2 miles over 11ish years), JPL decided to host their own marathon, to be run in a slightly shorter amount of time. And you'll never guess who wins:

Applications include: compliant behavior for the whole robot, energy-based limit cycle controller for hand shaking, walking on flat ground, balancing on a rockerboard, and multi-contact whole-body control.

[ TEDx ] Last this week: UT Austin robotics professor Luis Sentis visited Georgia Tech a few weeks ago to give a talk on Humanoids of the Future. The video is a little, er, trippy, but if you love robotics enough, you won’t care:

Motion capture

In filmmaking and video game development, it refers to recording actions of human actors, and using that information to animate digital character models in 2D or 3D computer animation.[3][4][5] When it includes face and fingers or captures subtle expressions, it is often referred to as performance capture.[6] In many fields, motion capture is sometimes called motion tracking, but in filmmaking and games, motion tracking usually refers more to match moving.

Motion capture offers several advantages over traditional computer animation of a 3D model.

Video games often use motion capture to animate athletes, martial artists, and other in-game characters.[9][10] This has been done since the Sega Model 2 arcade game Virtua Fighter 2 in 1994.[11] By mid-1995 the use of motion capture in video game development had become commonplace, and developer/publisher Acclaim Entertainment had gone so far as to have its own in-house motion capture studio built into its headquarters.[10] Namco's 1995 arcade game Soul Edge used passive optical system markers for motion capture.[12]

Movies use motion capture for CG effects, in some cases replacing traditional cel animation, and for completely computer-generated creatures, such as Gollum, The Mummy, King Kong, Davy Jones from Pirates of the Caribbean, the Na'vi from the film Avatar, and Clu from Tron: Legacy.

This method streamed the actions of actor Andy Serkis into the computer generated skin of Gollum / Smeagol as it was being performed.[13] Out of the three nominees for the 2006 Academy Award for Best Animated Feature, two of the nominees (Monster House and the winner Happy Feet) used motion capture, and only Disney·Pixar's Cars was animated without motion capture.

Since 2001, motion capture has been used extensively to produce films which attempt to simulate or approximate the look of live-action cinema, with nearly photorealistic digital character models.

During the filming of James Cameron's Avatar, all of the scenes involving this process were directed in real time using Autodesk MotionBuilder software to render a screen image, which allowed the director and the actor to see what they would look like in the movie, making it easier to direct the movie as it would be seen by the viewer.

FaceRig software uses facial recognition technology from ULSee Inc. to map a player's facial expressions and the body tracking technology from Perception Neuron to map the body movement onto a 3D or 2D character's motion onscreen.[14][15] During the 2016 Game Developers Conference in San Francisco, Epic Games demonstrated full-body motion capture live in Unreal Engine.

Motion tracking or motion capture started as a photogrammetric analysis tool in biomechanics research in the 1970s and 1980s, and expanded into education, training, sports and recently computer animation for television, cinema, and video games as the technology matured.

Newer hybrid systems are combining inertial sensors with optical sensors to reduce occlusion, increase the number of users and improve the ability to track without having to manually clean up data[citation needed].

Unlike active marker systems and magnetic systems, passive systems do not require the user to wear wires or electronic equipment.[17] Instead, hundreds of rubber balls are attached with reflective tape, which needs to be replaced periodically.

This type of system can capture large numbers of markers at frame rates usually around 120 to 160 fps, although by lowering the resolution and tracking a smaller region of interest they can track as high as 10,000 fps.

The TV series Stargate SG1 produced episodes using an active optical system for the VFX allowing the actor to walk around props that would make motion capture difficult for other non-active optical systems.[citation needed] ILM used active markers in Van Helsing to allow capture of Dracula's flying brides on very large sets similar to Weta's use of active markers in Rise of the Planet of the Apes.

The power to each marker can be provided sequentially in phase with the capture system, providing a unique identification of each marker for a given capture frame, at a cost to the resultant frame rate.
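The frame-rate cost of this time-multiplexing is easy to quantify: with N markers lit one per camera frame, each marker is only seen on every Nth frame. The sketch below is illustrative arithmetic, not any specific vendor's scheme:

```python
def effective_marker_rate(camera_fps, num_markers):
    """Per-marker capture rate when markers are powered one per frame."""
    return camera_fps / num_markers

def marker_id(frame_index, num_markers):
    """Which marker is lit on a given frame of the sequential power cycle."""
    return frame_index % num_markers

# A 480 fps camera cycling through 4 markers sees each marker at 120 fps.
print(effective_marker_rate(480, 4))  # 120.0
```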

One of the earliest active marker systems in the 1980s was a hybrid passive-active mocap system with rotating mirrors and colored glass reflective markers, which used masked linear array detectors.

LEDs with onboard processing and a radio synchronization allow motion capture outdoors in direct sunlight, while capturing at 120 to 960 frames per second due to a high speed electronic shutter.

This higher accuracy and resolution requires more processing than passive technologies, but the additional processing is done at the camera to improve resolution via a subpixel or centroid processing, providing both high resolution and high speed.

The infrared underwater cameras come with a cyan light strobe instead of the typical IR light for minimum falloff under water, and the high-speed cameras come with an LED light or with the option of using image processing.

ESC Entertainment, a subsidiary of Warner Brothers Pictures created specially to enable virtual cinematography, including photorealistic digital look-alikes for filming The Matrix Reloaded and The Matrix Revolutions, used a technique called Universal Capture that utilized a seven-camera setup and tracked the optical flow of all pixels over the 2-D planes of the cameras for motion, gesture, and facial expression capture, leading to photorealistic results.

Optical tracking systems are also used to identify known spacecraft and space debris, although they have a disadvantage over radar in that the objects must reflect or emit sufficient light.[18] An optical tracking system typically consists of three subsystems: the optical imaging system, the mechanical tracking platform, and the tracking computer.

The dynamics of the mechanical tracking platform combined with the optical imaging system determines the tracking system's ability to keep the lock on a target that changes speed rapidly.

The tracking computer is responsible for capturing the images from the optical imaging system, analyzing the image to extract target position and controlling the mechanical tracking platform to follow the target.

Inertial motion capture systems capture the full six degrees of freedom body motion of a human in real-time and can give limited direction information if they include a magnetic bearing sensor, although these are much lower resolution and susceptible to electromagnetic noise.

Magnetic systems calculate position and orientation by the relative magnetic flux of three orthogonal coils on both the transmitter and each receiver.[21] The relative intensity of the voltage or current of the three coils allows these systems to calculate both range and orientation by meticulously mapping the tracking volume.
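The reason careful mapping matters is that a dipole transmitter's field magnitude falls off with the cube of distance, so range can in principle be recovered from the measured flux. The sketch below assumes an idealized dipole and ignores orientation and field distortion:

```python
def range_from_flux(flux, flux_at_1m):
    """Estimate transmitter-receiver range from measured flux magnitude.

    Assumes an ideal magnetic dipole, whose field magnitude falls off
    as 1/r^3; real systems must also calibrate out environmental
    distortion from metal and electrical sources.
    """
    return (flux_at_1m / flux) ** (1.0 / 3.0)

# Flux at 1/8 of the 1 m reference value corresponds to about 2 m of range.
print(range_from_flux(0.125, 1.0))
```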

For example, a marker is placed on the upper arm and another on the lower arm to determine elbow position and angle.[citation needed] The markers are not occluded by nonmetallic objects but are susceptible to magnetic and electrical interference from metal objects in the environment, like rebar (steel reinforcing bars in concrete) or wiring, which affect the magnetic field, and electrical sources such as monitors, lights, cables and computers.

The wiring from the sensors tends to preclude extreme performance movements.[21] With magnetic systems, it is possible to monitor the results of a motion capture session in real time.[21] The capture volumes for magnetic systems are dramatically smaller than they are for optical systems.

Most traditional motion capture hardware vendors provide for some type of low resolution facial capture utilizing anywhere from 32 to 300 markers with either an active or passive marker system.

High fidelity facial motion capture, also known as performance capture, is the next generation of fidelity and is utilized to record the more complex movements in a human face in order to capture higher degrees of emotion.

Facial capture is currently arranging itself in several distinct camps, including traditional motion capture data, blend shaped based solutions, capturing the actual topology of an actor's face, and proprietary systems.

The two main techniques are stationary systems, with an array of cameras capturing the facial expressions from multiple angles and using software such as the stereo mesh solver from OpenCV to create a 3D surface mesh, and light arrays, used to calculate the surface normals from the variance in brightness as the light source, camera position, or both are changed.

The speed of light is 30 centimeters per nanosecond (billionth of a second), so a 10 gigahertz (billion cycles per second) RF signal enables an accuracy of about 3 centimeters.
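That figure can be checked with plain wavelength arithmetic (one RF cycle's worth of light travel, not any specific system's spec):

```python
C_CM_PER_S = 3e10  # speed of light: ~30 cm per nanosecond

def cycle_length_cm(freq_hz):
    """Distance light travels in one RF cycle -- the ~1-cycle ranging accuracy."""
    return C_CM_PER_S / freq_hz

print(cycle_length_cm(10e9))  # 3.0 cm for a 10 GHz signal
```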

An alternative approach was developed where the actor is given an unlimited walking area through the use of a rotating sphere, similar to a hamster ball, which contains internal sensors recording the angular movements, removing the need for external cameras and other equipment.

Specifically, his research focuses on engineering humanoid robot systems able to predict, act and learn from human demonstration and sensorimotor experience.

The talk will address the topic of mediated embodiment in virtual reality and humanoid robots and its consequences on transforming people’s attitudes and behaviors.

In the current state of the art, embodiment in artificial and digital bodies is mostly studied in the form of avatars in virtual reality.

In her current project, Dr. Aymerich-Franch applies theories and approaches from this field to study similar patterns in the field of humanoid robots.

In connection to this, the presenter will also describe the possibilities of using virtual reality and robots as methodological tools to explore social and psychological phenomena.

She will present the most interesting findings from her work at the different labs she has worked at, and will use her studies as example cases of how communication technologies can be used by researchers as innovative methods to explore phenomena that were not possible to study before the advancements of these new technologies.

His research interest is in three-dimensional multi-body modelling of the human musculoskeletal system applied to joint pathologies, postural and gait impairments.

Abstract: In this talk, we will describe an automated rehabilitation system for on-line measurement and analysis of human movement that can be used to provide real-time feedback to patients during the performance of rehabilitation exercises. The system consists of wearable IMU sensors attached to the patient’s limbs. The IMU data is processed to estimate joint positions. We will describe an approach to improve the accuracy of pose estimation via on-line learning of the dynamic model of the movement, customized to each patient. Next, the pose data is segmented into exercise segments, using a novel approach based on two-class segmentation.

The pose and segmentation data is visualized in a user interface, allowing the patient to simultaneously view their own movement overlaid with an animation of the ideal movement. We will present results demonstrating the significant benefits of feedback based on a user study with patients undergoing rehabilitation following hip and knee replacement surgery.
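One common baseline for turning raw IMU data into a joint-angle estimate is a complementary filter; the single-axis sketch below is a generic illustration, not the customized learned dynamic model the abstract describes:

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyroscope rates (rad/s) with accelerometer angle estimates (rad).

    The integrated gyro term tracks fast motion; the accelerometer angle
    corrects slow drift. alpha sets how much the gyro path is trusted per
    step. Generic single-axis sketch, not the paper's method.
    """
    angle = accel_angles[0]
    estimates = [angle]
    for rate, acc in zip(gyro_rates, accel_angles[1:]):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        estimates.append(angle)
    return estimates

# With zero gyro rate and a constant accelerometer reading of 0.5 rad,
# the estimate rises from 0 toward 0.5 as the accelerometer term accumulates.
est = complementary_filter([0.0] * 200, [0.0] + [0.5] * 200, dt=0.01)
```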