We're at IROS 2017 to Bring You the Most Exciting Robotics Research From Around the World
- On Tuesday, February 13, 2018
There are over 1,200 papers being presented this year, meaning that we have a packed schedule over the next three days, trying (as we always do) to attend over a hundred talks plus posters and competitions and keynotes and plenaries and forums and the expo even though it’s physically impossible to do everything.
We’ll happily die trying, though, and if there are specific things you’re interested in (check out the conference website to see what’s going on), leave a comment or ping us at @BotJunkie and @AutomatonBlog on Twitter and we’ll do our absolute best for you.
Many other lab members pitched in to gather supplies, create signage, and arrange technical tours for the conference as well.
We went out to breakfast this morning to celebrate all the lab members graduating this May. Everyone agreed that Sabrina's Cafe in University City was the perfect restaurant for such an outing;
This is a $10,000 tuition credit to recognize his excellent performance in classes here at Penn and to enable him to do research.
Many thanks to Rebecca and everyone else for helping put together great media like this.
As you can see in the photos above, many of our lab members attended the reception to celebrate this great accomplishment.
Their tutorial included hands-on activities with chopsticks and texture samples, plus three hands-on demonstrations of our texture rendering technology.
Alex gave a hands-on demonstration of our modular tactile motion guidance system on all days of the conference.
We took a photo with all past, present, and future lab members (not counting the dinosaur).
Below you can see Heather showing her texture rendering demo to a group of girls, alongside Rebecca and Liz showing their demo to a conference attendee.
She was one of just 24 students selected from 66 applicants to participate in activities that will thoroughly introduce her to research in human-robot interaction.
The article is teased on the front cover (bottom left corner), mentioned on page 3 with the title 'Introducing Touch to the Computer World', and mentioned on page 4 in relation to the theme of the elderly.
Vijay Kumar, Dan Lee, and Katherine Kuchenbecker participated in a live radio show today from 10 to 11 a.m. Eastern, as you can see in the photo below:
We received 158 submissions (an 8% increase from 2012): 140 papers and 18 extended abstracts.
Another 67 paper and abstract submissions were accepted for poster presentation, for an overall acceptance rate of 68%.
Decisions for oral versus poster acceptance were based on quality, impact and audience interest in the research, as well as quality and appropriate length of the submission itself.
As you flex your elbows and knees, the software generates vibration signals built from recordings of a real robot, which are played as sounds and as tactile vibrations.
The emerging technologies demo was a game where the user played a robot and fought an evil red dragon.
This paper was on the research he did while he visited our lab a year ago, which lets a user experience how it might feel to be a robot.
The main project they focused on was our work on enabling Graspy, our PR2 humanoid, to learn the meaning of various haptic adjectives;
We found a lot of peculiar items and old projects, and we enjoyed putting them in their place (sometimes the garbage can).
Lab members gathered after the main ceremony to commemorate the occasion by taking the above photo.
The paper is entitled 'Using robotic exploratory procedures to learn the meaning of haptic adjectives,' and the authors are Vivian Chu, Ian McMahon, Lorenzo Riano, Craig G.
Professor Kuchenbecker and lab member Ted Gomez worked with them over the last year to create a tool that helps surgeons safely use monopolar electrocautery during laparoscopic surgery.
As you can see in the photo below, the first station was a large television playing the videos and vibrations recorded during a variety of surgical practice tasks as well as real robotic surgery, and the second station was our da Vinci Standard equipped with VerroTouch.
We also had posters summarizing many of our recent research results and plenty of lab members on hand to explain the technology and answer questions.
Every person we spoke to was impressed with the quality of the feedback and excited to see this get integrated into actual clinical systems in the future.
Many novice and experienced robotic surgeons got to try it, as did some representatives from Intuitive Surgical - one of them walked away saying it was FABULOUS.
Professor Kuchenbecker also gave a talk about our technology on Friday afternoon in a session on the future of surgery and got a very positive response.
The following speaker, the esteemed Dr. Jacques Marescaux, even gave us a shout-out in his talk, saying he tried our demo and it was PERFECT.
The team managed to prep our whole booth in record time and even secured IRB approval for the study we ran.
Even worse, the da Vinci was severely damaged during shipping or unloading, with the green arm rendered completely dysfunctional and the system unable to boot.
Luckily, kind individuals at Intuitive Surgical arranged to have the robot fixed on the spot by obtaining and installing a new robot arm for free so we could run our demo on the second and third days of the conference.
It was especially satisfying to hear people's reactions when they felt how good our virtual textures are!
Ben Goodman was a co-author on the demo because he was the one who got the haptic camera working, and Joe Romano is an honorary co-author because he wrote all the original software for the TexturePad.
It wasn't selected for the award, but it was still a great honor to be chosen as a finalist for this highly prestigious award.
We heard the wonderful news that three of our lab's current/past members just won NSF graduate research fellowships!
Winning an NSF fellowship is an amazing honor that gives you three years of funding to explore the research that most interests you during graduate school.
Lab members surprised KJK by hanging a lovely sign on her office door, papering it, and having everyone sign it to convey their congratulations.
This is an excellent accomplishment, especially as only 32 total papers are given talks (the rest are presented as posters).
This year's challenge was called 'The Senior Solution,' aimed at getting the kids to think of ways to use technology to better the lives of older adults.
The Posies invented the TeleTouch, a LEGO device that allows two people to pat each other's hand during chat, adding the missing dimension of touch to digital person-to-person interaction.
They demonstrated their invention and performed their TeleTouch skit, which included many facts about the health and emotional benefits of human touch (and included dancing to 'I want to hold your hand').
We showed them our Intuitive da Vinci robot with VerroTouch vibrotactile haptic feedback, and they all got to try VerroTeach, our dental training system.
She had to leave shortly thereafter to keep collecting data in her current study, but that didn't dull the sweetness of her victory.
All PhD students in Mechanical Engineering and Applied Mechanics are required to give a seminar like this, to let them practice their public speaking skills and to tell the Penn research community about the cool work they are doing.
Will's presentation covered the development and testing of systems that measure and feed back tactile vibration signals as an indicator of tool vibrations, both on the bench-top and in the real operating room.
The title is 'In vivo validation of a system for haptic feedback of tool vibrations in robotic surgery,' and the authors are Karlin Bark, William McMahan, Austin Remington, Jamie Gewirtz, Alexei Wedmid, David I.
This paper was a long time in the making, and it includes a beautiful supplementary video showing video and tool vibration signals from the in vivo experiment.
Excitingly, Joe's haptic texture rendering paper is included and even featured on the cover, as you can see in the photo at right.
The reported research formed a substantial chunk of Joe's doctoral dissertation, and it is our lab's second article in this great journal.
Our lab had a barbecue today at Professor Kuchenbecker's house to celebrate all the lab member graduations happening this May. Joe is earning his doctorate (the first one from our lab), and Charlotte, Craig, and Frank are all finishing their undergrad degrees.
It was fascinating to learn about the work Professor Taylor and his team have been doing on robotic systems to enable surgeons to operate on the back of the retina in the human eye!
The video shows Professor Kuchenbecker talking about the project and the potential uses for this technology - you can also see Heather's hand using the TexturePad, Robert demonstrating VerroTeach, Ben showing the haptic camera, and a bunch of lab members during lab meeting.
Even cooler is that Penn is currently featuring this video as a multimedia news item on the main university homepage!
He referenced our EuroHaptics 2010 paper, showed images of VerroTouch from it, and said it was one of his favorite examples of haptics in robotic surgery: 'you can actually feel it when two tools hit one another!'
VerroTouch also got a shout-out at the IMSH conference Karlin attended last week, where it was mentioned by Dr. Bosseau Murray, the anesthesiologist collaborating with us on haptography for medical training and simulation.
This means we are going to receive funding to host eight undergraduate researchers in GRASP each summer for the next three years.
Today Joe and Kyle led a PR2 training workshop for other folks in our group who will be working with this great robot as part of our lab's participation in the DARPA BOLT Activity E project we are on with UC Berkeley.
Many thanks to Joe and Kyle for taking the time to transfer their insights and tips to newer members of the group.
And big congratulations to Kyle for taking a new job as a robotics engineer at Barrett Technology, Inc., the maker of the WAM robot arm and the Barrett hand.
He set up his low-hanging obstacle course in the Towne basement, and Suzanne practiced navigating through it with the vibration alerts delivered by the HALO cane attachment.
Suzanne had a great time learning how to use this new sensory feedback, and she had some good suggestions on how to make the next prototype even better.
As you can see in the photo at right, he taught us all the steps of how to turn on the robot, install the camera, install tools, and shut it down.
We are working with QinetiQ to add haptic feedback to the control of mobile manipulator robots in field applications.
The best part of the visit was testing our current prototype in the QinetiQ robot proving grounds, which includes a wide variety of ground surfaces and obstacles to really test the operator's capabilities during teleoperation.
As you can see at right, our whole VerroTouch team met together today to lay out our strategy for the next nine months of research.
This project is generously funded by a Coulter Foundation Translational Research Award, and Coulter's goal for us is to start a company or license our technology to one or more companies.
He presented a wide range of work that his group has done on using instrumented haptic interfaces and custom rendering algorithms to modify how an object feels as you touch it with a tool.
For example, they can make a soft sponge feel very stiff, and they can make a tennis ball feel soft like a sponge.
To help with our research on robotic surgery, we are the proud new owners of an Intuitive da Vinci Standard robotic surgery system!
She started by asking the audience to think of a tricky manual task they had completed recently, then emphasized the importance of the sense of touch in all the manipulation tasks humans excel at.
She then showed a slide of all the major projects our lab has worked on over the last four years (left image above).
Given the strict five-minute time limit, she chose to highlight three specific projects: Haptography, VerroTouch, and Tactile Grasping.
She ended by briefly acknowledging all the great students in the Penn Haptics Lab and showing their photos (right image above)!
Alex takes inspiration from child development psychology to enable robots to learn to perform tasks correctly, rather than carefully pre-programming the robot for each individual task.
This research cuts across many areas relevant to GRASP, including visual perception, machine learning, artificial intelligence, manipulation, haptic perception, and more.
His talk was especially great because it included many videos of his own young son exploring and learning to manipulate a huge variety of objects in the world.
Applying the developmental approach to robotics certainly seems like a promising way to move us closer to having a robot in every home.
It shows lots of the cool demos present at the conference, and the star of the show is our very own Joe Romano and our PR2 demo!
The video shows the robot giving hugs, handing out business cards, and giving many high-fives and fist-bumps.
Professor Kuchenbecker titled her talk 'Touching Reality,' and she gave an overview of the haptics research going on in our lab, with details on VerroTouch, Haptography, StrokeSleeve, and Tactile Grasping.
The Urology Times just published a nice news article summarizing our group's recent work on VerroTouch, which adds vibrotactile and audio feedback of tool vibrations to robotic surgery systems.
The main finding from our large study was that surgeons preferred to have this type of feedback available when doing certain manipulation tasks.
Starting at 2:23, the video shows our demo shaking hands, giving out a business card, and then doing high-fives and a fist-bump.
They show the PR2 taking a ball, giving out a business card, and calibrating the Kinect with a new user.
Please join me in congratulating everyone for our lab's monumental recent accomplishment - six conference papers submitted in the last week!
All of the authors listed above worked very hard on these papers and all the research that preceded the writing process.
I also appreciate the work that others in the lab put in to help with these projects and papers, by talking with the authors, being a subject in a study, taking photographs of hardware, or helping edit the text.
Karlin wins a $60,000 grant to support her research on low-cost rehabilitation systems for stroke patients.
As you can see in the above left image, a lot of people attended the seminar, and they asked some good questions at the end.
KJK ordered meat and side dishes from Baby Blues Barbecue, and many people brought drinks, appetizers, salads, and desserts.
It was a great way to wrap up a good summer of research before everyone sits down to write their Haptics Symposium or ICRA paper!
The writer interviewed Professor Kuchenbecker on the role that design and innovation play in our lab's work on haptic interfaces.
They also discussed how the recent PopTech Science Fellows training program that KJK participated in helped facilitate collaboration and idea exchange between young innovators from a wide range of areas.
In a lovely event at the Sheraton Hotel, Charlotte Rivera showed a poster and gave an oral presentation about her ten-week research project in the haptics lab, entitled 'Vibrotactile and Auditory Feedback for Robotic Minimally Invasive Surgery.'
She did a fantastic job and won first prize in the oral presentation category, which comes with a $100 award.
We started off with group meeting from 10:30am to noon, where we celebrated KJK's upcoming birthday with cake and singing.
We also had a bunch of announcements plus heated discussions about what the lab computer password ought to be.
Some highlights were cleaning out and defrosting the fridge, cleaning keyboards, sorting surface-mount electronic components, and finding random items that might be potentially valuable.
We also talked about how cell phones vibrate and how to use haptics to make movies and games more immersive.
Here's a portrait of this group of visitors along with their chaperones and the haptics lab students who helped run the tour.
The second tour was for the twenty-five participants in the Penn IRCS Undergraduate Summer Workshop on Cognitive Science and Cognitive Neuroscience, plus some of the students who have recently joined our research lab.
Professor Kuchenbecker gave a 90-minute presentation on the sense of touch, with some quick hands-on experiments, and then the group came over to engineering to see a bunch of demos.
The title of the paper is 'Tool contact acceleration feedback for telerobotic surgery,' and the full author list is William McMahan, Jamie Gewirtz, Dorsey Standish, Paul Martin, Jacquelyn Kunkel, Magalie Lilavois, Alexei Wedmid, David I.
Our poster was selected as the best poster in the session, so it's now located in a large room with all the other best posters.
Many students who have worked part-time in the lab and/or taken the haptics class graduated with a bachelor's and/or a master's degree.
The talk was well received, with interesting questions about how the approach could be adapted to other robotic platforms beyond the PR2.
This is a great honor for the often invisible duty of recruiting reviewers and making acceptance/rejection recommendations for papers.
The L'Oréal USA Fellowships for Women in Science program is a national awards program that annually recognizes and rewards five U.S.-based women researchers at the beginning of their scientific careers.
The top candidates chosen by the review panel are then forwarded for final selection to a distinguished Jury of career scientists and former L'Oréal-UNESCO For Women in Science Laureates.
The Jury seeks candidates with exceptional academic records and intellectual merit, clearly articulated research proposals with the potential for scientific advancement, and outstanding letters of recommendation from advisers.
He tried out all the demos, interviewed the students in the class, and talked with Professor Kuchenbecker about the field of haptics and other work going on in our lab.
All twenty-eight students in KJK's MEAM 625 class on haptic interfaces presented hands-on demonstrations of their course projects.
We were happy to be visited by several members of the local press, lots of folks from the Penn community, and thirty young members of local First Lego League teams with their parents.
Great work by all the haptics class students and everyone who helped make this event such a roaring success!
As you can see in the photo at right, Günter gave a GRASP seminar entitled 'Model-Mediated Telerobotics', where he presented compelling arguments against just sending positions and forces over the communication channel between master and slave robots.
Instead, he argued that the slave should fit a simple model to the contact it is experiencing and send the parameters of that model, so that the local master controller can render interactions with the model at a high rate.
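The idea above can be sketched in code. This is a minimal illustration of the model-mediated concept, with hypothetical function names and a deliberately simple linear contact model (not Günter's actual formulation): the slave fits a stiffness k and contact location x0 to its recent force/position samples and transmits only those two parameters, and the master renders forces against that local model at a high servo rate, independent of channel delay.

```python
import numpy as np

def fit_contact_model(positions, forces):
    """Slave side: least-squares fit of F = k * (x - x0) to recent samples."""
    # Solve forces = k * positions - (k * x0) for [k, k*x0]
    A = np.column_stack([positions, -np.ones_like(positions)])
    k, k_x0 = np.linalg.lstsq(A, forces, rcond=None)[0]
    return k, k_x0 / k  # stiffness (N/m), estimated contact position (m)

def render_force(master_pos, k, x0):
    """Master side: high-rate local rendering against the received model."""
    penetration = master_pos - x0
    return k * penetration if penetration > 0 else 0.0

# Example: the slave presses into a 500 N/m wall located at x0 = 0.02 m
x = np.linspace(0.02, 0.03, 50)
f = 500.0 * (x - 0.02)
k, x0 = fit_contact_model(x, f)          # only (k, x0) cross the channel
force = render_force(0.025, k, x0)       # master is 5 mm into the wall
```

Because only two scalars cross the channel per update, the master's haptic loop can run at kilohertz rates even when the network update rate is far lower.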
There was great attendance (including our whole research lab and everyone from MEAM 625), and there were many good questions during and after the talk.
After applauding the speaker, we adjourned to a delicious naked burrito bar from Qdoba Catering (a much appreciated change from the standard pizza lunch).
She played on the Stanford women's varsity volleyball team as an undergrad, winning Division 1 NCAA championships in 1996 and 1997 and Pac-10 league championships in 1996, 1997, and 1998.
The photo accompanying the article shows Professor Kuchenbecker and a bunch of students from her MEAM 625 class in Spring of 2009 doing a team cheer after the class's public open house.
Here is a quick list of the highlights: the original PopSci article, the Daily Pennsylvanian article with quotes by Andrew Stanley and others, Stanford Women's Volleyball coverage of the award, and the Penn Current interview with KJK about her teaching, research, and the award.
In addition to a full day of meetings with various GRASPees, she joined KJK's MEAM 625 graduate class on haptics for a discussion of her recent Presence paper, Expertise-Based Performance Measures in a Virtual Training Environment.
It was great to have an author there to answer questions and give behind-the-scenes insights on how the work progressed and what has been done since publication.
Maggie presented her research on the VerroTouch project, focusing on the surgeon study that we ran at the end of the summer.
Their team includes seven boys between the ages of six and nine, and their current challenge focuses on biomedical engineering.
We started by having Joe Romano bring out the Willow Garage PR2 robot, running his high-five code, so that the kids could exchange high-fives and fist-bumps with the robot.
Then we went down to the haptics lab and tried out Will McMahan's Omni teleoperation demo, which showcases the benefit of tactile feedback of tool vibrations.
They all agreed that the surface on the right felt much more real, and they were shocked to discover that the difference was all in the robot controller - they thought the two pieces of wood were actually different.
Then we talked about the Intuitive da Vinci robot as a teleoperation robot used for surgery, and how our lab has developed the VerroTouch system to add in feedback of tool contact vibrations.
They really liked feeling the objects through the laparoscopic tool, and they were able to figure out why the system goes unstable when you touch the tool and handle together (feedback).
Next, we showed them the TexturePad, which lets you feel bumpy surfaces and strong haptic event cues on the surface of a Wacom tablet.
It was all about Tactile Sensing for the PR2 - enabling the robot to delicately-yet-firmly pick up unknown objects, detect unexpected collisions with the environment, and exchange high fives and fist bumps with humans.
Today we did a lab tour for the class students, showing them demos of seven of our research systems: Omni teleoperation, VerroTouch, body-grounded tactile actuators, StrokeSleeve, Texture Pad, Omni virtual environments, and the Tactile Gaming Vest.
Our group shows up many times, including Professor Kuchenbecker's NSF CAREER award, Joe's best teaser award at Haptics Symposium, and all the great undergraduate researchers we've had working with us for the last year.
Many Penn alumni and friends have contacted us after reading this article, and we're excited about the directions those new connections may take our work.
After loading up our plates with brisket, ribs, chicken, potato salad, and macaroni and cheese, we basked in the late-afternoon haze in Andrew's concrete backyard, on chairs KJK donated for the occasion.
Then we enjoyed delicious homemade cupcakes, lemon squares, brownies, and chocolate-covered Oreo cookies while wondering if we ever needed to eat again.
The Penn Haptics group ran two fun design activities for Penn GEMS, a week-long summer camp for local middle-school girls interested in math, science, and engineering.
Each team of three campers had to build a device that could catch a raw chicken egg dropped from a height of six feet.
Their materials were just five popsicle sticks, four rubber bands, three pieces of 8.5" by 11" printer paper, and one foot of masking tape.
It was a tough challenge, but all of the teams managed to successfully protect the egg with the first or second device they built.
Their tower needed to be free standing and elevate a small ball as high as possible off the ground.
Teams were given both a racquetball and a lacrosse ball (heavier) to test their designs with, and everyone made two prototypes, making sure to build on the ideas of others.
Everyone built sturdy, tall towers - far higher than they expected they'd be able to at the start of the challenge.
The first Penn Haptics demo was the TexturePad - a stylus-based touch screen with haptic feedback for realistic textures and event-based cues - which KJK presented, with some help from Nils (when he wasn't at his poster).
We're currently working on a user study on this system, and we hope to submit a journal article about it soon.
An accelerometer on the tool measures contact vibrations, and the VerroTouch circuit and actuator allow the participant to feel an amplified and filtered version of these vibrations in their other hand.
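A minimal sketch of that amplify-and-filter idea, with assumed parameter values (cutoff, gain, and sample rate are illustrative, not the actual VerroTouch design): a high-pass filter removes the low-frequency hand and tool motion from the accelerometer signal, and a gain scales the remaining contact vibrations before they drive the actuator.

```python
import numpy as np

def high_pass(signal, fs, cutoff_hz):
    """One-pole high-pass filter in difference-equation form."""
    rc = 1.0 / (2.0 * np.pi * cutoff_hz)
    alpha = rc / (rc + 1.0 / fs)
    out = np.zeros_like(signal)
    for i in range(1, len(signal)):
        out[i] = alpha * (out[i - 1] + signal[i] - signal[i - 1])
    return out

fs = 10000.0                                       # sample rate (Hz), assumed
t = np.arange(0, 0.1, 1.0 / fs)
slow_motion = np.sin(2 * np.pi * 2 * t)            # 2 Hz hand movement
contact_buzz = 0.1 * np.sin(2 * np.pi * 400 * t)   # 400 Hz contact vibration
raw = slow_motion + contact_buzz                   # simulated accelerometer signal

gain = 5.0
drive = gain * high_pass(raw, fs, cutoff_hz=50.0)  # keeps the buzz, drops the motion
```

With a 50 Hz cutoff, the 400 Hz contact vibration passes nearly unattenuated while the 2 Hz movement component is suppressed by roughly a factor of 25, so the actuator signal is dominated by amplified contact events rather than gross motion.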
We also showed videos of the tasks we're planning to use in the user study this summer, which will be documented in a journal article as well.
The conference is being held at Vrije University in southern Amsterdam, and it includes a day of workshops and then three full days of talks, posters, and demos.
Milos gave a special GRASP seminar entitled 'Teaching Sensorimotor Skills with Haptic Playback,' which was highly relevant to many of the projects currently underway in our lab.
Most of the students in our lab attended his talk, and we enjoyed his perspective on motion training, which involves viewing the visual and haptic feedback as inputs to the user model;
Milos uses principles from controls to design feedback signals that help the user reduce the tracking error in the task performance.
Today KJK gave a lecture on haptics for the twenty-five participants in the 2010 IRCS Undergraduate Summer Workshop on Cognitive Science, and then she brought all those students down to our lab for a one-hour lab tour with six different hands-on demonstrations.
The visiting students spent about ten minutes at each station, rotating around the lab, trying all the demos, and asking lots of questions.
At the end, we did a debriefing session at the white board where we asked the visitors to share their thoughts on our existing demos and suggest new things we could work on.
Several members of the Haptics Lab spent some time on Hill Field today, helping Professor Dan Lee and the rest of his team set up a robotic obstacle course.
He gave a good introduction to the field of tactile perception and clearly explained five common misconceptions that can interfere with the successful implementation of a tactile feedback system.
Kyle presented his master's thesis work on the design and control of the iTorqU 2.1, and Joe presented the methods he developed for modeling haptic textures from recorded acceleration signals.
The right-hand photo above shows Joe giving an impromptu hands-on demonstration of his TexturePad system, which uses the algorithms from his paper to let the user of a tablet computer feel highly realistic textures.
Mallory Jensen, a graduating senior who worked in the lab last summer, received the Hugo Otto Wolf Award, which is given to the senior in each department who most meets with the approval of the faculty.
He gave a great talk about his research, and the committee was impressed with the work he did, specifically on engineering tactile feedback devices for motion guidance.
Karlin spent the whole day with us, starting with breakfast at 7:30 a.m. and including a whole host of meetings with our research collaborators and other folks here in Penn engineering.
Our tactile feedback for motion guidance project has several common themes with her own work, and it was fun to talk about future directions for research in this area.
The video shows Dr. Kuchenbecker talking about the goals of the TGV project, Saurabh explaining how the vest's solenoids work, Ned commenting on how it feels, and two naive users trying it out.
To celebrate, the GRASP Lab hosted a whole bunch of high school students for a tour and demo session.
To represent the Haptics Lab, Joe Romano gave a demo of our TexturePad system, which provides the feel of highly realistic virtual textures on the surface of a Wacom tablet, as you can see in the lovely photo at right.
On Friday morning she gave a talk and showed a demonstration of our VerroTouch system, which adds high-frequency acceleration feedback to Intuitive's da Vinci surgical system.
Then she spent Friday afternoon visiting robotics faculty and students at Stanford University, including demonstrations of our Haptography, VerroTouch, and StrokeSleeve projects.
Today she took part in the Stanford Medical Innovation Conference on Medical Robotics, a unique student-run event that involved a wide array of companies, researchers, and professors from the field of medical robotics.
She gave a half-hour talk on High-Fidelity Haptic Feedback for Robotic Surgery in the conference's Medical Innovation Forum, and she showed demos of our lab's Haptography TexturePad and VerroTouch systems during the hands-on demo session, as you can see above.
By the end of the class, they had created a modified version of the TN Games vest that had all custom actuators (solenoids and Peltier elements) and custom software (a mod of Half-Life 2).
Together, they made a fully new vest that includes six solenoid actuators and vibration motors and can work with either our software or the TN Games software.
Anyone interested in such a vest should certainly check out the TN Games 3rd Space Gaming Vest - the Tactile Gaming Vest our lab created is just a research prototype to explore new methods of delivering realistic haptic sensations.
This award is selected by the undergraduate engineering students and recognizes dedication to helping students realize their educational, career and personal goals.
You can see Pulkit and Mallory posing with a conference sign below, because they had already taken down their poster by the time KJK found them to take their picture.
It was great to spend some time together celebrating all of our lab's hard work and recent successes.
Here are photos of all five of our demos, plus a shot of the larger demo room where four of our demos were located.
Each one has a poster and a 45-second teaser presentation, in addition to the talk and poster associated with our two full papers.
Her presentation was entitled 'Haptics: Touch Feedback for Robotic Surgery and More', and it was attended by about fifty women engineers, including undergraduates, graduate students, and professionals.
He gave a fascinating GRASP seminar talk on 'Haptic Guidance Systems', including his work on skin stretch cues and an active hand rest.
Our lab has six 45-second-long teaser presentations (short orals to introduce posters and demos), plus one fifteen-minute talk.
Cam does a lot of great research on medical robotics, and his talk highlighted his Micron project, where they are developing microsurgical instruments that can cancel out the surgeon's hand tremor.
Talking with Cam gave us lots of interesting ideas on other ways smart tools could help a surgeon, and we really appreciate the time he took to visit our lab.
KJK gave an overview of our lab's work, and Vinutha presented her plans for a project on measuring the tool movements surgeons make during suturing.
Our haptography work is funded by this grant, and Prism highlighted the potential for haptic feedback ('good vibrations') to improve robotic surgery and medical simulation.
He regaled the group with stories of projects he has worked on, including MaxiForce traffic control bollards (now licensed to Blue Ember Technologies) and several assistive devices.
After the larger group meeting, George spent some time talking with individual students about their projects, including Pedro, as shown at right.
It was great to have George join us for this meeting, and we look forward to finding ways to involve him in our work in the future.
The first paper is Joe's work on modeling the acceleration waveforms that occur during tool-mediated texture exploration, and the second is Kyle, Joe, and Jamie's work on the final version of the iTorqU.
The program also included a GRASP Lab Open House, and our group presented three hands-on demonstrations with two posters, all centered on our haptography project.
Jamie, Dorsey, Joe, and Will did a great job showing off their work, and several people told KJK how much they enjoyed the demos - very compelling!
Pulkit and Mallory's paper on tactile feedback for motion guidance got a poster presentation, which will be a great way to showcase this system.
The overall acceptance rate for this conference was 66%, and oral presentations were particularly prestigious, with just 30% of the submissions receiving this honor.
He talked about the work he's been doing to model the electrical and mechanical dynamics of a solenoid used to deliver tactile feedback.
He showed the structure of the model he is using, plus data he has collected for correlating solenoid force output to applied current and plunger position.
At the end, the group talked about methods for collecting data that can be used to tune the rest of his model's parameters.
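For illustration only (this is not the model from the talk; the reluctance-style force law and every parameter value below are assumptions), a simple solenoid force model of the form F = k i^2 / (x + x0)^2 can be fit to force measurements collected over a range of coil currents and plunger positions:

```python
import numpy as np
from scipy.optimize import curve_fit

def solenoid_force(X, k, x0):
    """Simple reluctance-style force law: F = k * i^2 / (x + x0)^2,
    where i is coil current (A) and x is plunger position (m)."""
    i, x = X
    return k * i**2 / (x + x0) ** 2

# Synthetic calibration data standing in for measured force samples
rng = np.random.default_rng(0)
i = rng.uniform(0.1, 1.0, 200)       # currents, A
x = rng.uniform(0.001, 0.005, 200)   # plunger positions, m
true_k, true_x0 = 2e-5, 0.002        # assumed "ground truth" parameters
f = solenoid_force((i, x), true_k, true_x0)
f_noisy = f + rng.normal(0.0, 0.01 * f.std(), f.shape)  # 1% sensor noise

# Least-squares fit of the two model parameters to the noisy data
(k_hat, x0_hat), _ = curve_fit(solenoid_force, (i, x), f_noisy, p0=(1e-5, 1e-3))
print(f"k = {k_hat:.3e}, x0 = {x0_hat:.3e}")
```

The same fitting pattern extends to richer models once more calibration data (e.g. the current-and-position sweeps mentioned above) are available.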
He wrote an article about the use of vibrotactile suits for human motor learning, focused on the suit created by Lieberman and Breazeal at MIT.
It is exciting to see the number of groups working on projects in this area, in addition to our lab's own NSF project on this topic.
She gave a research talk entitled 'High-Fidelity Haptic Interfaces for Real, Remote, and Virtual Environments,' and she got a tour of the company's facilities, including the area where they manufacture and test Talon robots.
Dr. Steinbach's group is using ideas from event-based haptics to reduce the data load in networked haptic systems.
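A widely used idea in this line of work is perceptual deadband coding: a new haptic sample is transmitted over the network only when it differs from the last transmitted value by more than a Weber-fraction threshold, since smaller changes are imperceptible. A minimal sketch of the principle (not the group's actual implementation):

```python
def deadband_filter(samples, weber=0.1):
    """Keep a sample only when it differs from the last transmitted
    value by more than a relative (Weber-fraction) deadband."""
    sent = []
    last = None
    for s in samples:
        if last is None or abs(s - last) > weber * abs(last):
            sent.append(s)
            last = s
    return sent

forces = [1.00, 1.02, 1.05, 1.20, 1.21, 0.80, 0.79]
kept = deadband_filter(forces)
print(kept)  # → [1.0, 1.2, 0.8]: only 3 of the 7 samples need to be sent
```

With a 10% deadband, small fluctuations around the last transmitted force are dropped, which is where the data-rate savings come from.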
After the furniture was disassembled, a whole bunch of facilities workers came by to replace the damaged section of pipe.
You can see our nice new pipe in the left image below, and the right image shows the resulting chaos in the rest of the lab.
It turns out that the pipe that runs along the back wall is leaking, which is making the carpet progressively wetter in the back right corner.
Jamie heroically climbed behind the wall to move some junk that was getting wet, and he put our recycling bins there to catch the water, as you can see in the left photo below.
As you can see in the above right photograph, Ian, Jacob, Mallory, PK, and KJK spent a bunch of time this afternoon moving all of our worldly belongings off of the furniture.
The photo at right shows the authors putting the finishing touches on their submissions on the night of the deadline.
The second paper was accompanied by a three-minute-long video that Anat Bordoley put together to explain the purpose of the paper and its findings.
We held it at Jamie and Elizabeth's lovely house, and a whole bunch of people from the lab attended, including current members, alumni, friends, and family.
We picked their Big Belly Filler catering package, which includes Memphis-style dry-rub ribs, barbecued chicken, beef brisket, baked beans, mashed sweet potatoes, macaroni and cheese, coleslaw, cornbread, and rolls.
We set up a bunch of different games in the backyard and also gathered around a fire in the Haptics Group's new fire pit, keeping ourselves warm and toasting marshmallows for s'mores.
Though we constructed a tent by the fire, rain eventually forced us indoors, where we kept ourselves busy by playing The Beatles: Rock Band.
Having his idea chosen for the class means that Jamie and three other students will spend this semester investigating the idea and putting together a business plan around it.
The first paper is Joe's texture modeling work in collaboration with Takashi Yoshioka at JHU, and the second is a condensed version of Kyle's robotics master's thesis work on the iTorqU.
These papers will be reviewed in the coming months, and if they are accepted, we will have the chance to present the research at the conference, which will take place in Anchorage, Alaska, next May. Great work by everyone!
They then started the session off by presenting a ten-minute overview of the field and explaining how they thought each of the upcoming presentations fit into that overview.
This work started as a joint class project between Alla's physics-based modeling class and KJK's haptics class in spring of 2008.
The desks, whiteboard, and shelves are all getting disassembled tomorrow to give the workers access to the HVAC system, and demolition starts on Tuesday.
As you can see in the picture at right, we had to wear white 'bunny suits' and blue bonnets in order to enter the operating room.
The visitors and their three counselors then rotated around the lab in small groups and had the opportunity to try out all the demonstrations, asking questions of the students in the lab as they went.
There were over 100 students there, and they really enjoyed the hands-on demos afterward (until a metal tool fell into the power supply and shorted it out, causing a small explosion that killed the power supply and ended the demonstration session).
It was great to get such a diverse group together to talk about this research project and test out the associated hands-on demonstrations.
The main part of the conference was single track, with excellent keynote addresses plus a poster session.
Her presentation was entitled 'Haptography: Creating Authentic Haptic Feedback from Recordings of Real Interactions', and she also did hands-on demonstrations of haptographic capturing and rendering systems during the poster session and immediately following her talk.
Attendees really liked the idea of haptography, and KJK was honored to be able to share our research results to such a large audience in this venue.
For example, the virtual 'click' of the newest Blackberry is a simple application of haptics, since users feel a button clicking that exists only in a virtual realm.
In medical robotics, haptics allows surgeons to 'feel' tissues and anatomy, even though they are controlling surgical robotic arms instead of being in direct contact with the patient.
A program officer from the National Science Foundation (NSF) recently told Professor Kuchenbecker that they were considering reversing the funding decision on the CAREER grant proposal she submitted in July of 2008.
KJK's proposal from 2008 was rated competitive but not initially funded, so it was wonderful to hear that it was being reconsidered due to the availability of stimulus funds from the American Recovery and Reinvestment Act (ARRA) of 2009.
His independent study project was entitled 'Haptic Display of Realistic Tool Contact Via Dynamically Compensated Control of a Dedicated Actuator,' and the picture to the right shows Will demonstrating his system to Mike Carchidi during the exam.
The paper had the same title as his independent study report, and it gives preliminary approaches and results for our haptographic rendering project.
His talk was entitled 'Improving the Realism of Haptic Perceptions in Virtual Arthroscopy Training,' and he showed some nice results on adding vibrotactile feedback to medical simulations and designing mechanisms for increased output impedance.
There were lots of interesting questions afterward, and people seemed quite interested in this new approach to providing haptic feedback in large virtual environments like the SIG Center for Computer Graphics we have here at Penn.
This three-year project will include the development of new modular tactors that can deliver rich vibratory stimuli to a variety of locations on the user's skin, plus tactile actuation patterns for spatially distributed tactors that can intuitively guide the wearer's body movements.
There were many cool demos, including a bone screw insertion simulator and an interactive museum exhibit that lets you bounce different-sized balls on different planets.
Ralph presented four hour-long modules on the design of magnetic actuators, his minifactory project, haptics and teleoperation, and future research.
The highlights were probably seeing all the wonderful magnetically actuated devices he brought along for show and tell and watching the time-lapse video he made to showcase the manufacturing process for the tiles for his minifactory floor.
As you can see in the photos above, she bought English and metric assortments of stainless steel socket head cap screws for us to use in the lab.
Their proof-of-concept prototype has a 30 by 30 matrix of 4-40 screws that are driven up and down by a mobile set of four actuators.
The other team that was mentored by Suzanne Erb was Sumito Ahuja and Brian Hylton, who created the 'Haptic Compass,' a handheld device that allows a visually impaired individual to feel the direction of North to aid in navigation.
The lab was also very proud to hear that Matt, Travis, and Kate's project won the Couloucoundis Prize for the best presentation of a senior design project - a wonderful honor.
Takashi studies the neural mechanisms that underlie tactile perception and object recognition, which is wonderfully synergistic with the haptographic capture and rendering research we are doing.
Takashi and Graham attended part of our group meeting, and each project team gave a short explanation of their project, along with a demo (if available).
The visitors were quite excited by Will's teleoperation demo because it enables the user to feel the subtle intricacies of a surface without being in direct contact with it.
We are excited to be collaborating with Takashi's group because our combined sets of knowledge will enable us to study many interesting issues in texture perception and virtual surface creation.
She started off her day at Penn by joining in on the MEAM 625 Haptics class to discuss her upcoming IEEE Transactions on Haptics paper, which the class had all just read; it compares parametric knob dynamic models chosen by humans with those identified by the Haptic Camera.
The class really enjoyed having one of the authors present to provide context for the work and help explain relevant issues.
She spent the rest of the day meeting with other Penn faculty and hanging out in the Haptics Lab, feeling demos of our different projects and discussing the work with our students.
In the spirit of April Fool's, one of our lab members updated the lab wiki's people page to include silly pictures of everyone except himself.
These antics were discovered only a few minutes before a big lab celebration, so they provided ample excitement and laughter.
The lab party in the evening was arranged by Will, Pulkit, and Joe to give everyone the opportunity to celebrate our lab's good showing and Will's Best Demo award at World Haptics.
The goal was to let the user feel the difference between a standard position-position controller and this same controller plus high-frequency contact vibrations, which are created by a dedicated actuator to match the accelerations experienced by the slave tool.
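A rough sketch of how such a controller can be structured (illustrative only; the gains, filter settings, and function names below are assumptions, not the actual implementation):

```python
import numpy as np
from scipy.signal import butter, lfilter

def position_position_force(x_master, x_slave, kp=200.0):
    """Basic position-position coupling: each side is servoed toward
    the other's position with a proportional gain (N/m, assumed)."""
    return kp * (x_slave - x_master)

def vibration_overlay(slave_accel, fs=1000.0, fc=20.0, gain=0.5):
    """High-pass filter the measured slave-tool acceleration and scale
    it to drive a dedicated vibration actuator at the master."""
    b, a = butter(2, fc / (fs / 2), btype="high")
    return gain * lfilter(b, a, slave_accel)

# Low-frequency coupling force plus high-frequency contact vibrations
f_couple = position_position_force(0.0, 0.01)
t = np.arange(1000) / 1000.0
accel = np.sin(2 * np.pi * 150 * t)   # a 150 Hz contact "buzz"
f_vib = vibration_overlay(accel)
```

The key design point is the split: the position controller handles low-frequency forces, while the high-passed acceleration signal reproduces the crisp contact transients that a position controller alone smooths away.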
Pulkit and Sunthar continued running their hands-on demo on the use of vibrotactile feedback for guiding arm motions, and Will continued running his demo on realistic contact accelerations.
Joe and Kyle gave their two-minute poster teasers today, and the SlipGlove and iTorqU demos were running all day, with lots of hard work by Steve, Nathan, Joe, Jamie, and Kyle.
The travel grant will completely fund his roundtrip airfare (approximately $1000) and provide $500 to support two days of visits to Japanese research labs studying robotics.
It produces these torques using the gyroscopic effect, where a quickly spinning flywheel is steered in different directions.
We envision it being useful for applications such as immersive gaming, upper-limb rehabilitation, and remote control of aerial vehicles.
They created an initial 1-dof prototype that semester, and they worked (very hard) over the summer to create the 2-dof version you can see in the Gazette picture.
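The governing relationship for this kind of device is τ = ω_gimbal × L, where L = I ω_spin along the spin axis is the flywheel's angular momentum. A tiny numeric illustration (the inertia and speeds here are made up, not the iTorqU's actual parameters):

```python
import numpy as np

def gyroscopic_torque(inertia, spin_speed, gimbal_rate, spin_axis):
    """Torque produced by steering a spinning flywheel:
    tau = omega_gimbal x L, with L = inertia * spin_speed * spin_axis."""
    L = inertia * spin_speed * np.asarray(spin_axis, dtype=float)
    return np.cross(np.asarray(gimbal_rate, dtype=float), L)

# Flywheel spinning about z, gimbaled about x -> torque appears about y
tau = gyroscopic_torque(inertia=1e-4,              # kg m^2 (assumed)
                        spin_speed=500.0,          # rad/s (assumed)
                        gimbal_rate=[2.0, 0.0, 0.0],
                        spin_axis=[0.0, 0.0, 1.0])
print(tau)  # torque about the y axis, perpendicular to both inputs
```

Because the output torque is perpendicular to both the spin axis and the gimbal rate, steering the flywheel in different directions lets the handheld device render torques about different axes.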
It was great to have another haptics / medical robotics professor on campus, and we hope to build a good relationship with Dr. Desai's lab at the University of Maryland.
We saw a wide range of technology currently being used for clinical simulation, including SimMan, laparoscopic box trainers, and a Mimic.
On the fun side, Professor Kuchenbecker participated in the annual 'Pie Your Professor' event, where students can donate a small amount to charity to have the privilege of throwing whipped-cream pies at members of the faculty.
Joe and KJK spent most of this week at MMVR 2009, where Joe was presenting the needle puncture simulation he created for the final project in Alla Safonova's physics-based animation class.
In the afternoon, she got to participate in an abdominal aortic aneurysm simulation alongside two PGY-1 trainees (first year surgical residents), overseen by the Chief of Vascular Surgery at the Philadelphia VA.
The simulation took place in an authentic operating room - the entire SimCenter facility used to be the OR suite for Graduate Hospital.
After donning scrubs and gloves, KJK was given the opportunity to rotate through as the scrub nurse, assistant surgeon, and surgeon.
The team cut open the simulated aneurysm (a distended fake rubber organ), through which simulated blood was flowing.
ROSCon 2016 Diversity Scholarships
Save the date: we’re happy to announce that ROSCon 2017 will be held September
ROSCon 2016 was record-breaking in every way, with over 450 attendees and a 60% increase in sponsorship over last year.
You can browse them here or find them linked below in the program as well as the slides that the presenters have submitted.
The ROSCon planning committee acknowledges that the barriers to attendance for traditionally under-represented groups may be many and varied, and we are striving throughout the planning process to make the event as inclusive and accessible as possible. This year we are proud to have launched the ROSCon Diversity Scholarship Program to help make ROSCon 2016 more representative of the global ROS community.
Lightning talks are 2-3 minute talks, one session each day (3 minutes in the past, though this may be reduced to 2.5 or 2 minutes this year). The way it works is that you find the designated person during the morning coffee break to sign up for a slot to present that afternoon. As part of the sign-up, you have the option to provide 2-3 presentation slides that will be loaded onto a common laptop for presentation. Slides from past years' talks can be seen in this blog post, or you can browse the recordings from past years.
There will also be open space for Birds-of-a-Feather (BoF) meetings, impromptu hacking sessions, and informal presentations.
captains log 0126: stickerpacks
UPDATE NUMERO TWO: sale is winding down, just calendar preorders left, and those will end tomorrow (thursday), so thanks again for all the orders and kind words.
lotta figuring stuff out on her part. we can't combine all orders because some things (pins) don't ship great with other things (posters), so it's kinda a balancing act, what can ship together and what can't.
and let me tell you, any kid under the age of, like, 18, is going to be bummed to get those as stocking stuffers. let me know if you have any questions, i tried to set things up correctly but it's a lot to manage, and sarah usually runs the sale itself while i run off into the woods.