
On Friday, April 12, the Robotics Institute offered tours of some of its laboratories.

First, the tour visited the Human and Robot Partners Lab, where two researchers, Maggie Collier and Michael Lee, demonstrated their joystick-controlled robot arm, part of a project that aims to use robotics to improve the lives of wheelchair users. Because a robot arm is difficult to control manually, their research combines human input with automation. However, they noted that there's such a thing as too much automation. People like to feel in control, especially of devices that are essential to their daily lives.
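
The tour didn't go into the arbitration scheme, but a common approach in shared autonomy is to linearly blend the user's joystick command with an autonomous policy's command, keeping the human's weight high enough to preserve that sense of control. A minimal sketch, with all names and numbers assumed for illustration:

```python
import numpy as np

def blend_commands(u_human: np.ndarray, u_auto: np.ndarray, alpha: float) -> np.ndarray:
    """Linearly blend a joystick velocity command with an autonomous one.

    alpha = 1.0 gives the human full control; alpha = 0.0 is full autonomy.
    Keeping alpha well above zero preserves the user's sense of agency,
    which the researchers noted matters for assistive devices.
    """
    assert 0.0 <= alpha <= 1.0
    return alpha * u_human + (1.0 - alpha) * u_auto

# Hypothetical example: the user pushes the joystick to the right while
# the autonomous planner nudges the arm toward a predicted grasp target.
u_human = np.array([0.10, 0.00, 0.00])   # end-effector velocity, m/s
u_auto = np.array([0.06, 0.04, -0.02])
print(blend_commands(u_human, u_auto, alpha=0.7))
```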

The researchers are working on building relationships with disabled people in order to better meet their needs. Another researcher demonstrated a project in which a robot learns to play a game by incorporating various forms of human criticism.
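
The demonstration didn't specify the learning mechanism, but one standard way to fold human criticism into game play is to treat each critique as a scalar signal that nudges the agent's action preferences, in the spirit of TAMER-style learning from human feedback. A toy sketch, with every detail assumed:

```python
import random
from collections import defaultdict

# Toy sketch: per-state action preferences, adjusted by human critique.
prefs = defaultdict(lambda: defaultdict(float))
LEARNING_RATE = 0.1

def choose_action(state, actions, epsilon=0.1):
    """Pick the currently preferred action, exploring occasionally."""
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: prefs[state][a])

def incorporate_criticism(state, action, feedback):
    """feedback: +1 for praise, -1 for criticism from the human observer."""
    prefs[state][action] += LEARNING_RATE * feedback
```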

In the Intelligent Autonomous Manipulation Lab, researchers demonstrated their dexterous manipulation robots. “We have like 10 or 20 robots total,” said Mark Lee, a Ph.D. student who does research there. “I think as a general lab research direction, we focus on very different aspects, from the entire software stack from the hardware to sensors to the manipulation, skill learning, and now towards more recent trends like imitation learning.” 

One robot had a human-like hand and learned how to manipulate an object from a human demonstration. Another robot, presented by graduate student Janice Lee, had a non-anthropomorphic hand consisting of four fingers in a rectangular formation. Operators controlled it by moving the fingers of a replica of the robot hand; given the robot's configuration, two human hands were needed to operate it.
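
The mapping between replica and robot wasn't described, but replica-based teleoperation typically reads joint angles off the copy and streams them to the robot as position targets. A minimal sketch, with both device interfaces hypothetical:

```python
import time

def teleoperate(replica, robot, rate_hz=100):
    """Stream replica joint angles to the robot as position targets.

    `replica.read_joint_angles()` and `robot.command_joint_positions()`
    are hypothetical interfaces for the replica hand and the four-finger
    robot hand; real hardware would add filtering and safety limits.
    """
    period = 1.0 / rate_hz
    while True:
        q = replica.read_joint_angles()      # one angle per finger joint
        robot.command_joint_positions(q)     # mirror the operator's pose
        time.sleep(period)
```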

Another non-anthropomorphic design, described by Ph.D. student Sarvesh Patil, was a hand with 64 fingers. This could be used to perform manganese nodule mining with minimal disturbance to coral reefs. Manganese nodules are often mined in a way that disturbs the seafloor, but a dexterous robot could pick them up more carefully.

Next was the Search-Based Planning Lab, where researchers demonstrated a robot arm that uses a sort of wooden shield to deflect rubber balls thrown at it, sensing them with a camera.

According to one of the researchers, “between the time the robot can actually see the ball and the time it has to actually deflect it, there are only 300 to 350 milliseconds.” Therefore, it has only a very short period of time to plan its motion. 

As the name of the lab would suggest, it uses search-based planning, searching through different movement options and quickly narrowing down which action it should take based on the motion of the ball. 
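
Neither the search itself nor the motion options were detailed on the tour, but a time-budgeted search over a precomputed set of candidate arm motions is one plausible shape for it. A rough sketch, with every interface assumed:

```python
import time

TIME_BUDGET_S = 0.30  # roughly the 300 to 350 millisecond window quoted above

def plan_deflection(ball_trajectory, candidate_motions, simulate_intercept):
    """Search candidate arm motions for one predicted to put the shield
    in the ball's path, stopping when the time budget runs out.

    All three arguments are hypothetical stand-ins: a predicted ball
    path, a precomputed library of arm motions, and a fast forward
    model scoring how well a motion intercepts the path.
    """
    deadline = time.monotonic() + TIME_BUDGET_S
    best, best_margin = None, float("-inf")
    for motion in candidate_motions:
        if time.monotonic() > deadline:
            break  # out of time: act on the best plan found so far
        margin = simulate_intercept(motion, ball_trajectory)
        if margin > best_margin:
            best, best_margin = motion, margin
    return best
```

Cutting the search off at the deadline and committing to the best plan found so far is what makes such a planner usable inside a fixed reaction window.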

In the demonstration, the robot deflected the ball as long as it wasn't thrown at a strange angle. The researchers also plan to create a robot that can deflect objects thrown from many angles in quick succession, and they have already built a simulation of this.

There was also a showcase of other projects from the Robotics Institute. One was a painting robot that made colorful works in watercolor, reminiscent of the automaton from The Invention of Hugo Cabret, except that it was a robot rather than an automaton and could paint anything. Another was a weaving robot, which could weave small, colorful, scarf-like works in any grid-based pattern.


These were only two of the many wonderful projects and robots showcased. Students, faculty, staff, and alumni visiting for Carnival attended.
