Engineers Develop a Robotic Gripper with Rich Sensory Capabilities


SOURCE: AZOROBOTICS.COM
APR 18, 2022

It is difficult to get rid of the image of a massive metallic robot that communicates in monotone and keeps moving in lumbering, deliberate steps. Soft robotics practitioners, on the other hand, have a completely different vision in mind: autonomous devices made up of flexible parts that are gentle to the touch, more akin to human fingers than R2-D2 or Robby the Robot.


The GelSight Fin Ray gripper holds a glass Mason jar with its tactile sensing. Image Credit: Massachusetts Institute of Technology, Computer Science and Artificial Intelligence Laboratory.

Professor Edward Adelson and his Perceptual Science Group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are now pursuing this model. Adelson and Sandra Liu, a mechanical engineering Ph.D. student at CSAIL, recently developed a robotic gripper that uses novel “GelSight Fin Ray” fingers that are supple enough to manipulate objects like a human hand.

Liu and Adelson’s gripper is equipped with touch sensors that can match or exceed the sensitivity of human skin, which sets it apart from other efforts in the field.

The findings were presented last week at the 2022 IEEE 5th International Conference on Soft Robotics.

Thanks to a discovery made by German biologist Leif Kniese in 1997, the fin ray has become a popular item in soft robotics. He noticed that when he pressed his finger against a fish’s tail, the ray bent toward him, almost embracing his finger, instead of tilting away. Popular as the design is, though, it has lacked tactile sensitivity.

It’s versatile because it can passively adapt to different shapes and therefore grasp a variety of objects. But in order to go beyond what others in the field had already done, we set out to incorporate a rich tactile sensor into our gripper.

Sandra Liu, Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology

The gripper is made up of two flexible fin ray fingers that conform to the shape of whatever they touch. The fingers are made of flexible plastic materials printed on a 3D printer, which is fairly common in the field.

Typical soft robotic gripper fingers, however, have supportive cross-struts running the length of their interiors; Liu and Adelson instead hollowed out the interior to make room for their camera and other sensory components.


At one end of the hollowed-out cavity sits a camera, mounted on a semirigid backing and illuminated by LEDs. The camera faces a layer of “sensory” pads made of silicone gel (called “GelSight”) fastened to a thin sheet of acrylic. The acrylic sheet, in turn, is attached to the plastic finger piece on the opposite side of the inner cavity. When the finger touches something, it folds around the object and conforms to its contours.

By measuring how the silicone and acrylic sheets deform during this interaction, the camera and its accompanying computational algorithms can evaluate the object’s overall shape, its orientation in space, its surface roughness, and the force being applied by (and imparted to) each finger.
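The paper’s actual processing pipeline is not reproduced here, but a minimal sketch can illustrate the general idea of turning a deformed gel image into contact estimates. Everything below (the function name, thresholds, and the force proxy) is an assumption for illustration, not the authors’ algorithm:

```python
import cv2
import numpy as np

def analyze_contact(reference: np.ndarray, touched: np.ndarray):
    """Estimate contact patch area, a rough force proxy, and orientation."""
    # Differencing the resting gel image against the deformed one
    # highlights where the object pressed into the pad.
    diff = cv2.absdiff(touched, reference)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)

    # Pixels whose brightness changed noticeably form the contact patch.
    _, mask = cv2.threshold(gray, 15, 255, cv2.THRESH_BINARY)
    area = int(np.count_nonzero(mask))
    if area == 0:
        return 0, 0.0, 0.0  # no contact detected

    # Mean intensity change is a crude stand-in for applied force:
    # a harder press deforms the gel more and changes more pixels.
    force_proxy = float(gray[mask > 0].mean())

    # Image moments give the contact patch's principal orientation.
    m = cv2.moments(mask, binaryImage=True)
    angle_deg = 0.5 * np.degrees(np.arctan2(2.0 * m["mu11"], m["mu20"] - m["mu02"]))
    return area, force_proxy, angle_deg
```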

Liu and Adelson put their gripper through its paces in an experiment where only one of the two fingers was “sensorized.” A plastic strawberry, a mini-screwdriver, a Ball Mason jar, an acrylic paint tube, and a wine glass were all successfully handled by their device.

The internal sensor was able to identify the “seeds” on the surface of the fake strawberry while the gripper was holding it. The fingers grasped the paint tube without squeezing it so hard that it burst and spilled its contents.

The GelSight sensor was even able to read the lettering on the Mason jar, and it did so in an ingenious manner. The overall shape of the jar was first determined by looking at how the acrylic sheet bent when wrapped around it. The pattern was then subtracted from the deformation of the silicone pad using a computer algorithm, leaving only the subtle deformation caused by the letters.
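As a rough illustration of that subtraction step (again, not the authors’ published code), one could treat a heavily smoothed version of the deformation map as the jar’s overall curvature and remove it, leaving only the fine relief of the letters. The deformation array and blur scale below are assumptions of the sketch:

```python
import cv2
import numpy as np

def isolate_lettering(deformation: np.ndarray) -> np.ndarray:
    """Remove the jar's overall curvature to expose embossed lettering."""
    # A large Gaussian blur keeps only the low-frequency bending of the
    # acrylic sheet, i.e. the jar's gross shape.
    overall_shape = cv2.GaussianBlur(deformation, (0, 0), sigmaX=25)

    # Subtracting it leaves the high-frequency residue: the raised
    # letters pressed into the silicone pad.
    residual = deformation - overall_shape

    # Rescale for display or downstream character recognition.
    return cv2.normalize(residual, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
```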

Due to the refraction of light, glass objects pose a challenge for vision-based robots. Such optical ambiguity has no effect on tactile sensors. The gripper could feel the orientation of the stem when it picked up the wine glass and made sure it was pointing straight up before slowly lowering it.

The gel pad registered the moment the base of the glass touched the tabletop. The glass was placed properly in seven out of ten trials and, thankfully, none was broken during filming of the experiment.
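That placement behavior amounts to a simple sense-and-act loop, sketched below. The `robot` and `sensor` interfaces are hypothetical placeholders, and the threshold is invented for illustration, since the article does not describe the control code:

```python
import numpy as np

CONTACT_THRESHOLD = 12.0  # illustrative: mean pixel change signaling touch

def place_glass(robot, sensor, step_mm: float = 1.0):
    """Lower the held glass until the gel pad feels the tabletop, then release."""
    reference = sensor.capture()        # gel image while holding, pre-contact
    while True:
        robot.lower(step_mm)            # descend in small increments
        frame = sensor.capture()
        change = np.abs(frame.astype(float) - reference.astype(float)).mean()
        if change > CONTACT_THRESHOLD:  # gel deformed: base reached the table
            robot.open_gripper()
            break
```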

Sensing with soft robots has been a big challenge, because it is difficult to set up sensors—which are traditionally rigid—on soft bodies. This paper provides a neat solution to that problem.

Wenzhen Yuan, Assistant Professor, Robotics Institute, Carnegie Mellon University

Yuan was not part of the study. “The authors used a very smart design to make their vision-based sensor work for the compliant gripper, in this way generating very good results when robots grasp objects or interact with the external environment. The technology has lots of potential to be widely used for robotic grippers in real-world environments,” Yuan added.

The GelSight Fin Ray has many potential applications, but Liu and Adelson are working on some improvements first. Hollowing out the finger to make room for the sensory system introduced structural instability, a tendency to twist, which they believe can be mitigated through better design.

The scientists want to develop GelSight sensors that can work with soft robots developed by other research groups. They also intend to create a three-finger gripper that could be useful for tasks like picking up fruit and determining its ripeness.

In their approach, tactile sensing comes down to low-cost components: a camera, a gel, and LEDs. Liu hopes that, using technology like GelSight, “it may be possible to come up with sensors that are both practical and affordable.” At the very least, that is one of the goals she and others in the laboratory are working toward.

This research was funded by the Toyota Research Institute and the US Office of Naval Research.

