Wenzhen Yuan
  • Home
  • GelSight Projects
  • Other Projects
  • Publications

I am an assistant professor in the Robotics Institute (RI) at Carnegie Mellon University and the director of the CMU RoboTouch Lab. Previously, I received my PhD and Master's degrees from MIT, and my BE degree from Tsinghua University.

My research goal is to build an intelligent robotic tactile perception system that helps robots better understand and interact with the physical world. To understand the physical world, a robot needs to understand the properties of the objects around it, and many of the most important properties, such as hardness, roughness, and slipperiness, can only be learned through physical contact. I work on designing frameworks for robots to touch target objects in specific ways and interpret the tactile signals from the contact. At the same time, I have been looking for ways for robots to interact with the world with the help of tactile sensing. Tactile signals contain rich information about a robot's interaction with its environment, and that information can guide robots to accomplish manipulation tasks more dexterously. My research focuses on how to extract that information and how to incorporate that feedback into the robot manipulation framework.

I have been trying to address the challenges in robotic touch from three aspects: hardware design, algorithm development, and system integration. On the hardware side, I have been working with GelSight, a high-resolution tactile sensor that captures remarkably fine details of the contacting object's shape and gives robots much richer information about touch. I am working to improve the sensor's design for different robot applications and to make the sensor accessible to more people. On the algorithm side, I have been applying convolutional neural networks to the high-dimensional tactile data and exploring better data representations and machine learning methods for it. On the integration side, I am enthusiastic about combining tactile sensing with robot motion and other sensing modalities: robot perception cannot be achieved with a single sensor, but rather through the collaboration of different sensors.

For a less technical overview of my research, please see our demo of a robot estimating the hardness of fruits through touch HERE, or the demo of a robot perceiving clothing and sorting laundry HERE. For a longer description of the research, please refer to the project page. For a detailed introduction to the GelSight design for robotics, please refer to our review paper.
Contact:
The Robotics Institute
5000 Forbes Ave
Pittsburgh, PA 15213
yuanwz AT cmu DOT edu

Tip for pronouncing my name: my first name can be pronounced like 'When-Jen'.
Last updated: Nov 25, 2019