Active clothing perception

The goal of this project is to build a robot system that autonomously explores the properties of everyday clothing. An external Kinect sensor guides the robot to suitable positions on the garment for tactile exploration, where the robot squeezes the fabric with a GelSight finger. We trained convolutional neural networks (CNNs) to estimate multiple clothing properties from the tactile data, and the network output was fed back to improve the robot's exploration strategy.
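As an illustration of the multi-property recognition step, the sketch below shows a minimal CNN-style forward pass in NumPy: one convolutional layer over a tactile image, global average pooling, and a separate linear head per clothing property. The property names, layer sizes, and class counts are hypothetical placeholders, not the project's actual architecture.

```python
import numpy as np

def conv2d(img, kernels):
    """Valid convolution of a single-channel image with K kernels."""
    K, kh, kw = kernels.shape
    H, W = img.shape
    out = np.zeros((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * kernels[k])
    return out

def forward(img, kernels, heads):
    feat = np.maximum(conv2d(img, kernels), 0)   # ReLU activation
    pooled = feat.mean(axis=(1, 2))              # global average pool -> (K,)
    # One linear head per clothing property; each returns class logits.
    return {name: W @ pooled + b for name, (W, b) in heads.items()}

rng = np.random.default_rng(0)
img = rng.random((32, 32))                 # stand-in for a GelSight tactile image
kernels = rng.standard_normal((8, 5, 5))   # 8 learned 5x5 filters (random here)
heads = {
    # hypothetical properties and class counts for illustration only
    "thickness": (rng.standard_normal((4, 8)), np.zeros(4)),  # 4 classes
    "softness":  (rng.standard_normal((3, 8)), np.zeros(3)),  # 3 classes
}
logits = forward(img, kernels, heads)
```

In practice each head would be trained jointly with the shared convolutional trunk, so one squeeze of the fabric yields estimates for all properties at once.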