Tactile Perception, Haptic Exploration, and Map Rendering for Robots that Operate Within Granular Materials



Tactile Perception, Haptic Exploration, and Map Rendering for Robots that Operate Within Granular Materials

Author: Jia Shengxin

Language: en


Release Date: 2022


Robots are expected to operate autonomously in unstructured, real-world environments. For effective physical interaction with the world, robots must build and refine their understanding of the environment through sensory feedback. However, tactile feedback has been used primarily in open-air environments and not within granular materials. When robots operate within opaque granular materials, tactile and proprioceptive feedback can be more informative than visual feedback. Our long-term objective is to leverage tactile sensors to develop efficient algorithms that enable robots to infer environmental conditions and to plan exploratory movements that reduce uncertainty in their models of the world. Motivated by the need to keep humans out of harm's way in search and rescue or other field environments, we address the challenge of using tactile feedback to locate objects buried in granular materials.

In study #1, we designed a tactile perception pipeline for sensorized robot fingertips that directly interact with granular materials in teleoperated systems. We proposed an architecture called the Sparse-Fusion Recurrent Neural Network (SF-RNN) to detect contact with an object buried within granular materials. We leveraged multimodal tactile sensor data to classify contact states within five different granular materials. We also constructed a belief map that combines probabilistic contact state estimates and fingertip location.

In study #2, we developed a framework for tactile perception, mapping, and haptic exploration for the autonomous localization of objects buried in granular materials. The haptic exploration task was performed within densely packed sand mixtures using sensor models that account for granular material characteristics and aid in the interpretation of interaction forces between the robot and its environment. The haptic exploration strategy was designed to efficiently locate and refine the outline of a buried object while minimizing potentially damaging physical interactions with the object. Continuous occupancy maps were generated that fused local, sparse tactile information into global maps.

In summary, we developed tactile-based frameworks for perception, planning, and mapping for the challenging task of localizing objects buried within granular materials. Our work can serve as a foundation for more complex, autonomous robotic behaviors such as the excavation and bimanual retrieval of fragile, buried objects.
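The belief map from study #1 lends itself to a compact illustration. The following is a minimal sketch, not the thesis's actual implementation, of fusing probabilistic contact estimates with fingertip position in a 2D log-odds occupancy grid; the grid size, resolution, and class interface are illustrative assumptions.

```python
# Minimal belief-map sketch: fuse per-touch contact probabilities (e.g., from
# an SF-RNN-style classifier) into a 2D log-odds occupancy grid. All names and
# parameters here are illustrative assumptions, not from the thesis.
import numpy as np

class BeliefMap:
    def __init__(self, shape=(100, 100), resolution=0.005):
        self.log_odds = np.zeros(shape)  # 0 log-odds == p = 0.5 (unknown)
        self.resolution = resolution     # meters per grid cell

    def update(self, fingertip_xy, p_contact):
        """Fuse one probabilistic contact estimate at the fingertip's cell."""
        i = int(fingertip_xy[0] / self.resolution)
        j = int(fingertip_xy[1] / self.resolution)
        p = np.clip(p_contact, 1e-3, 1 - 1e-3)      # avoid infinite log-odds
        self.log_odds[i, j] += np.log(p / (1 - p))  # standard Bayesian update

    def probability(self):
        """Convert log-odds back to occupancy probabilities."""
        return 1.0 / (1.0 + np.exp(-self.log_odds))

# Usage: the contact classifier reports 0.9 contact probability while the
# fingertip is at (0.12, 0.34) m.
belief = BeliefMap()
belief.update((0.12, 0.34), 0.9)
print(belief.probability().max())  # cell belief rises above the 0.5 prior
```

Repeated touches at the same cell accumulate evidence additively in log-odds space, which is why this representation is commonly used for fusing sparse, noisy contact measurements into a global map.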

Visuo-tactile Perception for Dexterous Robotic Manipulation

Author: Maria Bauza Villalonga

Language: en


Release Date: 2022


In this thesis, we develop visuo-tactile perception to enable general and precise robotic manipulation. In particular, we study how to effectively process visual and tactile information so that robots can expand their capabilities while remaining accurate and reliable.

We begin by developing tools for tactile perception. For the task of grasping, we use tactile observations to assess and improve grasp stability. Tactile information also allows us to extract geometric information from contacts, a task-independent feature. By learning to map tactile observations to contact shapes, we show that robots can reconstruct accurate 3D models of objects, which can later be used for pose estimation. We build on the idea of using geometric information from contacts by developing tools that accurately render contact geometry in simulation. This enables a probabilistic approach to pose estimation for novel objects based on matching real visuo-tactile observations to a set of simulated ones. As a result, our method does not rely on real data and yields accurate pose distributions.

Finally, we demonstrate how this approach to perception enables precise manipulation. In particular, we consider the task of precise pick-and-place of novel objects. Combining perception with task-aware planning, we build a robotic system that identifies in simulation which object grasps will facilitate grasping, planning, and perception, and then selects the best one during execution. Our approach adapts to new objects by learning object-dependent models purely in simulation, allowing a robot to manipulate new objects successfully and perform highly accurate placements.
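The pose-estimation step described above, matching real visuo-tactile observations against simulated ones to obtain a pose distribution, can be sketched as follows. This is a hedged illustration under simple assumptions (fixed-length feature vectors and an isotropic Gaussian likelihood), not the thesis's actual matching model.

```python
# Sketch of probabilistic pose estimation by matching one real observation
# feature against features rendered in simulation, one per candidate pose.
# The Gaussian likelihood and feature dimensions are illustrative assumptions.
import numpy as np

def pose_posterior(real_feat, sim_feats, sigma=0.1):
    """Return a normalized distribution over candidate poses.

    real_feat : (D,) feature vector from the real visuo-tactile observation
    sim_feats : (N, D) feature vectors rendered in simulation, one per pose
    """
    sq_dist = np.sum((sim_feats - real_feat) ** 2, axis=1)
    log_lik = -sq_dist / (2.0 * sigma**2)  # isotropic Gaussian log-likelihood
    log_lik -= log_lik.max()               # subtract max for numerical stability
    weights = np.exp(log_lik)
    return weights / weights.sum()         # posterior over the N candidates

# Usage: 500 candidate poses with 32-D contact descriptors; the real
# observation is a noisy copy of candidate 42, so the posterior peaks there.
rng = np.random.default_rng(0)
sim = rng.normal(size=(500, 32))
posterior = pose_posterior(sim[42] + 0.01 * rng.normal(size=32), sim)
print(int(posterior.argmax()))  # -> 42
```

Because the simulated observation set can be generated offline for any object model, this style of matching avoids collecting real training data, which is the property the abstract highlights.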

Robotic Tactile Perception and Understanding

Author: Huaping Liu

Language: en

Publisher: Springer

Release Date: 2018-03-20


This book introduces the challenges of robotic tactile perception and task understanding, and describes an advanced approach based on machine learning and sparse coding techniques. A set of structured sparse coding models is developed to address the issues of dynamic tactile sensing. The book then demonstrates that the proposed framework is effective in solving the problems of multi-finger tactile object recognition, multi-label tactile adjective recognition, and multi-category material analysis, all of which are challenging practical problems in robotics and automation. The proposed sparse coding model can also be used to tackle the challenging visual-tactile fusion recognition problem, and the book develops a series of efficient optimization algorithms to implement the model. It is suitable as a reference for graduate students with a basic knowledge of machine learning, as well as for professional researchers interested in robotic tactile perception and understanding, and in machine learning.
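As a rough illustration of the sparse coding machinery the book builds on, the sketch below encodes a tactile feature vector over a dictionary using ISTA, i.e., plain l1 sparse coding rather than the book's structured variants; the dictionary, step size, and dimensions are illustrative assumptions.

```python
# Plain l1 sparse coding of a tactile feature via ISTA (iterative shrinkage-
# thresholding). The book's structured sparse coding models extend this basic
# formulation; everything here is an illustrative assumption.
import numpy as np

def ista(x, D, lam=0.1, n_iters=200):
    """Approximately solve min_a 0.5*||x - D @ a||^2 + lam*||a||_1."""
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth term
    a = np.zeros(D.shape[1])
    for _ in range(n_iters):
        z = a - D.T @ (D @ a - x) / L                          # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

# Usage: encode a 64-D tactile feature over a 256-atom unit-norm dictionary.
rng = np.random.default_rng(0)
D = rng.normal(size=(64, 256))
D /= np.linalg.norm(D, axis=0)  # normalize dictionary atoms
code = ista(rng.normal(size=64), D)
print(np.count_nonzero(code), "of", code.size, "atoms active")
```

A classifier trained on such sparse codes, rather than on raw tactile signals, is the standard recognition pipeline that structured sparse coding variants then refine.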