Implicit Object Pose Estimation on RGB Images Using Deep Learning Methods


URI: http://hdl.handle.net/10900/146959
http://nbn-resolving.de/urn:nbn:de:bsz:21-dspace-1469599
http://dx.doi.org/10.15496/publikation-88300
Document type: PhD thesis
Date: 2023-10-30
Language: English
Faculty: 7 Mathematisch-Naturwissenschaftliche Fakultät
Department: Informatik
Advisor: Zell, Andreas (Prof. Dr.)
Day of Oral Examination: 2023-09-29
DDC Classification: 004 - Data processing and computer science
Keywords: Deep learning, Object recognition (Objekterkennung)
Other Keywords:
Pose Estimation
Implicit Neural Representations
License: http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=de http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=en

Abstract:

With the rise of robotic and camera systems and the success of deep learning in computer vision, there is growing interest in precisely determining object positions and orientations. This is crucial for tasks like automated bin picking, where a camera sensor analyzes images or point clouds to guide a robotic arm in grasping objects. Pose recognition has broader applications, such as predicting a car's trajectory in autonomous driving or adapting objects in virtual reality to the viewer's perspective. This dissertation focuses on RGB-based pose estimation methods that use depth information only for refinement, which is a challenging problem. Recent advances in deep learning have made it possible to predict object poses in RGB images despite challenges such as object overlap and object symmetries. We introduce two implicit deep-learning-based pose estimation methods for RGB images, covering the entire process from data generation to pose selection. Furthermore, theoretical findings on Fourier embeddings are shown to improve the performance of implicit neural representations, which are then successfully applied to the task of implicit pose estimation.
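To illustrate the kind of Fourier embedding the abstract refers to: implicit neural representations commonly map low-dimensional inputs (e.g. 2-D coordinates) through random Fourier features before the network proper. The sketch below uses a standard formulation, gamma(x) = [cos(2*pi*Bx), sin(2*pi*Bx)] with a random frequency matrix B; the frequency scale, dimensions, and function names here are illustrative assumptions, not the thesis's specific design.

```python
import numpy as np

def fourier_embedding(x, B):
    """Map coordinates x (N, d) to Fourier features (N, 2m).

    B is an (m, d) matrix of random frequencies; higher-magnitude
    rows of B let a downstream network fit higher-frequency detail.
    """
    proj = 2.0 * np.pi * x @ B.T          # (N, m) projected phases
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
B = rng.normal(scale=10.0, size=(64, 2))  # 64 random frequencies for 2-D input (scale is a tunable assumption)
coords = rng.uniform(size=(5, 2))         # 5 sample coordinates in [0, 1)^2
emb = fourier_embedding(coords, B)
print(emb.shape)                          # (5, 128)
```

The embedded coordinates, rather than the raw ones, would then be fed to the implicit network; the choice of frequency scale trades off smoothness against high-frequency detail in the learned representation.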
