
A Kendall Shape Space Approach to 3D Shape Estimation from 2D Landmarks

3D shapes provide substantially more information than 2D images. However, acquiring 3D shapes is often far more difficult, and sometimes impossible, compared with acquiring 2D images, making it necessary to derive the 3D shape from 2D images. Although this is, in general, a mathematically ill-posed problem, it can be solved by constraining the problem formulation using prior information. Here, we present a new approach based on Kendall's shape space to reconstruct 3D shapes from single monocular 2D images. The work is motivated by an application to study the feeding behavior of the basking shark, an endangered species whose massive size and mobility render 3D shape data nearly impossible to obtain, hampering understanding of their feeding behaviors and ecology. 2D images of these animals in feeding position, however, are readily available. We compare our approach with state-of-the-art shape-based approaches, both on human stick models and on shark head skeletons. Using a small set of training shapes, we show that the Kendall shape space approach is substantially more robust than previous methods and always results in plausible shapes. This is essential for the motivating application, in which specimens are rare and therefore only a few training shapes are available.
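The Kendall shape space mentioned in the abstract treats two landmark configurations as the same shape when they differ only by translation, scale, and rotation. As a rough illustration (not the authors' method; the paper's 2D-to-3D estimation itself is omitted, and all function names are hypothetical), the following NumPy sketch maps a landmark matrix to the pre-shape sphere and computes the geodesic shape distance between two configurations:

```python
import numpy as np

def to_preshape(landmarks):
    """Map a k x m landmark matrix to Kendall's pre-shape sphere by
    removing translation (centering) and scale (unit Frobenius norm)."""
    centered = landmarks - landmarks.mean(axis=0)
    norm = np.linalg.norm(centered)
    if norm == 0.0:
        raise ValueError("degenerate configuration: all landmarks coincide")
    return centered / norm

def shape_distance(x, y):
    """Geodesic (Procrustes) distance between two pre-shapes x, y (k x m):
    quotient out rotations by optimally aligning y to x, then take the
    arc length between the aligned points on the pre-shape sphere."""
    u, s, vt = np.linalg.svd(x.T @ y)
    if np.linalg.det(u @ vt) < 0:
        s[-1] = -s[-1]  # restrict the alignment to rotations (no reflections)
    return np.arccos(np.clip(s.sum(), -1.0, 1.0))

# Toy usage: a rotated, rescaled, slightly perturbed copy of the same
# 4-landmark configuration in 3D should be close in shape space.
rng = np.random.default_rng(0)
a = rng.normal(size=(4, 3))
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
q *= np.sign(np.linalg.det(q))  # force a proper rotation
b = 2.5 * (a + 0.01 * rng.normal(size=(4, 3))) @ q
print(shape_distance(to_preshape(a), to_preshape(b)))
```

Because pose and scale are quotiented out, such a distance depends only on shape, which is what makes the space a natural home for priors learned from a small set of training shapes.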

Metadata
Author: Martha Paskin, Mason Dean, Daniel Baum, Christoph von Tycowicz
Document Type: In Proceedings
Parent Title (English): Computer Vision -- ECCV 2022
First Page: 363
Last Page: 379
Publisher: Springer Nature Switzerland
Year of first publication: 2022
ArXiv Id: http://arxiv.org/abs/2207.12687
DOI: https://doi.org/10.1007/978-3-031-20086-1_21