Fusing time-of-flight depth and color for real-time segmentation and tracking

Amit Bleiweiss*, Michael Werman

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

66 Scopus citations

Abstract

We present an improved framework for real-time segmentation and tracking that fuses depth and RGB color data. This fusion resolves common failure cases in tracking and segmentation of RGB-only images, such as occlusions, fast motion, and objects of similar color. Our real-time mean-shift-based algorithm outperforms the current state of the art, with significantly better results in difficult scenarios.
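The abstract names mean shift as the core tracker but gives no implementation detail. As a minimal, hypothetical sketch (not the authors' actual method): mean shift iteratively moves a search window to the centroid of a per-pixel weight image, and fusing the two cues can be as simple as multiplying a color-likelihood map with a depth-likelihood map before tracking. All function names and the multiplicative fusion rule below are illustrative assumptions.

```python
import numpy as np

def fuse(color_prob, depth_prob):
    """Illustrative fusion rule (an assumption, not the paper's method):
    combine per-pixel color and depth likelihood maps multiplicatively,
    so a pixel must match both cues to receive high weight."""
    return color_prob * depth_prob

def mean_shift(weights, window, max_iter=20):
    """Shift a rectangular window (x, y, w, h) toward the centroid of
    `weights` (e.g. a fused back-projection image) until it stops moving.
    Returns the converged window."""
    x, y, w, h = window
    for _ in range(max_iter):
        roi = weights[y:y + h, x:x + w]
        total = roi.sum()
        if total == 0:          # no support under the window; give up
            break
        ys, xs = np.mgrid[0:h, 0:w]
        cx = (xs * roi).sum() / total   # centroid of weight mass
        cy = (ys * roi).sum() / total   # in window coordinates
        dx = int(round(cx - (w - 1) / 2.0))
        dy = int(round(cy - (h - 1) / 2.0))
        x = int(np.clip(x + dx, 0, weights.shape[1] - w))
        y = int(np.clip(y + dy, 0, weights.shape[0] - h))
        if dx == 0 and dy == 0:         # converged on a mode
            break
    return (x, y, w, h)
```

In a full tracker the weight image would be rebuilt each frame from a joint color-and-depth histogram of the target, which is where the depth cue disambiguates same-colored or occluding objects.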

Original language: English
Title of host publication: Dynamic 3D Imaging - DAGM 2009 Workshop, Dyn3D 2009, Proceedings
Pages: 58-69
Number of pages: 12
State: Published - 2009
Event: DAGM Workshop on Dynamic 3D Imaging, Dyn3D 2009 - Jena, Germany
Duration: 9 Sep 2009 - 9 Sep 2009

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 5742 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: DAGM Workshop on Dynamic 3D Imaging, Dyn3D 2009
Country/Territory: Germany
City: Jena
Period: 9/09/09 - 9/09/09

