Understanding mechanical motion: From images to behaviors

Tzachi Dar, Leo Joskowicz, Ehud Rivlin

Research output: Contribution to journal › Article › peer-review

Abstract

We present an algorithm for producing behavior descriptions of planar fixed-axes mechanical motions from image sequences using a formal behavior language. The language, which covers the most important class of mechanical motions, symbolically captures the qualitative aspects of objects that translate and rotate along axes that are fixed in space. The algorithm exploits the structure of these motions to robustly recover the objects' behaviors. It starts by identifying the independently moving objects, their motion parameters, and their variation with respect to time using normal optical flow analysis, iterative motion segmentation, and motion parameter estimation. It then produces a formal description of their behavior by identifying individual uniform motion events and simultaneous motion changes, and parsing them with a motion grammar. We demonstrate the algorithm on three sets of image sequences: mechanisms, everyday situations, and a robot manipulation scenario.
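The abstract outlines the pipeline at a high level: per-frame motion parameters are recovered from normal optical flow, and the parameter traces are then segmented into uniform motion events that a motion grammar can parse. As a rough illustration only (not the authors' implementation), the Python/NumPy sketch below estimates an angular velocity for a single object rotating about a known fixed center directly from the brightness-constancy constraint, and groups the per-frame estimates into symbolic events. The fixed center, threshold, and event labels ('cw', 'ccw', 'stationary') are assumptions made for the sketch; the paper's formal behavior language, iterative segmentation, and grammar parsing are not reproduced here.

```python
# Illustrative sketch (not the paper's implementation): estimate a per-frame
# angular velocity for a rotation about a known fixed center, then segment
# the estimates into uniform motion events. Thresholds and labels are assumed.
import numpy as np

def gradients(prev, curr):
    """Spatial and temporal derivatives for the brightness-constancy
    constraint Ix*u + Iy*v + It = 0 (the basis of normal-flow analysis)."""
    Iy, Ix = np.gradient(curr.astype(float))
    It = curr.astype(float) - prev.astype(float)
    return Ix, Iy, It

def estimate_rotation(prev, curr, center):
    """Least-squares angular velocity for rotation about a fixed center,
    using the motion model u = -omega*ry, v = omega*rx at every pixel.
    Sign convention follows image coordinates (y grows downward)."""
    Ix, Iy, It = gradients(prev, curr)
    h, w = curr.shape
    ys, xs = np.mgrid[0:h, 0:w]
    rx, ry = xs - center[0], ys - center[1]
    a = -Ix * ry + Iy * rx                # coefficient of omega in the constraint
    return -(a * It).sum() / ((a * a).sum() + 1e-8)

def segment_uniform_events(omegas, tol=1e-3):
    """Group consecutive frames with a roughly constant sign of angular
    velocity into symbolic events: 'cw', 'ccw', or 'stationary'."""
    if len(omegas) == 0:
        return []
    label = lambda w: 'stationary' if abs(w) < tol else ('ccw' if w > 0 else 'cw')
    events, start = [], 0
    for i in range(1, len(omegas)):
        if label(omegas[i]) != label(omegas[start]):
            events.append((start, i, label(omegas[start])))
            start = i
    events.append((start, len(omegas), label(omegas[start])))
    return events
```

A pipeline along the lines of the paper would repeat such estimation for each independently moving object, handle translations along fixed axes as well as rotations, detect simultaneous motion changes, and feed the resulting event sequence to the motion-grammar parser.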

Original language: English
Pages (from-to): 147-179
Number of pages: 33
Journal: Artificial Intelligence
Volume: 112
Issue number: 1
DOIs
State: Published - Aug 1999

Bibliographical note

Funding Information:
Leo Joskowicz is supported in part by grant 98/536 from the Israeli Academy of Science, by a grant from the Authority for Research and Development, The Hebrew University of Jerusalem, Israel, and by a Ford University Research Grant (the Ford ADAPT2000 project).
