Nonlinear distance and velocity estimation from optic flow
ORAL
Abstract
Vision is widely used by animals and robots for position and velocity estimation. Robots typically use stereopsis or feature recognition to extract distance from image data. Stereopsis, however, requires two calibrated cameras separated by a sufficiently large baseline, and feature-based methods require computationally expensive image processing. These limitations preclude small animals and robots from estimating distance in this manner. What alternative solutions exist? I will review a nonlinear estimation method for separating distance and velocity information from the optic flow generated by a single, dynamically moving camera. Unfortunately, this approach requires calculating the time derivative of optic flow, a notoriously noisy signal. To overcome this limitation, we use a recent neural network optic flow estimator, FlowNet2, together with total variation regularization to estimate smooth derivatives. These steps allow us to independently measure distance and velocity from a single dynamically moving camera. We propose that this approach to integrating multiple sensory modalities during dynamic motion, which we refer to as idiokinemetry, is likely a general feature of how animals perceive the world, and may inspire the development of smaller, more robust robotic systems.
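
The abstract does not give implementation details, but the total variation regularization step it mentions can be illustrated with a minimal sketch. The Python code below (using NumPy and the cvxpy convex optimization library, not necessarily the authors' tooling) estimates a smooth time derivative of a noisy one-dimensional signal, such as a single optic flow component; the function name tv_regularized_derivative and the weight gamma are hypothetical illustrations, not part of the presented work.

import numpy as np
import cvxpy as cp

def tv_regularized_derivative(y, dt, gamma=0.1):
    """Estimate a smooth derivative of a noisy 1-D signal y sampled at interval dt.

    Solves: minimize ||y[0] + cumsum(u)*dt - y||_2^2 + gamma * ||diff(u)||_1,
    i.e., the candidate derivative u must integrate back to the data while the
    total variation of u is penalized, suppressing noise in the derivative.
    """
    n = len(y)
    u = cp.Variable(n)
    # Rectangle-rule integration of the candidate derivative
    y_hat = y[0] + cp.cumsum(u) * dt
    objective = cp.Minimize(cp.sum_squares(y_hat - y) + gamma * cp.norm1(cp.diff(u)))
    cp.Problem(objective).solve()
    return u.value

# Example: differentiate a noisy sinusoid standing in for one optic flow signal
dt = 0.01
t = np.arange(0, 2, dt)
flow = np.sin(2 * np.pi * t) + 0.05 * np.random.randn(len(t))
flow_dot = tv_regularized_derivative(flow, dt, gamma=0.05)

The choice of gamma trades off smoothness of the recovered derivative against fidelity to the measured signal, which is the essential tension when differentiating noisy optic flow.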
Presenters
- Floris van Breugel, University of Nevada, Reno

Authors
- Floris van Breugel, University of Nevada, Reno