Automated video analysis of animal movements using Gabor orientation filters

Abstract

To quantify locomotory behavior, tools for determining the location and shape of an animal's body are a first requirement. Video recording is a convenient technology for storing raw movement data, but extracting body coordinates from video recordings is a nontrivial task. The algorithm described in this paper solves this task for videos of leeches or other quasi-linear animals in a manner inspired by the mammalian visual processing system: the video frames are fed through a bank of Gabor filters, which locally detect segments of the animal at a particular orientation. The algorithm assumes that the image location with maximal filter output lies on the animal's body and traces the body's shape out in both directions from there. The algorithm successfully extracted location and shape information from video clips of swimming leeches, as well as from still photographs of swimming and crawling snakes. A MATLAB implementation with a graphical user interface is available online, and should make this algorithm conveniently usable in many other contexts. © The Author(s) 2010.
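To illustrate the filtering step the abstract describes, feeding a frame through a bank of Gabor filters at several orientations and taking the location of maximal output as a point on the body, here is a minimal NumPy sketch. It is not the authors' MATLAB implementation: the kernel parameters (size, wavelength, sigma), the isotropic Gaussian envelope, and the pure-NumPy "valid" convolution are illustrative assumptions, and a real-valued cosine carrier is used rather than a quadrature-pair energy measure.

```python
import numpy as np

def gabor_kernel(size, theta, wavelength, sigma):
    # Real-valued Gabor: cosine carrier under an isotropic Gaussian window.
    # theta is the orientation of the bar-like segment the kernel prefers.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    yr = -x * np.sin(theta) + y * np.cos(theta)  # axis across the bar
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * yr / wavelength)
    k = envelope * carrier
    return k - k.mean()  # zero mean: flat image regions give no response

def bank_response(image, thetas, size=21, wavelength=8.0, sigma=4.0):
    # "Valid" convolution of the frame with every kernel in the bank,
    # via a sliding-window view and einsum (pure NumPy, no SciPy needed).
    win = np.lib.stride_tricks.sliding_window_view(image, (size, size))
    kernels = [gabor_kernel(size, t, wavelength, sigma) for t in thetas]
    return np.stack([np.einsum('ijkl,kl->ij', win, k) for k in kernels])

# Synthetic frame: a horizontal bright bar standing in for a body segment.
frame = np.zeros((60, 60))
frame[28:32, 10:50] = 1.0

thetas = np.linspace(0.0, np.pi, 8, endpoint=False)
resp = np.abs(bank_response(frame, thetas))  # magnitude: ignore contrast sign
best = np.unravel_index(np.argmax(resp), resp.shape)
print("orientation (deg):", np.degrees(thetas[best[0]]))
print("body point (row, col):", (best[1] + 10, best[2] + 10))  # +half offset
```

For the horizontal bar, the maximal response lands on the theta = 0 kernel at a row near the bar's center; the full algorithm would take that point as its seed and then trace the body outward in both directions.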

Citation (APA)

Wagenaar, D. A., & Kristan, W. B. (2010). Automated video analysis of animal movements using Gabor orientation filters. Neuroinformatics, 8(1), 33–42. https://doi.org/10.1007/s12021-010-9062-1
