
Data from: Multichannel Stroboscopic Videography (MSV): A technique for visualizing multiple channels for behavioral measurements.

Citation

Soto, Alberto; McHenry, Matthew; Po, Theodora (2019), Data from: Multichannel Stroboscopic Videography (MSV): A technique for visualizing multiple channels for behavioral measurements., UC Irvine Dash, Dataset, https://doi.org/10.7280/D1R67V

Abstract

Multichannel visualization allows biologists to integrate disparate types of information about an organism using distinct optical conditions for each channel. The advantages of a multichannel approach have largely not been applied to behavioral studies due to the challenge posed by a moving subject. Here we address this challenge with the technique of Multichannel Stroboscopic Videography (MSV), which operates by synchronizing the activation of multiple light sources with the exposures of a video camera to generate multichannel visualizations. We illustrate the utility of this approach with experiments on a walking cockroach (Gromphadorhina portentosa) and a swimming fish (Danio rerio). By illuminating the body and points of contact for the feet of the cockroach separately, we were able to produce high-contrast images for automated kinematic analysis. For the swimming fish we measured body kinematics with transmitted illumination in one channel and water flow with Particle Image Velocimetry using laser light in another channel. These experiments demonstrate the enhanced potential for automated analysis as well as a capacity to integrate changes in physiological or environmental conditions in a freely behaving animal through MSV. MSV has the potential for broad utility in biological and engineering research.
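Because MSV synchronizes each light source with a distinct camera exposure, the recorded video interleaves the channels frame by frame, and the channels can be recovered by de-interleaving the frame sequence. The following is a minimal sketch of that de-interleaving step, assuming the channels strictly alternate in acquisition order; the function name and array layout are illustrative and are not taken from the dataset's MATLAB scripts.

```python
import numpy as np

def demux_channels(frames, n_channels):
    """Split an interleaved MSV frame stack into per-channel stacks.

    Assumes light source k was active during exposures k, k + n_channels,
    k + 2*n_channels, ... (a strictly alternating strobe sequence).

    frames: array of shape (n_frames, height, width)
    Returns a list of n_channels arrays, one stack per channel.
    """
    frames = np.asarray(frames)
    return [frames[k::n_channels] for k in range(n_channels)]

# Example with synthetic data: six 4x4 frames, frame i filled with value i,
# recorded with two alternating channels (e.g., body vs. flow illumination).
frames = np.arange(6)[:, None, None] * np.ones((1, 4, 4))
body, flow = demux_channels(frames, 2)
# body holds frames 0, 2, 4; flow holds frames 1, 3, 5.
```

Each de-interleaved stack can then be analyzed under the optical conditions of its own channel (e.g., kinematic tracking on one, Particle Image Velocimetry on the other). Note that the two channels are offset by one exposure period, so time stamps should be assigned per original frame index rather than per channel index.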

Usage Notes

The data package contains the raw image sequences for both experimental tests of MSV (cockroach and zebrafish), the MATLAB scripts used to analyze the images, and the output data from those scripts. These data have not been post-processed; the manuscript figures contain processed versions of these data.

Funding

National Science Foundation, Award: DGE-1839285

National Science Foundation, Award: IOS-1354842

Office of Naval Research, Award: N00014-15-1-2249

Office of Naval Research, Award: N00014-19-1-2035