Here, we present a protocol to demonstrate a behavioral assay that quantifies how alternative visual features, such as motion cues, influence directional decisions in fish. Representative data are presented on the speed and accuracy with which Golden Shiner (Notemigonus crysoleucas) follow virtual fish movements.
Collective animal behavior arises from individual motivations and social interactions that are critical for individual fitness. Fish have long inspired investigations into collective motion, specifically their ability to integrate environmental and social information across ecological contexts. This demonstration illustrates techniques used for quantifying the behavioral responses of fish, in this case Golden Shiner (Notemigonus crysoleucas), to visual stimuli using computer visualization and digital image analysis. Recent advancements in computer visualization allow for empirical testing in the lab, where visual features can be controlled and finely manipulated to isolate the mechanisms of social interactions. The purpose of this method is to isolate visual features that can influence the directional decisions of the individual, whether solitary or in groups. This protocol provides specifics on the physical Y-maze domain, the recording equipment, the settings and calibrations of the projector and animation, the experimental steps, and the data analyses. These techniques demonstrate that computer animation can elicit biologically meaningful responses. Moreover, the techniques are easily adaptable to test alternative hypotheses, domains, and species for a broad range of experimental applications. The use of virtual stimuli supports the reduction and replacement of the live animals required and consequently lowers laboratory overhead.
This demonstration tests the hypothesis that small relative differences in the movement speeds (2 body lengths per second) of virtual conspecifics will improve the speed and accuracy with which shiners follow the directional cues provided by the virtual silhouettes. Results show that shiners' directional decisions are significantly affected by increases in the speed of the visual cues, even in the presence of background noise (67% image coherency). In the absence of any motion cues, subjects chose their directions at random. The relationship between decision speed and cue speed was variable, and increases in cue speed had a modestly disproportionate influence on directional accuracy.
Animals sense and interpret their habitat continuously to make informed decisions when interacting with others and navigating noisy surroundings. Individuals can enhance their situational awareness and decision making by integrating social information into their actions. Social information, however, largely stems from inference through unintended cues (e.g., sudden maneuvers to avoid a predator), which can be unreliable, rather than through direct signals that have evolved to communicate specific messages (e.g., the waggle dance in honey bees)1. Identifying how individuals rapidly assess the value of social cues, or any sensory information, can be a challenging task for investigators, particularly when individuals are traveling in groups. Vision plays an important role in governing social interactions2,3,4 and studies have inferred the interaction networks that may arise in fish schools based on each individual’s field of view5,6. Fish schools are dynamic systems, however, making it difficult to isolate individual responses to particular features, or neighbor behaviors, due to the inherent collinearities and confounding factors that arise from the interactions among group members. The purpose of this protocol is to complement current work by isolating how alternative visual features can influence the directional decisions of individuals traveling alone or within groups.
The benefit of the current protocol is that it combines a manipulative experiment with computer visualization techniques to isolate the elementary visual features an individual may experience in nature. Specifically, the Y-maze (Figure 1) is used to collapse the directional choice to a binary response and to introduce computer-animated images designed to mimic the swimming behaviors of virtual neighbors. These images are projected up from below the maze to mimic the silhouettes of conspecifics swimming beneath one or more subjects. The visual characteristics of these silhouettes, such as their morphology, speed, coherency, and swimming behavior, are easily tailored to test alternative hypotheses7.
This paper demonstrates the utility of this approach by isolating how individuals of a model social fish species, the Golden Shiner (Notemigonus crysoleucas), respond to the relative speed of virtual neighbors. The focus of the protocol here is on whether the directional influence of virtual neighbors changes with their speed and, if so, on quantifying the form of the observed relationship. In particular, the directional cue is generated by having a fixed proportion of the silhouettes act as leaders and move ballistically towards one arm or the other. The remaining silhouettes act as distractors by moving about at random, providing background noise; the ratio of leaders to distractors captures the coherency of the directional cues and can be tuned accordingly. Distractor silhouettes remain confined to the decision area (“DA”, Figure 1A) by having the silhouettes reflect off of its boundary. Leader silhouettes, however, are allowed to leave the DA and enter their designated arm before slowly fading away once they have traversed one-third of the arm's length. As leaders leave the DA, new leader silhouettes take their place and retrace their exact path to ensure that the leader/distractor ratio remains constant in the DA throughout the experiment.
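To make this animation logic concrete, the sketch below is a minimal Processing (v 3) example of the leader/distractor scheme described above. All geometry, counts, speeds, silhouette sizes, and the fade rule are illustrative placeholders rather than the published parameter values, and the Y-maze arm is reduced to a single direction vector; the full program logic is given in Figure 3 and in Lemasson et al. (2018)7.

```processing
// Minimal sketch of the leader/distractor animation logic. Geometry, speeds,
// counts, and the fade rule are illustrative placeholders, not the published values.

int nLeaders = 4;                 // leader/distractor ratio sets cue coherency
int nDistractors = 2;
float daRadius = 150;             // decision-area (DA) radius in pixels
float armLength = 300;            // arm length in pixels
PVector daCenter;
PVector armDir;                   // unit vector pointing down the cued arm
ArrayList<Silhouette> school = new ArrayList<Silhouette>();

void setup() {
  size(800, 600);
  daCenter = new PVector(width / 2.0, height / 2.0);
  armDir = PVector.fromAngle(radians(-30));          // e.g., the right arm of the Y
  for (int i = 0; i < nLeaders; i++) school.add(new Silhouette(true));
  for (int i = 0; i < nDistractors; i++) school.add(new Silhouette(false));
}

void draw() {
  background(255);                // bright floor so projected silhouettes read as dark shapes
  for (Silhouette s : school) {
    s.update();
    s.display();
  }
}

class Silhouette {
  boolean leader;
  PVector startPos, pos, vel;
  float speed = 2.0;              // pixels per frame; scale to BL/s via calibration
  float alpha = 255;

  Silhouette(boolean leader) {
    this.leader = leader;
    PVector offset = PVector.random2D().mult(random(daRadius * 0.8));
    startPos = PVector.add(daCenter, offset);
    reset();
  }

  void reset() {
    pos = startPos.copy();        // replacement leaders retrace the same path
    vel = leader ? armDir.copy().mult(speed) : PVector.random2D().mult(speed);
    alpha = 255;
  }

  void update() {
    pos.add(vel);
    if (leader) {
      // leaders move ballistically into the arm, fade after 1/3 of its length, then respawn
      float traveled = PVector.sub(pos, daCenter).dot(armDir) - daRadius;
      if (traveled > armLength / 3.0) {
        alpha -= 15;
        if (alpha <= 0) reset();
      }
    } else {
      // distractors stay confined to the DA by reflecting off its boundary
      PVector fromCenter = PVector.sub(pos, daCenter);
      if (fromCenter.mag() > daRadius) {
        PVector n = fromCenter.normalize();
        vel.sub(PVector.mult(n, 2 * vel.dot(n)));    // mirror velocity about the boundary normal
        pos = PVector.add(daCenter, PVector.mult(n, daRadius));
      }
    }
  }

  void display() {
    noStroke();
    fill(0, alpha);
    ellipse(pos.x, pos.y, 24, 10); // simple elongated silhouette (placeholder size)
  }
}
```

In the actual experiment, the corresponding parameters (silhouette size, cue speed in body lengths per second, coherency, and maze geometry) are set from the calibration steps in sections 3 through 5.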
The use of virtual fish allows for the control of the visual sensory information, while monitoring the directional response of the subject, which may reveal novel features of social navigation, movement, or decision making in groups. The approach used here can be applied to a broad range of questions, such as effects of sublethal stress or predation on social interactions, by manipulating the computer animation to produce behavioral patterns of varying complexity.
All experimental protocols were approved by the Institutional Animal Care and Use Committee of the Environmental Laboratory, US Army Engineer Research and Development Center, Vicksburg, MS, USA (IACUC# 2013-3284-01).
1. Sensory maze design
2. Recording equipment
3. Calibrate lighting, projector, and camera settings
4. Calibrate visual projection program: background
5. Calibrate visual projection program: visual stimuli
NOTE: Rendering and animating the visual stimuli can also be done in Processing, using the steps below as guides along with the platform’s tutorials. A schematic of the current program’s logic is provided in Figure 3, and additional details can be found in Lemasson et al. (2018)7. The following steps provide examples of the calibration steps taken in the current experiment. A minimal sketch of the scale calibration is shown after this note.
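As an illustration of the scale calibration, the Processing snippet below converts a measured projection width into a pixels-per-millimeter factor and derives the silhouette size and cue speed from it. The projected width, projector resolution, and frame rate are hypothetical values for a generic setup; only the 2 BL/s cue speed and the mean body length are taken from this experiment.

```processing
// Hypothetical scale calibration (placeholder values for a generic setup,
// not the settings used in the published experiment).
float projectedWidthMM = 500.0;   // measured width of a projected test pattern on the maze floor (mm)
float projectedWidthPX = 1024.0;  // number of projector pixels spanning that same width
float pxPerMM = projectedWidthPX / projectedWidthMM;

float bodyLengthMM = 63.4;        // mean shiner body length from the representative results
float silhouetteLengthPX = bodyLengthMM * pxPerMM;                 // silhouette drawn at 1 BL
float cueSpeedPXperFrame = (2.0 * bodyLengthMM * pxPerMM) / 60.0;  // 2 BL/s at an assumed 60 fps

void setup() {
  size(1024, 768);
  frameRate(60);
  println("px per mm: " + pxPerMM);
  println("silhouette length (px): " + silhouetteLengthPX);
  println("cue speed (px/frame): " + cueSpeedPXperFrame);
}
```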
6. Animal preparation
7. Experimental procedure
8. Data analysis
Hypothesis and design
To demonstrate the utility of this experimental system we tested the hypothesis that the accuracy with which Golden Shiner follow a visual cue will improve with the speed of that cue. Wild-type Golden Shiner were used (N = 16; body lengths, BL, and wet weights, WW, were 63.4 ± 3.5 mm and 1.8 ± 0.3 g, respectively). The coherency of the visual stimuli (leader/distractor r...
Visual cues are known to trigger an optomotor response in fish exposed to black and white gratings13 and there is increasing theoretical and empirical evidence that neighbor speed plays an influential role in governing the dynamical interactions observed in fish schools7,14,15,16,17. Contrasting hypotheses exist to explain how individua...
All authors contributed to the experimental design, analyses, and writing of the paper. A.C.U. and C.M.W. set up the experiment and collected the data. The authors have nothing to disclose.
We thank Bryton Hixson for setup assistance. This program was supported by the Basic Research Program, Environmental Quality and Installations (EQI; Dr. Elizabeth Ferguson, Technical Director), US Army Engineer Research and Development Center.
Name | Company | Catalog Number | Comments
Black and white IP camera | Noldus, Leesburg, VA, USA | | https://www.noldus.com/
Extruded aluminum | 80/20 Inc., Columbia City, IN, USA | 3030-S | https://www.8020.net. 3.00" x 3.00" smooth T-slotted profile, eight open T-slots
Finfish Starter with Vpak, 1.5 mm extruded pellets | Zeigler Bros. Inc., Gardners, PA, USA | | http://www.zeiglerfeed.com/
Golden shiners | Saul Minnow Farm, AR, USA | | http://saulminnow.com/
ImageJ (v 1.52h) freeware | National Institutes of Health (NIH), USA | | https://imagej.nih.gov/ij/
LED track lighting | Lithonia Lighting, Conyers, GA, USA | BR20MW-M4 | https://lithonia.acuitybrands.com/residential-track
Oracle 651 white cut vinyl | 651Vinyl, Louisville, KY, USA | 651-010M-12:5ft | http://www.651vinyl.com. Can order various sizes.
PowerLite 570 overhead projector | Epson, Long Beach, CA, USA | V11H605020 | https://epson.com/For-Work/Projectors/Classroom/PowerLite-570-XGA-3LCD-Projector/p/V11H605020
Processing (v 3) freeware | Processing Foundation | | https://processing.org/
R (3.5.1) freeware | The R Project for Statistical Computing | | https://www.r-project.org/
Ultra-white 360 theater screen | Alternative Screen Solutions, Clinton, MI, USA | 1950 | https://www.gooscreen.com. Must call for special cut size.
Z-Hab system | Pentair Aquatic Ecosystems, Apopka, FL, USA | | https://pentairaes.com/. Call for details and sizing.