Rheotaxis revisited: a multi-behavioral and multisensory perspective on how fish orient to flow

2020 ◽  
Vol 223 (23) ◽  
pp. jeb223008
Author(s):  
Sheryl Coombs ◽  
Joe Bak-Coleman ◽  
John Montgomery

ABSTRACT
Here, we review fish rheotaxis (orientation to flow) with the goal of placing it within a larger behavioral and multisensory context. Rheotaxis is a flexible behavior that is used by fish in a variety of circumstances: to search for upstream sources of current-borne odors, to intercept invertebrate drift and, in general, to conserve energy while preventing downstream displacement. Sensory information available for rheotaxis includes water-motion cues to the lateral line and body-motion cues to visual, vestibular or tactile senses when fish are swept downstream. Although rheotaxis can be mediated by a single sense, each sense has its own limitations. For example, lateral line cues are limited by the spatial characteristics of flow, visual cues by water visibility, and vestibular and other body-motion cues by the ability of fish to withstand downstream displacement. The ability of multiple senses to compensate for any single-sense limitation enables rheotaxis to persist over a wide range of sensory and flow conditions. Here, we propose a mechanism of rheotaxis that can be activated in parallel by one or more senses; a major component of this mechanism is directional selectivity of central neurons to broad patterns of water and/or body motions. A review of central mechanisms for vertebrate orienting behaviors and optomotor reflexes reveals several sensorimotor integration sites in the CNS that could be involved in rheotaxis. As such, rheotaxis provides an excellent opportunity for understanding the multisensory control of a simple vertebrate behavior and how a simple motor act is integrated with others to form complex behaviors.

2000 ◽  
Vol 84 (6) ◽  
pp. 2984-2997 ◽  
Author(s):  
Per Jenmalm ◽  
Seth Dahlstedt ◽  
Roland S. Johansson

Most objects that we manipulate have curved surfaces. We have analyzed how subjects, during a prototypical manipulatory task, use visual and tactile sensory information for adapting fingertip actions to changes in object curvature. Subjects grasped an elongated object at one end using a precision grip and lifted it while instructed to keep it level. The principal load of the grasp was tangential torque due to the location of the center of mass of the object in relation to the horizontal grip axis joining the centers of the opposing grasp surfaces. The curvature strongly influenced the grip forces required to prevent rotational slips. Likewise, the curvature influenced the rotational yield of the grasp that developed under the tangential torque load due to the viscoelastic properties of the fingertip pulps. Subjects scaled the grip forces parametrically with object curvature for grasp stability. Moreover, in a curvature-dependent manner, subjects twisted the grasp around the grip axis by a radial flexion of the wrist to keep the desired object orientation despite the rotational yield. To adapt these fingertip actions to object curvature, subjects could use both vision and tactile sensibility integrated with predictive control. During combined blindfolding and digital anesthesia, however, the motor output failed to predict the consequences of the prevailing curvature. Subjects used vision to identify the curvature for efficient feedforward retrieval of grip force requirements before executing the motor commands. Digital anesthesia caused little impairment of grip force control when subjects had vision available, but the adaptation of the twist became delayed. Visual cues about the form of the grasp surface obtained before contact were used to scale the grip force, whereas the scaling of the twist depended on visual cues related to object movement. Thus subjects apparently relied on different visuomotor mechanisms for adaptation of grip force and grasp kinematics. In contrast, blindfolded subjects used tactile cues about the prevailing curvature, obtained after contact with the object, for feedforward adaptation of both grip force and twist. We conclude that humans use both vision and tactile sensibility for feedforward parametric adaptation of grip forces and grasp kinematics to object curvature. Normal control of the twist action, however, requires digital afferent input, and different visuomotor mechanisms support the control of the grasp twist and the grip force. This differential use of vision may have a bearing on the two-stream model of human visual processing.
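As a back-of-envelope sketch of the torque load described in the abstract (the symbols and the qualitative slip condition are our additions, not the authors' model):

```latex
% m : object mass,  g : gravitational acceleration,
% d : horizontal distance from the grip axis to the center of mass.
% Tangential torque load about the grip axis:
\tau = m\,g\,d
% Rotational slip is avoided while the load stays below the maximum
% resistive torque of the grasp, which grows with grip force N and
% (per the study) demands larger N at higher surface curvature \kappa:
\lvert \tau \rvert \le \tau_{\max}(N,\kappa),\qquad
\frac{\partial \tau_{\max}}{\partial N} > 0
```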


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Géraldine Fauville ◽  
Anna C. M. Queiroz ◽  
Erika S. Woolsey ◽  
Jonathan W. Kelly ◽  
Jeremy N. Bailenson

Abstract
Research about vection (illusory self-motion) has investigated a wide range of sensory cues and employed various methods and equipment, including use of virtual reality (VR). However, there is currently no research in the field of vection on the impact of floating in water while experiencing VR. Aquatic immersion presents a new and interesting method to potentially enhance vection by reducing conflicting sensory information that is usually experienced when standing or sitting on a stable surface. This study compares vection, visually induced motion sickness, and presence among participants experiencing VR while standing on the ground or floating in water. Results show that vection was significantly enhanced for the participants in the Water condition, whose judgments of self-displacement were larger than those of participants in the Ground condition. No differences in visually induced motion sickness or presence were found between conditions. We discuss the implications of this new type of VR experience for the fields of VR and vection while also discussing future research questions that emerge from our findings.


2016 ◽  
Author(s):  
Janek Meyer ◽  
Hannes Renzsch ◽  
Kai Graf ◽  
Thomas Slawig

While plain vanilla OpenFOAM has strong capabilities with regards to quite a few typical CFD tasks, some problems actually require additional bespoke solvers and numerics for efficient computation of high-quality results. One of the fields requiring these additions is the computation of large-scale free-surface flows as found e.g. in naval architecture. This holds especially for the flow around typical modern yacht hulls, often planing, sometimes with surface-piercing appendages. Particular challenges include, but are not limited to, breaking waves, sharpness of interface, numerical ventilation (aka streaking) and a wide range of scales of flow phenomena. A new OF-based application including newly implemented discretization schemes, gradient computation and rigid body motion computation is described. In the following, the new code will be validated against published experimental data; the effect on accuracy, computational time and solver stability will be shown by comparison to standard OF solvers (interFoam / interDyMFoam) and Star CCM+. The code's capabilities to simulate complex "real-world" flows are shown on a well-known racing yacht design.


2003 ◽  
Vol 89 (1) ◽  
pp. 390-400 ◽  
Author(s):  
L. H. Zupan ◽  
D. M. Merfeld

Sensory systems often provide ambiguous information. For example, otolith organs measure gravito-inertial force (GIF), the sum of gravitational force and inertial force due to linear acceleration. However, according to Einstein's equivalence principle, a change in gravitational force due to tilt is indistinguishable from a change in inertial force due to translation. Therefore the central nervous system (CNS) must use other sensory cues to distinguish tilt from translation. For example, the CNS might use dynamic visual cues indicating rotation to help determine the orientation of gravity (tilt). This, in turn, might influence the neural processes that estimate linear acceleration, since the CNS might estimate gravity and linear acceleration such that the difference between these estimates matches the measured GIF. Depending on specific sensory information inflow, inaccurate estimates of gravity and linear acceleration can occur. Specifically, we predict that illusory tilt caused by roll optokinetic cues should lead to a horizontal vestibuloocular reflex compensatory for an interaural estimate of linear acceleration, even in the absence of actual linear acceleration. To investigate these predictions, we measured eye movements binocularly using infrared video methods in 17 subjects during and after optokinetic stimulation about the subject's nasooccipital (roll) axis (60°/s, clockwise or counterclockwise). The optokinetic stimulation was applied for 60 s followed by 30 s in darkness. We simultaneously measured subjective roll tilt using a somatosensory bar. Each subject was tested in three different orientations: upright, pitched forward 10°, and pitched backward 10°. Five subjects reported significant subjective roll tilt (>10°) in directions consistent with the direction of the optokinetic stimulation. 
In addition to torsional optokinetic nystagmus and afternystagmus, we measured a horizontal nystagmus to the right during and following clockwise (CW) stimulation and to the left during and following counterclockwise (CCW) stimulation. These measurements match predictions that subjective tilt in the absence of real tilt should induce a nonzero estimate of interaural linear acceleration and, therefore, a horizontal eye response. Furthermore, as predicted, the horizontal response in the dark was larger for Tilters (n = 5) than for Non-Tilters (n = 12).
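The ambiguity and the predicted illusion can be stated compactly (our notation, with hats denoting CNS internal estimates; we use the common convention in which the inertial force per unit mass is the negative of linear acceleration):

```latex
% Otoliths transduce gravito-inertial force: gravity plus the
% inertial force of linear acceleration a (per unit mass):
f = g - a
% The CNS is assumed to form estimates consistent with the measurement:
\hat{g} - \hat{a} = f
% If optokinetic roll cues bias \hat{g} toward an illusory tilt while
% f is unchanged (a = 0), the constraint forces a nonzero interaural
% component of \hat{a}, predicting the horizontal eye response reported.
```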


2001 ◽  
Vol 86 (2) ◽  
pp. 692-702 ◽  
Author(s):  
Michaël B. Zugaro ◽  
Eiichi Tabuchi ◽  
Céline Fouquier ◽  
Alain Berthoz ◽  
Sidney I. Wiener

Head direction (HD) cells discharge selectively in macaques, rats, and mice when they orient their head in a specific ("preferred") direction. Preferred directions are influenced by visual cues as well as idiothetic self-motion cues derived from vestibular, proprioceptive, motor efferent copy, and command signals. To distinguish the relative importance of active locomotor signals, we compared HD cell response properties in 49 anterodorsal thalamic HD cells of six male Long-Evans rats during active displacements in a foraging task as well as during passive rotations. Since thalamic HD cells typically stop firing if the animals are tightly restrained, the rats were trained to remain immobile while drinking water distributed at intervals from a small reservoir at the center of a rotatable platform. The platform was rotated in a clockwise/counterclockwise oscillation to record directional responses in the stationary animals while the surrounding environmental cues remained stable. The peak rate of directional firing decreased by 27% on average during passive rotations (r² = 0.73, P < 0.001). Individual cells recorded in sequential sessions (n = 8) reliably showed comparable reductions in peak firing, but simultaneously recorded cells did not necessarily produce identical responses. All of the HD cells maintained the same preferred directions during passive rotations. These results are consistent with the hypothesis that the level of locomotor activity provides a state-dependent modulation of the response magnitude of AD HD cells. This could result from diffusely projecting neuromodulatory systems associated with motor state.


2021 ◽  
Vol 14 ◽  
Author(s):  
Umer Saleem Bhat ◽  
Navneet Shahi ◽  
Siju Surendran ◽  
Kavita Babu

One of the reasons that most multicellular animals survive and thrive is because of the adaptable and plastic nature of their nervous systems. For an organism to survive, it is essential for the animal to respond and adapt to environmental changes. This is achieved by sensing external cues and translating them into behaviors through changes in synaptic activity. The nervous system plays a crucial role in constantly evaluating environmental cues and allowing for behavioral plasticity in the organism. Multiple neurotransmitters and neuropeptides have been implicated as key players for integrating sensory information to produce the desired output. Because of its simple nervous system and well-established neuronal connectome, C. elegans acts as an excellent model to understand the mechanisms underlying behavioral plasticity. Here, we critically review how neuropeptides modulate a wide range of behaviors by allowing for changes in neuronal and synaptic signaling. This review will have a specific focus on feeding, mating, sleep, addiction, learning and locomotory behaviors in C. elegans. With a view to understand evolutionary relationships, we explore the functions and associated pathophysiology of C. elegans neuropeptides that are conserved across different phyla. Further, we discuss the mechanisms of neuropeptidergic signaling and how these signals are regulated in different behaviors. Finally, we attempt to provide insight into developing potential therapeutics for neuropeptide-related disorders.


2018 ◽  
Vol 5 (2) ◽  
pp. 171785 ◽  
Author(s):  
Martin F. Strube-Bloss ◽  
Wolfgang Rössler

Flowers attract pollinating insects like honeybees by sophisticated compositions of olfactory and visual cues. Using honeybees as a model to study olfactory–visual integration at the neuronal level, we focused on mushroom body (MB) output neurons (MBON). From a neuronal circuit perspective, MBONs represent a prominent level of sensory-modality convergence in the insect brain. We established an experimental design allowing electrophysiological characterization of olfactory, visual, as well as olfactory–visual induced activation of individual MBONs. Despite the obvious convergence of olfactory and visual pathways in the MB, we found numerous unimodal MBONs. However, a substantial proportion of MBONs (32%) responded to both modalities and thus integrated olfactory–visual information across MB input layers. In these neurons, representation of the olfactory–visual compound was significantly increased compared with that of single components, suggesting an additive, but nonlinear integration. Population analyses of olfactory–visual MBONs revealed three categories: (i) olfactory, (ii) visual and (iii) olfactory–visual compound stimuli. Interestingly, no significant differentiation was apparent regarding different stimulus qualities within these categories. We conclude that encoding of stimulus quality within a modality is largely completed at the level of MB input, and information at the MB output is integrated across modalities to efficiently categorize sensory information for downstream behavioural decision processing.


Author(s):  
Geoffrey S. Hubona ◽  
Gregory W. Shirah

Most computer applications feature visual user interfaces that assume that all users have equivalent propensities to perceive, interpret, and understand the multidimensional spatial properties and relationships of the objects presented. However, the hunter-gatherer theory (Silverman & Eals, 1992) suggests that there are modern-day differences between the genders in spatial and cognitive abilities that stem from differentiated prehistoric sex roles. If true, there may be discrepancies in how males and females differentially utilize particular spatial visual cues and interface features. We report three experiments in which participants engage in visual spatial tasks using 2D and 3D virtual worlds: (1) matching object shapes; (2) positioning objects; and (3) resizing objects. Female subjects underperform male subjects in the matching and positioning experiments, but they outperform male subjects in the resizing experiment. Moreover, male subjects make more use of motion cues. Implications for the design of gender-effective user interfaces and virtual environments are considered.


2021 ◽  
pp. 182-188
Author(s):  
Laura Bishop ◽  
Carlos Cancino-Chacón ◽  
Werner Goebl

In the Western art music tradition, among many others, top ensembles are distinguished on the basis of their creative interpretation and expressivity, rather than purely on the precision of their synchronization. This chapter proposes that visual cues serve as social motivators during ensemble performance, promoting performers’ creative engagement with the music and each other. This chapter discusses findings from a study in which skilled duo musicians’ use of visual cues (eye gaze and body motion) was examined across the course of a rehearsal session. Results show that performers are driven to interact visually: (1) by temporal irregularity in the music and (2) by increased familiarity with the music and their co-performer. Synchronization success was unimpaired during a “blind” performance where performers could not see each other. Ensemble musicians thus choose to supplement their auditory interactions with visual cues despite their visual interactions offering no apparent benefit to synchronization.


2017 ◽  
Vol 51 (5) ◽  
pp. 103-115 ◽  
Author(s):  
Kevin Nelson ◽  
Kamran Mohseni

Abstract
This paper presents a sensory system that is biologically inspired by the lateral line sensory system found in fish. This artificial lateral line system provides sensory information to be used in vehicle control algorithms, both to reduce model complexity and to measure hydrodynamic disturbances. The system presented in this paper is a modular implementation that can fit around a vehicle without requiring modifications to the hull. The design and manufacturing processes are presented in detail along with considerations for sensor placement and port spacing. An algorithm for calculating the hydrodynamic forces acting on the surface of a vehicle is derived and experimentally validated. An underwater motion capture system and strain sensors are used to calculate a reference hydrodynamic force that compares favorably with the hydrodynamic force calculated by the lateral line system.
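The abstract does not reproduce the paper's force algorithm, which depends on its specific port layout. Purely as a generic illustration of the underlying idea, summing measured pressure over discrete surface patches to estimate a net force, here is a minimal sketch; the function name, interface, and patch-area weighting are our assumptions, not the paper's method:

```python
import numpy as np

def hydrodynamic_force(pressures, normals, areas):
    """Estimate the net hydrodynamic force on a hull from discrete
    pressure ports as F ~= -sum_i p_i * A_i * n_i.

    pressures : (N,) gauge pressures at each port [Pa]
    normals   : (N, 3) outward unit normals at each port
    areas     : (N,) surface patch area assigned to each port [m^2]
    Returns the (3,) net force vector [N].
    """
    pressures = np.asarray(pressures, dtype=float)
    normals = np.asarray(normals, dtype=float)
    areas = np.asarray(areas, dtype=float)
    # Pressure pushes inward, i.e. along -n on each patch.
    return (-(pressures * areas)[:, None] * normals).sum(axis=0)

# Sanity check: equal pressure on two opposing patches cancels.
f = hydrodynamic_force([100.0, 100.0],
                       [[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]],
                       [0.01, 0.01])
```

In practice the per-port patch areas and normals would come from the hull geometry around each port, and the port spacing considerations discussed in the paper govern how coarse this discretization can be.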

