insect locomotion
Recently Published Documents


TOTAL DOCUMENTS: 41 (FIVE YEARS: 3)
H-INDEX: 16 (FIVE YEARS: 0)

2021 · Vol 31 (20) · pp. R1395-R1397
Author(s): Manuel Zimmer

Author(s): Clarissa Goldsmith · Roger D. Quinn · Nicholas Stephen Szczecinski

2021 · Vol 15
Author(s): Ilja Arent · Florian P. Schmidt · Mario Botsch · Volker Dürr

Motion capture of unrestrained, moving animals is a major analytic tool in neuroethology and behavioral physiology. Several motion capture methodologies have been developed, each with particular limitations regarding experimental application. Whereas marker-based motion capture systems are very robust and easily adjusted to different setups, tracked species, or body parts, they cannot be applied in experimental situations where markers obstruct natural behavior (e.g., when tracking delicate, elastic, and/or sensitive body structures). Marker-less motion capture systems, on the other hand, typically require setup- and animal-specific adjustments, for example through tailored image processing, decision heuristics, and/or machine learning on specific sample data. Among the latter, deep-learning approaches have become very popular because they can be applied to virtually any sample of video data. Nevertheless, their training requirements have rarely been evaluated systematically, particularly with regard to transferring trained networks from one application to another. To address this issue, the present study uses insect locomotion as a showcase for systematic evaluation of variation and augmentation of the training data. For that, we use artificially generated video sequences with known combinations of observed, real animal postures and randomized body position, orientation, and size. Moreover, we evaluate how well networks pre-trained on synthetic videos generalize to video recordings of real walking insects, and estimate the benefit in terms of reduced requirement for manual annotation. We show that tracking performance is only slightly affected by scaling factors between 0.5 and 1.5, and, as expected for convolutional networks, translation of the animal has no effect. In contrast, sufficient variation of rotation in the training data is essential for performance, and we make concise suggestions about how much variation is required. Our results on transfer from synthetic to real videos show that pre-training reduces the amount of necessary manual annotation by about 50%.
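The randomization of body position, orientation, and size described in the abstract can be illustrated as a 2-D keypoint augmentation. This is a minimal sketch, not the authors' pipeline: the function name, the rotation and shift ranges, and the toy three-point posture are assumptions; only the 0.5-1.5 scale range comes from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def augment_pose(keypoints, rot_range=np.pi, scale_range=(0.5, 1.5),
                 shift_range=50.0):
    """Apply a random rotation, isotropic scaling, and translation to a
    set of 2-D keypoints (an N x 2 array of joint positions).

    Returns the transformed keypoints and the sampled parameters.
    """
    theta = rng.uniform(-rot_range, rot_range)
    scale = rng.uniform(*scale_range)
    shift = rng.uniform(-shift_range, shift_range, size=2)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return keypoints @ rot.T * scale + shift, (theta, scale, shift)

# Toy three-point "posture" standing in for an observed animal pose
pose = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
aug, (theta, scale, shift) = augment_pose(pose)
```

Because rotation and translation are rigid, relative joint distances in the augmented pose change only by the sampled scale factor; sampling `theta` over the full range reflects the abstract's finding that rotational variation in the training data is the critical ingredient.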


2020 · Vol 80 (1-2) · pp. 16-30
Author(s): Charalampos Mantziaris · Till Bockemühl · Ansgar Büschges

Author(s): Trinayan Barthakur · Susmita Dey · Arijit Chakraborty

2019 · Vol 114 (1) · pp. 23-41
Author(s): Mantas Naris · Nicholas S. Szczecinski · Roger D. Quinn

2016 · Vol 13 (116) · pp. 20160060
Author(s): Feng Cao · Chao Zhang · Hao Yu Choo · Hirotaka Sato

We have constructed an insect–computer hybrid legged robot using a living beetle ( Mecynorrhina torquata ; Coleoptera). The protraction/retraction and levation/depression motions in both forelegs of the beetle were elicited by electrically stimulating eight corresponding leg muscles via eight pairs of implanted electrodes. To perform a defined walking gait (e.g. gallop), different muscles were individually stimulated in a predefined sequence using a microcontroller. Different walking gaits were performed by reordering the applied stimulation signals (i.e. applying different sequences). By varying the duration of the stimulation sequences, we successfully controlled the step frequency and hence the beetle's walking speed. To the best of our knowledge, this paper presents the first demonstration of living insect locomotion control with a user-adjustable walking gait, step length and walking speed.
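The gait-sequencing idea in the abstract can be caricatured with a toy scheduler. The muscle labels, data structure, and equal-slot timing below are hypothetical, not the authors' firmware; the sketch only shows how reordering a stimulation sequence yields a different gait and how shortening the step period raises step frequency.

```python
from dataclasses import dataclass

# Hypothetical labels for the eight stimulated foreleg muscles
MUSCLES = ["L_protractor", "L_retractor", "L_levator", "L_depressor",
           "R_protractor", "R_retractor", "R_levator", "R_depressor"]

@dataclass
class StimStep:
    muscle: str         # which implanted electrode pair to drive
    duration_ms: float  # stimulation pulse-train duration

def make_gait(sequence, step_period_ms):
    """Build one gait cycle: each listed muscle is stimulated in order,
    dividing the step period equally among the entries."""
    for m in sequence:
        assert m in MUSCLES, f"unknown muscle {m}"
    slot = step_period_ms / len(sequence)
    return [StimStep(m, slot) for m in sequence]

# Reordering the sequence changes the gait; a shorter step period
# means a higher step frequency and hence faster walking.
gait_a = make_gait(["L_levator", "L_protractor", "L_depressor", "L_retractor"],
                   step_period_ms=400.0)
gait_b = make_gait(["L_levator", "R_levator", "L_protractor", "R_protractor"],
                   step_period_ms=250.0)
```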


2016 · Vol 115 (2) · pp. 887-906
Author(s): T. I. Tóth · S. Daun-Gruhn

Insect locomotion requires the precise coordination of the movements of all six legs. Detailed investigations have revealed that the movement of the legs is controlled by local, dedicated neuronal networks, which interact to produce walking. The stick insect is well suited to experimental investigations of the mechanisms of insect locomotion. Besides the experimental approach, models have also been constructed to elucidate those mechanisms. Here, we describe a model that replicates both the tetrapod and the tripod coordination pattern of three ipsilateral legs. The model is based on an earlier insect leg model comprising the three main leg joints, three antagonistic muscle pairs, and their local neuronal control networks. These networks are coupled via angular signals to establish intraleg coordination of the three neuromuscular systems during locomotion. In the present three-leg model, we coupled three such leg models, representing the front, middle, and hind leg, through the levator-depressor local control networks of the three legs. The model successfully simulated tetrapod and tripod coordination patterns, as well as the transition between them. The simulations showed that for interleg coordination during tripod, the position signals exchanged between the levator-depressor neuromuscular systems of the legs were sufficient, whereas in tetrapod, additional information about the angular velocities in the same system was necessary and, together with the position information, also sufficient. We therefore suggest that, during stepping, the connections between the levator-depressor neuromuscular systems of the different legs are of primary importance.
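The interleg coupling result can be caricatured with three coupled phase oscillators standing in for the levator-depressor systems of the front, middle, and hind legs. This is a toy Kuramoto-style sketch under assumed parameters, not the authors' neuromuscular model: anti-phase coupling between neighboring legs pulls the system toward the tripod pattern, in which adjacent ipsilateral legs step about 180° apart while front and hind legs step in phase.

```python
import numpy as np

def simulate_legs(n_steps=20000, dt=1e-3, omega=2 * np.pi, k=5.0):
    """Three phase oscillators (front, middle, hind leg), each with
    intrinsic stepping frequency omega, coupled so that neighbouring
    legs are pulled toward anti-phase (phase difference of pi)."""
    phi = np.array([0.1, 0.5, 0.2])  # arbitrary initial phases (rad)
    for _ in range(n_steps):
        d = np.full(3, omega)
        d[0] += k * np.sin(phi[1] - phi[0] - np.pi)
        d[1] += k * np.sin(phi[0] - phi[1] - np.pi) \
              + k * np.sin(phi[2] - phi[1] - np.pi)
        d[2] += k * np.sin(phi[1] - phi[2] - np.pi)
        phi = phi + dt * d  # forward Euler step
    return phi

phi = simulate_legs()
# phase lags between neighbouring legs after the dynamics settle
diff01 = (phi[1] - phi[0]) % (2 * np.pi)
diff12 = (phi[2] - phi[1]) % (2 * np.pi)
```

In the abstract's terms, each oscillator's phase plays the role of the position signal exchanged between the levator-depressor systems; reproducing the tetrapod pattern and the transition between gaits would require the additional velocity information the paper describes, which this sketch omits.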

