Rapid geometric feature signaling in the spiking activity of a complete population of tactile nerve fibers
Abstract

Tactile feature extraction is essential to guide the dexterous manipulation of objects. The longstanding theory is that geometric features at each location of contact between hand and object are extracted from the spatial layout of the responses of populations of tactile nerve fibers. However, recent evidence suggests that some features (e.g., edge orientation) are extracted very rapidly (<200 ms), casting doubt on the idea that this information relies on a spatial code, which ostensibly requires integrating responses over time. An alternative hypothesis is that orientation is conveyed in precise temporal spiking patterns. Here, we simulate, using a recently developed and validated model, the responses of tactile fibers across the entire human fingertip (∼800 afferents) to edges indented into the skin. We show that edge orientation can be decoded quickly (<50 ms) and accurately (<3°) from the spatial pattern of activation across the afferent population, starting with the very first spike. Next, we implement a biomimetic decoder of edge orientation, consisting of a bank of oriented Gabor filters designed to mimic the documented responses of cortical neurons. We find that the biomimetic approach yields orientation decoding performance that approaches the limit set by optimal decoders and is more robust to changes in other stimulus features. Finally, we show that orientation signals measured from single units in non-human primate cortex (2 macaque monkeys, 1 female) follow a time course consistent with that of their counterparts in the nerve. We conclude that a spatial code is fast and accurate enough to support object manipulation.
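The Gabor-filter-bank decoder described in the abstract can be sketched as matched filtering: the spatial activation map of the afferent population is correlated with a bank of oriented Gabor filters, and the orientation of the best-matching filter is reported. This is a minimal illustrative sketch, not the paper's implementation; the grid size, wavelength, envelope width, and number of filters below are assumed values chosen for demonstration.

```python
import numpy as np

def gabor(size, theta, wavelength=8.0, sigma=4.0):
    """Oriented Gabor filter on an odd-sized size x size grid.

    theta is the filter orientation in radians; wavelength and sigma
    (carrier period and Gaussian envelope width, in grid units) are
    illustrative assumptions, not parameters from the paper.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates so the carrier runs along orientation theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength)
    return envelope * carrier

def decode_orientation(activation, n_filters=18):
    """Return the orientation (degrees, [0, 180)) of the Gabor filter
    whose inner product with the activation map is largest in magnitude."""
    thetas = np.linspace(0.0, np.pi, n_filters, endpoint=False)
    size = activation.shape[0]
    responses = [abs(np.sum(gabor(size, t) * activation)) for t in thetas]
    return float(np.degrees(thetas[int(np.argmax(responses))]))
```

With 18 filters the bank samples orientation in 10° steps; a finer estimate could be obtained by interpolating across neighboring filter responses, analogous to population decoding across orientation-tuned cortical neurons.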