The world view of rodents is largely determined by sensation on two length scales. One is within the animal's peri-personal space; sensorimotor control on this scale involves active movements of the nose, tongue, head, and vibrissae, along with sniffing to determine olfactory cues. The second scale involves the detection of more distant space through vision and audition; these detection processes also drive the repositioning of the head, eyes, and ears. Here we focus on orofacial motor actions, primarily vibrissa-based touch but also nose twitching, head bobbing, and licking, that control sensation at short, peri-personal distances. The orofacial nuclei that control the motor plants, as well as the primary and secondary sensory nuclei associated with these motor actions, lie within the hindbrain. The current data support three themes. First, the position of the sensors is determined by the summation of two drive signals: a fast rhythmic component and an evolving orienting component. Second, the rhythmic component is coordinated across all orofacial motor actions and is phase-locked to sniffing as the animal explores; reverse engineering reveals that the preBötzinger inspiratory complex provides the reset signal to the relevant premotor oscillators. Third, direct feedback from the somatosensory trigeminal nuclei can rapidly alter the motion of the sensors; this feedback is disynaptic and can be tuned by high-level inputs. A holistic model for the coordination of orofacial motor actions into behaviors will encompass feedback pathways through the midbrain and forebrain, as well as through the hindbrain.
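As a minimal sketch of the first theme, the summed drive can be written as a worked equation; the decomposition into a slowly varying set-point, amplitude, and phase is notation introduced here for illustration and is not taken from the abstract itself:
\[
\theta(t) \;=\; \underbrace{\theta_{\mathrm{set}}(t)}_{\text{evolving orienting drive}} \;+\; \underbrace{A(t)\,\cos\phi(t)}_{\text{fast rhythmic drive}},
\qquad \phi \rightarrow 0 \ \text{at each inspiratory onset},
\]
where \(\theta(t)\) stands for the angular position of a sensor such as a vibrissa, \(\theta_{\mathrm{set}}(t)\) and \(A(t)\) vary slowly relative to the rhythmic cycle, and the phase reset at inspiration expresses the locking of the rhythm to sniffing by the preBötzinger complex.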
Keywords: coupled oscillators; facial nucleus; hypoglossal nucleus; licking; orienting; tongue; vibrissa.
Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.