{Reference Type}: Journal Article
{Title}: Mixed Representations of Sound and Action in the Auditory Midbrain.
{Author}: Quass GL;Rogalla MM;Ford AN;Apostolides PF;
{Journal}: J Neurosci
{Volume}: 44
{Issue}: 30
{Year}: 2024 Jul 24
{Factor}: 6.709
{DOI}: 10.1523/JNEUROSCI.1831-23.2024
{Abstract}: Linking sensory input and its consequences is a fundamental brain operation. During behavior, the neural activity of neocortical and limbic systems often reflects dynamic combinations of sensory and task-dependent variables, and these "mixed representations" are suggested to be important for perception, learning, and plasticity. However, the extent to which such integrative computations might occur outside of the forebrain is less clear. Here, we conduct cellular-resolution two-photon Ca2+ imaging in the superficial "shell" layers of the inferior colliculus (IC), as head-fixed mice of either sex perform a reward-based psychometric auditory task. We find that the activity of individual shell IC neurons jointly reflects auditory cues, mice's actions, and behavioral trial outcomes, such that trajectories of neural population activity diverge depending on mice's behavioral choice. Consequently, simple classifier models trained on shell IC neuron activity can predict trial-by-trial outcomes, even when training data are restricted to neural activity occurring prior to mice's instrumental actions. Thus, in behaving mice, auditory midbrain neurons transmit a population code that reflects a joint representation of sound, actions, and task-dependent variables.
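The abstract's classifier analysis (predicting trial-by-trial outcomes from pre-action population activity) can be illustrated with a minimal sketch. Everything below is synthetic and hypothetical: the trial counts, neuron counts, and effect sizes are assumptions, and a plain logistic-regression decoder stands in for whatever "simple classifier models" the paper actually used.

```python
# Hypothetical decoding sketch: predict trial outcome (1 = hit, 0 = miss)
# from simulated pre-action population activity. Synthetic data only;
# this is NOT the paper's dataset or exact method.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_neurons = 200, 50

# Synthetic outcome labels for each trial
y = rng.integers(0, 2, n_trials)

# Simulated activity: each neuron carries a small outcome-dependent signal
# on top of trial-to-trial noise (assumed effect size)
signal = rng.normal(0.0, 0.5, n_neurons)              # per-neuron outcome tuning
X = rng.normal(0.0, 1.0, (n_trials, n_neurons)) + np.outer(y, signal)

# Hold out the last 50 trials for testing
train, test = np.arange(150), np.arange(150, n_trials)

# Simple linear classifier: logistic regression fit by gradient descent
w, b = np.zeros(n_neurons), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X[train] @ w + b)))     # predicted P(hit)
    w -= 0.1 * (X[train].T @ (p - y[train])) / len(train)
    b -= 0.1 * np.mean(p - y[train])

pred = (X[test] @ w + b) > 0
accuracy = np.mean(pred == y[test])
print(f"held-out decoding accuracy: {accuracy:.2f}")
```

With this signal-to-noise ratio the decoder performs well above the 0.5 chance level, which is the qualitative point of the paper's analysis: outcome information is linearly readable from the population code before the animal acts.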