The differential role of slow and fast cortical oscillations in encoding naturalistic visual information
Understanding how networks of neurons represent changes in the environment around us is a fundamental prerequisite for understanding how the brain implements sensory functions. In this talk I will present the computational work of my group aimed at evaluating the hypothesis that cortical oscillations play an important role in the representation and processing of sensory events. The computational approach is based on developing analysis methods founded on the principles of information theory and combining them with realistic simulations of cortical dynamics. This work provided evidence that the primary visual cortex uses two complementary frequency channels to carry information about the natural visual environment: a low-frequency channel (a few Hz) mainly devoted to encoding the history of stimulus changes over slow time scales, and a faster frequency channel (40-100 Hz) mainly devoted to encoding current attributes of visual stimulation, such as the current strength of contrast in the receptive field. These results lay the basis for understanding how sensory cortices may exploit the wide range of frequencies expressed by cortical microcircuits to multiplex different information over several time scales, thereby increasing the circuit's capacity to transmit information.
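As an illustration of the kind of information-theoretic analysis described above, the following is a minimal, self-contained sketch (not the group's actual pipeline): it builds a toy dataset in which a hypothetical "gamma-band power" signal tracks instantaneous stimulus contrast while a hypothetical "low-frequency power" signal tracks a slowly smoothed stimulus history, then compares how much mutual information each signal carries about the current contrast using a simple histogram (plug-in) estimator. All variable names and coupling strengths are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(x, y, bins=8):
    """Plug-in (histogram) estimate of mutual information in bits
    between two 1-D signals. Illustrative only: real analyses use
    bias-corrected estimators."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

n = 20_000
# Hypothetical instantaneous stimulus contrast over time
contrast = rng.uniform(0.0, 1.0, n)

# "Gamma power": coupled to the CURRENT contrast, plus noise
gamma_power = contrast + 0.1 * rng.normal(size=n)

# "Low-frequency power": driven by the slow HISTORY of the stimulus
# (a moving average of past contrast), plus noise
kernel = np.ones(500) / 500
history = np.convolve(contrast, kernel, mode="same")
low_power = history + 0.1 * rng.normal(size=n)

mi_gamma = mutual_information(contrast, gamma_power)
mi_low = mutual_information(contrast, low_power)
print(f"MI(contrast; gamma power)  = {mi_gamma:.3f} bits")
print(f"MI(contrast; low-f power)  = {mi_low:.3f} bits")
```

In this toy setup the gamma-band signal carries far more information about the current contrast than the low-frequency signal does, mirroring the division of labor described in the abstract; measuring the reverse (information about stimulus history) would favor the low-frequency channel.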