Scanning the interactively speaking brain: the cases of synchronous speech and conversation in autism.
Speech is an important social behavior. However, neuroimaging studies of speech have mainly used tasks that are not social, such as listening to recordings of speech or talking alone. Is brain activity during speaking or listening alone different from brain activity during speaking and listening to another person in a live interaction? If there are differences, what are they, and do they matter? I will argue, with data from two studies, that brain activity during social speaking is importantly different.

The first study investigated synchronous speaking (chanting, praying, pledging), a ubiquitous form of joint speech seen in many human cultures, often in settings where group cohesion is important. We asked subjects to speak sentences at the same time as another person while being scanned with fMRI. Activity during live synchronous speaking was strikingly different from activity during speaking or listening alone, or with a covertly substituted recorded voice. Furthermore, although solo speech production attenuates responses in auditory regions, synchronous speech produced with another live person does not: auditory cortex instead responds as though the subject were simply listening to someone else speak. I will therefore suggest that synchrony's use as a social tool may relate to a change in the brain's ability to distinguish self from other.

The second study used a live conversation task to investigate social deficits in people with autism spectrum disorders. Autistic males (aged 14-30) and matched controls engaged in multi-modal conversation (video and audio) with another person while being scanned with fMRI. Although most fMRI studies of autism that use social stimuli report weaker responses or decreased functional connectivity relative to controls, our preliminary results indicate that, in a live interaction, sensory, motor, and social brain regions respond more strongly and more synchronously in autistics than in controls.
Together, these experiments suggest that two-person interaction tasks reveal important aspects of the speaking brain that are invisible to one-person tasks.