Published in Journal of Speech, Language, and Hearing Research 62 (2019), pp 2133–2140.
Purpose: Speech motor control relies on neural processes that generate sensory expectations via an efference copy mechanism to maintain accurate productions. The N100 auditory event-related potential (ERP) has been identified as a possible neural marker of the efference copy: its amplitude is reduced during active listening while speaking compared with passive listening. This study investigates N100 suppression while participants control a motor imagery speech synthesizer brain–computer interface (BCI) with instantaneous auditory feedback, to determine whether similar mechanisms are used for monitoring BCI-based speech output. Such mechanisms could both support BCI learning through existing speech motor networks and serve as a clinical marker of speech network integrity in individuals with severe speech and physical impairments.
Method: Motor-induced N100 suppression was examined using data from 10 participants who controlled a BCI speech synthesizer with limb motor imagery. Listening to auditory target stimuli (without motor imagery) was treated as passive listening, and listening to BCI-controlled speech output (with motor imagery) as active listening, since the audio output depends on the imagined movements. ERP amplitude differences were assessed for statistical significance using a mixed-effects general linear model.
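The mixed-effects analysis described above can be sketched as follows. This is a hypothetical illustration, not the authors' analysis code: the synthetic data, effect sizes, column names, and choice of a random intercept per participant are all assumptions made for the example; the real study used its own recorded N100 amplitudes and model specification.

```python
# Hypothetical sketch: mixed-effects model of N100 amplitude by listening
# condition (active vs. passive), with participant as a random effect.
# All data here are synthetic; the effect direction (suppressed, i.e. less
# negative, N100 during active listening) mirrors the abstract's finding.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_trials = 10, 40
rows = []
for subj in range(n_subjects):
    subj_offset = rng.normal(0.0, 0.5)  # per-participant baseline shift
    # Assumed mean amplitudes in microvolts: active is less negative (suppressed).
    for condition, mean_amp in [("active", -2.0), ("passive", -4.0)]:
        amps = rng.normal(mean_amp + subj_offset, 1.0, n_trials)
        rows.extend(
            {"subject": subj, "condition": condition, "amplitude": a}
            for a in amps
        )
df = pd.DataFrame(rows)

# Fixed effect of listening condition; random intercept grouped by subject.
model = smf.mixedlm("amplitude ~ condition", df, groups=df["subject"])
result = model.fit()

# With "active" as the (alphabetical) reference level, a negative
# coefficient for passive means passive N100s are more negative,
# i.e. the active-condition N100 is suppressed.
print(result.params["condition[T.passive]"])
```

A real analysis would typically include additional fixed effects (e.g., electrode site) and model per-trial amplitudes extracted from the N100 time window.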
Results: Statistically significant N100 ERP amplitude differences were observed between active and passive listening during the BCI task. Post hoc analyses confirmed that the N100 amplitude was suppressed during active listening.
Conclusion: The observed N100 suppression suggests that motor planning networks are engaged as participants control the BCI synthesizer, which may aid mastery of speech BCIs.