A predictive coding framework for a developmental agent: Speech motor skill acquisition and speech production

Abstract

Predictive coding has been hypothesized as a universal principle guiding the operation of different brain areas. In this paper, a predictive coding framework is proposed for a developmental agent with perception (audio), action (vocalization), and learning capabilities. The agent concurrently learns to plan optimally and to associate sensory and motor parameters by minimizing sensory prediction error in an unsupervised manner. The agent is driven solely by sensory prediction error and does not require reinforcement. It learns initially by self-exploration and later by imitation of the ambient environment. Our goal is to investigate the process of speech motor skill acquisition and speech production in such an agent. Standard vocal exploration experiments show that the agent learns to generate speech-like sounds (acoustic babbling followed by proto-syllables and vowels) as well as the timing of motor command execution. Random goal exploration leads to the self-organization of developmental stages of vocal sequences in the agent as the complexity of its vocalizations increases. This self-organization is invariant to certain acoustic feature representations. Self-exploration allows the agent to quickly learn to imitate environmental sounds, and it learns to vocalize differently in different environments.
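
The abstract describes learning that is driven solely by minimizing sensory prediction error during random motor babbling, with no reward signal. The sketch below illustrates that idea in the simplest possible setting; it is not the paper's model. The linear forward model, the dimensionalities, the learning rate, and the toy vocal-tract mapping are all assumptions introduced for illustration.

```python
import numpy as np

# Minimal sketch, assuming a linear forward model trained on sensory
# prediction error during self-exploration (random motor babbling).
rng = np.random.default_rng(0)
n_motor, n_sensory = 4, 8                          # assumed sizes of motor and auditory spaces
true_map = rng.normal(size=(n_sensory, n_motor))   # toy stand-in for the unknown articulatory-to-acoustic mapping
W = np.zeros((n_sensory, n_motor))                 # forward model the agent learns
eta = 0.05                                         # assumed learning rate

for step in range(2000):
    m = rng.uniform(-1.0, 1.0, size=n_motor)       # self-exploration: random motor command (babbling)
    s = true_map @ m                               # actual auditory consequence of the vocalization
    s_hat = W @ m                                  # agent's predicted auditory consequence
    err = s - s_hat                                # sensory prediction error
    W += eta * np.outer(err, m)                    # update that reduces the prediction error (no reinforcement)

# After self-exploration, imitation of an ambient target sound s_goal
# (hypothetical variable) could invert the learned model, e.g.
# m_goal = np.linalg.pinv(W) @ s_goal.
```
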

Publication Title

Speech Communication
