Recognizing objects and events by interpreting sound signals is a fundamental human ability; it is extremely important for categorizing objects and events, yet it remains a difficult task for robots. In this paper, different supervised learning methods using distinctive features extracted from sound data were compared as part of a system that enables robots to classify objects and events automatically using auditory features of environmental sound.
Our experimental setting involved objects made of different materials, including glass, metal, porcelain, cardboard, and plastic. We first analyzed the performance of the supervised learning methods with our proposed feature set on material categorization, and then investigated their performance on categorizing event outcomes. We used two different robotic platforms: a wheeled mobile robot and a 7-DOF robotic arm. The proposed system achieved over 91% classification success for both materials and events.
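To illustrate the kind of comparison described above, the sketch below contrasts two simple supervised classifiers on synthetic feature vectors standing in for auditory features. Everything here is a hypothetical, self-contained toy: the feature values, the number of dimensions, and the two classifiers (nearest centroid and 1-nearest-neighbour) are assumptions for illustration, not the paper's actual feature set or learning methods.

```python
# Hypothetical sketch: comparing two simple supervised classifiers on
# synthetic "sound feature" vectors. The classes, feature generator,
# and classifiers are illustrative assumptions, not the paper's pipeline.
import math
import random

random.seed(0)

MATERIALS = ["glass", "metal", "porcelain", "cardboard", "plastic"]

def make_sample(class_idx, dim=8, noise=0.3):
    # Each material gets a distinct prototype feature vector
    # (a stand-in for real auditory features such as spectral statistics).
    return [class_idx + random.gauss(0, noise) for _ in range(dim)]

def dist(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Labeled training and test sets: (feature_vector, class_index) pairs.
train = [(make_sample(i), i) for i in range(len(MATERIALS)) for _ in range(20)]
test = [(make_sample(i), i) for i in range(len(MATERIALS)) for _ in range(10)]

def nearest_centroid_predict(x):
    # Classifier 1: assign to the class whose mean feature vector is closest.
    best = None
    for c in range(len(MATERIALS)):
        members = [f for f, y in train if y == c]
        centroid = [sum(col) / len(col) for col in zip(*members)]
        d = dist(x, centroid)
        if best is None or d < best[0]:
            best = (d, c)
    return best[1]

def one_nn_predict(x):
    # Classifier 2: assign the label of the single nearest training sample.
    return min(train, key=lambda fy: dist(x, fy[0]))[1]

def accuracy(predict):
    return sum(predict(f) == y for f, y in test) / len(test)

acc_nc = accuracy(nearest_centroid_predict)
acc_1nn = accuracy(one_nn_predict)
print(f"nearest centroid: {acc_nc:.2f}, 1-NN: {acc_1nn:.2f}")
```

On real data the features would come from the recorded audio (e.g. spectral or cepstral descriptors) rather than a synthetic generator, and the same train/test protocol would let the different supervised learners be compared on equal footing.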