Invited Speaker 1


Prof. Alexander Stoytchev

Iowa State University, USA

Alexander Stoytchev is an Assistant Professor of Electrical and Computer Engineering and the Director of the Developmental Robotics Laboratory at Iowa State University, USA. He received his MS and PhD degrees in computer science from the Georgia Institute of Technology in 2001 and 2007, respectively. His research interests are in the areas of developmental robotics, autonomous robotics, computational perception, and machine learning.

For more information visit his web page: http://www.ece.iastate.edu/~alexs/

 

Invited Talk: Developmental Approach to Robotic Intelligence

Abstract: Developmental robotics is an emerging field that blurs the boundaries between robotics, artificial intelligence, developmental psychology, and philosophy. The basic research hypothesis of developmental robotics is that truly intelligent robot behavior cannot be achieved in the absence of prolonged interaction with a physical or social environment. In other words, robots must undergo a developmental period similar to that of humans and animals. Using a new paradigm based on observations of how human infants develop their skills, the goal of this field is to create autonomous robots that are more intelligent, more adaptable, and more useful than the robots of today, which can function only in very limited domains and situations.

This talk will focus on recent research results that show how a robot can solve multiple tasks based on what it learns during a developmental period similar to a child’s play. During this period the robot actively tries to grasp, lift, shake, touch, scratch, tap, push, drop, and crush objects. At the end of this period the robot knows what different objects sound like when they are dropped, what they feel like when they are squeezed, and so on. Because these properties are grounded in the robot’s sensorimotor repertoire, the robot can autonomously learn, test, and verify its own representations without human intervention. The talk will demonstrate how the robot can use this information to recognize objects, separate objects into functional categories, and even find the odd one out in a set of objects. The talk will also demonstrate how the robot can use sensorimotor interactions to bootstrap the development of its visual system in the context of a button-pressing task.

Results and videos will be presented for two different humanoid platforms.

 

Sample Video

A humanoid robot learns to push buttons by detecting the functional components of doorbell buttons through active exploration. This video was presented by V. Sukhoy and A. Stoytchev at the Humanoids 2010 conference in Nashville, USA.