iCub Research & Projects

The iCub humanoid robot is an open-system robotic platform, meaning both open-source and open-hardware, developed in EU-funded FP7 research projects. In our setup it provides a 41-degree-of-freedom (DOF) upper body, comprising two arms, a head and a torso. The iCub is widely regarded as an interesting experimental platform for research on cognitive and sensorimotor development and embodied Artificial Intelligence, with a focus on object manipulation.

iCub

Research & Outcomes

The overarching goal of my iCub-related research is to create robots with more autonomous and more adaptive behaviours, leading to more 'intelligent' robots. My work sits at the intersection of IDSIA's machine learning (reinforcement learning) group and robotics. My goal was to make the iCub see, that is, to develop computer vision algorithms for object detection, identification and localisation.

Software Frameworks

icVision, the computer vision framework used in our cognitive robotics research, has recently been released. Have a look at the project page if you are interested. It is an easy-to-use, modular framework that performs vision-related tasks, such as object detection and localisation, on the iCub humanoid robot for research experiments.
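As a rough illustration of how such a module plugs into the iCub's middleware, here is a minimal sketch (not icVision's actual code) of a YARP process that reads camera images and publishes a detected object position. The local port names, the Bottle layout and the placeholder "detector" are my assumptions for this example; only the iCub camera port name is standard.

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/BufferedPort.h>
#include <yarp/os/Bottle.h>
#include <yarp/sig/Image.h>

int main() {
    yarp::os::Network yarp;  // initialise the YARP network

    // input port for camera images, output port for detected object locations
    yarp::os::BufferedPort<yarp::sig::ImageOf<yarp::sig::PixelRgb>> imgPort;
    yarp::os::BufferedPort<yarp::os::Bottle> outPort;
    imgPort.open("/detector/img:i");   // assumed local port names
    outPort.open("/detector/loc:o");

    // connect to the iCub's left camera (standard iCub port name)
    yarp::os::Network::connect("/icub/cam/left", "/detector/img:i");

    while (true) {
        yarp::sig::ImageOf<yarp::sig::PixelRgb>* img = imgPort.read();  // blocking read
        if (img == nullptr) continue;

        // ... run an object detector on *img here ...
        int u = img->width() / 2, v = img->height() / 2;  // placeholder: image centre

        yarp::os::Bottle& loc = outPort.prepare();
        loc.clear();
        loc.addString("object");   // assumed message layout: label, u, v
        loc.addInt32(u);
        loc.addInt32(v);
        outPort.write();
    }
    return 0;
}
```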

LEOGrasper is now available as well. It is our light-weight, easy-to-use, one-shot grasping system for the iCub. It's been used extensively at IDSIA, especially for the IM-CLeVeR video.

Code

Come back in a bit to find code here :)

In the meantime, check here.


Perception and Sensorimotor Coordination

We submitted a paper to WCCI 2014 (IJCNN Special Session on Robot Vision) on our research on choosing actions to improve object detection in robotic vision. The video is here:

 

Intrinsically Motivated Agents (IM-CLeVeR)

My PhD is partly funded by the EU FP7 project IM-CLeVeR, which includes partners from CNR (in Rome) and the Universities of Ulster and Aberystwyth (UK). The project focuses on the iCub robot and how it can learn to interact with its environment. At IDSIA we put a lot of work into making the iCub learn how to perform object manipulation.

Tele-operation of a Complex Humanoid

Operating complex, advanced robot systems is not a trivial task: how can an operator perform certain tasks intuitively, and without breaking the robot? We worked on this and are now able to control the iCub with a variety of input devices, e.g. a LEAP Motion device, a six-axis joy-pad, or even a simple accelerometer (such as the one in your phone)!
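The basic idea can be sketched as follows, assuming YARP's standard remote_controlboard device: a reading from the input device (here a hypothetical readDeviceTilt() stub standing in for the LEAP Motion, joy-pad or accelerometer driver) is mapped to a clamped joint velocity on one arm joint. The local port name, joint choice and gain are illustrative, not the actual tele-operation code.

```cpp
#include <algorithm>
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/os/Time.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/ControlBoardInterfaces.h>

// Placeholder: replace with the actual device driver (LEAP Motion, joy-pad, accelerometer).
// Returns a tilt-like reading in [-1, 1].
double readDeviceTilt() { return 0.0; }

int main() {
    yarp::os::Network yarp;

    // connect to the iCub left arm through the standard remote_controlboard device
    yarp::os::Property options;
    options.put("device", "remote_controlboard");
    options.put("remote", "/icub/left_arm");
    options.put("local", "/teleop/left_arm");   // assumed local port prefix

    yarp::dev::PolyDriver driver(options);
    yarp::dev::IVelocityControl* vel = nullptr;
    if (!driver.isValid() || !driver.view(vel)) return 1;

    while (true) {
        // map the device reading to a clamped joint velocity (deg/s)
        double tilt  = readDeviceTilt();
        double speed = 10.0 * std::max(-1.0, std::min(1.0, tilt));
        vel->velocityMove(0, speed);   // joint 0 chosen for illustration
        yarp::os::Time::delay(0.02);   // ~50 Hz control loop
    }
    return 0;
}
```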

More cool things

Stereo-Vision Depth Perception in Humanoid Robots

This project, carried out in 2007 at IST in Lisbon, aimed to provide an experimental evaluation of a dense disparity estimation algorithm, focusing on the aspects most relevant for humanoid robots: real-time operation and the ability to deal with calibration errors. The method and its real-time implementation (using C++ and open-source libraries) were tested in a variety of cases to show its performance and the quality of its output as a function of the design parameters. An early version of the iCub hardware (mounted on top of a torso with one robotic arm) was used, together with the iCub hardware abstractions (YARP).
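The project implemented its own dense disparity method; purely as an illustration of the same pipeline with open-source libraries, here is a minimal OpenCV block-matching sketch that computes a disparity map from a rectified stereo pair and converts it to depth. The file names, matcher parameters and focal-length/baseline values are placeholders, not the parameters used in the project.

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>

int main() {
    // load a rectified stereo pair (file names are placeholders)
    cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
    cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
    if (left.empty() || right.empty()) return 1;

    // block-matching disparity: 64 disparity levels, 21x21 matching window
    cv::Ptr<cv::StereoBM> bm = cv::StereoBM::create(64, 21);
    cv::Mat disp16;                        // fixed-point disparity (scaled by 16)
    bm->compute(left, right, disp16);

    cv::Mat disp;                          // disparity in pixels, as floats
    disp16.convertTo(disp, CV_32F, 1.0 / 16.0);

    // depth = f * B / disparity, with focal length f (px) and baseline B (m)
    const double f = 320.0, B = 0.068;     // placeholder iCub-like values
    cv::Mat safeDisp = cv::max(disp, 1e-3); // avoid division by zero
    cv::Mat depth;
    cv::divide(f * B, safeDisp, depth);

    // save a scaled 8-bit visualisation of the depth map
    cv::Mat depthVis;
    depth.convertTo(depthVis, CV_8U, 50.0);
    cv::imwrite("depth.png", depthVis);
    return 0;
}
```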

Shakey Award Winner - AAAI 2013 Video

Our video showing the iCub doing cool motions was awarded Best Student Video at the AAAI Video Competition!