At Hannover Messe 2010, DFKI Bremen showed off a new humanoid robot called AILA, offering a glimpse of how robots could theoretically be working alongside humans in dynamic environments by the time 2020 rolls around. AILA relies on SemProM (Semantic Product Memory), which it merges with its computer vision to handle objects of varying shapes and sizes. For example, it can adjust how it holds a bottle based on the bottle's weight and fragility. Its head packs stereo vision and a couple of laser range finders, while a 3D camera handles object recognition and orientation, working in tandem with an RFID reader in its left hand. Check out AILA in action in the video above.
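To make the grip-adjustment idea a little more concrete, here's a minimal sketch in Python of how data read from a product memory might be combined with a vision estimate to pick a grip. The data structures, thresholds, and numbers are all made up for illustration and don't reflect DFKI's actual software.

```python
from dataclasses import dataclass


@dataclass
class ProductMemory:
    """Hypothetical record of the kind a SemProM-style tag might carry."""
    name: str
    weight_g: float
    fragile: bool


@dataclass
class VisionEstimate:
    """Hypothetical object size estimate from a 3D camera."""
    width_mm: float
    height_mm: float


def choose_grip(memory: ProductMemory, vision: VisionEstimate) -> dict:
    """Pick a gripper aperture and force from product memory plus vision data.

    The constants here are placeholders; a real controller would derive
    them from the arm's force sensing and calibration.
    """
    # Open the gripper slightly wider than the observed object width.
    aperture_mm = vision.width_mm + 10.0

    # Heavier objects need more grip force; fragile ones cap that force.
    force_n = min(5.0 + 0.02 * memory.weight_g, 40.0)
    if memory.fragile:
        force_n = min(force_n, 10.0)

    return {"aperture_mm": aperture_mm, "force_n": force_n}


if __name__ == "__main__":
    bottle = ProductMemory(name="glass bottle", weight_g=650, fragile=True)
    seen = VisionEstimate(width_mm=70, height_mm=250)
    print(choose_grip(bottle, seen))
```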