Creating a fully autonomous flying robot is a challenging task, and taking a biomimetic approach makes it even harder. For my master's thesis in the Cognitive Systems and Interactive Media MSc at Universitat Pompeu Fabra (UPF) in Barcelona, I had the task of creating vision and movement modules based on existing studies of the insect visual system.
This work led to an extended abstract, published in the proceedings of the 1st Living Machines conference in Barcelona (2012). You can find the study here.
Images: The arena | Thesis Method | In flight
So far, the neuronal simulator iqr can communicate with the AR.Drone quadcopter through the developed modules for the frontal and bottom cameras, the 4-DOF movement commands, and the navigational data output.
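As a rough illustration of the movement side, the AR.Drone accepts AT commands over UDP (port 5556), with the roll/pitch/gaz/yaw arguments encoded as the signed 32-bit integer bit patterns of their float values. The sketch below is not the thesis module itself, just a minimal standalone example of building such a command:

```python
import struct

def f2i(value: float) -> int:
    """Reinterpret a 32-bit float's bit pattern as a signed 32-bit integer,
    as required by the AR.Drone AT command protocol."""
    return struct.unpack("<i", struct.pack("<f", value))[0]

def pcmd(seq: int, roll: float, pitch: float, gaz: float, yaw: float) -> str:
    """Build an AT*PCMD progressive-movement command (flag=1 enables the
    roll/pitch arguments). All four values are normalized to [-1.0, 1.0]."""
    return "AT*PCMD={},1,{},{},{},{}\r".format(
        seq, f2i(roll), f2i(pitch), f2i(gaz), f2i(yaw))

# Example: yaw left at 80% speed (the other axes stay at 0).
cmd = pcmd(1, 0.0, 0.0, 0.0, -0.8)
# To actually move the drone, the string would be sent as a UDP datagram
# to the drone's AT command port (192.168.1.1:5556).
```

The odd-looking integer arguments are simply the IEEE 754 bit patterns of the normalized floats, which is why 0.0 appears as 0 and -0.8 as a large negative integer.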
The models used manage to retrieve information about speed, direction, and looming stimuli from the environment using a single sensor: the camera. This approach obtains basic navigational information and behavior without external sensors and, combined with a proper behavioral layer, can achieve task autonomy.
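For the looming part, the cue used by locust-inspired models is the expansion of an object's apparent size over time. The following is only a crude sketch of that idea, not the thesis model: it thresholds the relative rate of expansion of a tracked object's size across frames (the threshold and frame rate are assumptions for illustration).

```python
def relative_expansion(sizes, dt=1.0):
    """Relative rate of expansion, (1/size) * d(size)/dt, per frame pair.
    A sustained positive value indicates an approaching (looming) object."""
    return [(curr - prev) / (dt * prev) for prev, curr in zip(sizes, sizes[1:])]

def looming_detected(sizes, threshold=0.2, dt=1.0):
    """Trigger when the relative expansion rate exceeds a threshold --
    a crude stand-in for the locust collision-warning (LGMD/DCMD) response."""
    return any(rate > threshold for rate in relative_expansion(sizes, dt))

# An object whose apparent size grows ever faster (approach on a collision
# course) trips the detector; a shrinking (receding) object does not.
approaching = looming_detected([10, 12, 15, 20, 30])
receding = looming_detected([30, 28, 26, 25])
```

In an actual avoidance behavior, the detector's output would drive an escape maneuver (e.g. braking or turning away) rather than just returning a flag.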
The video below demonstrates the functionality of the modules: the drone follows the blue ball (pitch) and moves according to the red one (yaw, gaz). The second part demonstrates the experiments made with the models of the insect visual system (fly, locust).
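A ball-following behavior like the one in the video can be thought of as a simple mapping from the detected ball's image position and apparent size to normalized drone commands. The sketch below illustrates that mapping only; the frame size, target radius, gains, and sign conventions are assumptions for illustration, not the thesis values.

```python
FRAME_W, FRAME_H = 320, 240   # camera resolution (assumed)
TARGET_RADIUS = 40            # desired apparent ball radius in pixels (assumed)

def follow_ball(cx, cy, radius, k_yaw=1.0, k_gaz=1.0, k_pitch=0.01):
    """Map a detected ball (centroid cx, cy; pixel radius) to normalized
    commands in [-1, 1]: horizontal offset -> yaw, vertical offset -> gaz,
    apparent-size error -> pitch (fly forward while the ball looks small)."""
    def clamp(v):
        return max(-1.0, min(1.0, v))
    yaw = k_yaw * (cx - FRAME_W / 2) / (FRAME_W / 2)
    gaz = k_gaz * (FRAME_H / 2 - cy) / (FRAME_H / 2)  # image y grows downward
    pitch = k_pitch * (TARGET_RADIUS - radius)        # positive = forward here
    return clamp(pitch), clamp(yaw), clamp(gaz)

# A ball centered in the frame at the target size yields zero commands;
# a ball to the right of center yields a positive yaw command.
```

In the real system these commands would be recomputed every frame from the color-segmentation output, giving the closed-loop following behavior shown in the video.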