General information:
Position: Master's internship or final-year engineering degree internship
Duration: 4 to 6 months, starting in February 2025 (flexible)
Location: IMT Atlantique, Brest (France)
Affiliation: RAMBO team, Lab-STICC (potential collaboration with LS2N at Centrale Nantes)
Context:
This project will be conducted in the context of the LEASARD project [1], which aims to increase the navigation autonomy of drones, i.e., Unmanned Aerial Vehicles (UAVs), in search and rescue scenarios. Towards this goal, the LEASARD project is dedicated to enhancing sensing and processing capabilities through the integration of event cameras and the use of appropriate deep neural networks to process the data from these cameras.
Description and objectives:
This internship offers the opportunity to work on cutting-edge drone technology, explore innovative multi-modal solutions, and contribute to advancements in visual tracking and autonomous systems. Specifically, it focuses on developing an advanced drone controller capable of selecting optimal policies for following a target. The visual tracking will utilize data from sensors mounted on the drone. The goal is to map this sensor data into precise drone actions, enabling robust and efficient target tracking.
The work is part of the LEASARD project, in which a first solution was proposed using deep reinforcement learning and event cameras [2]. Event cameras, also known as Dynamic Vision Sensors (DVS), were chosen for their unique advantages: low latency, absence of motion blur, energy efficiency, and high dynamic range, all of which make them particularly suited for drone-based video processing [3]. This research demonstrated the effectiveness of event cameras in handling low-light conditions and rapid movements.
Building on this foundation, the objective of this internship is to extend the previous work and develop a more comprehensive system that combines the strengths of event cameras with traditional RGB cameras. While event cameras perform well in challenging lighting and high-speed scenarios, RGB cameras often perform better under normal conditions. The candidate will focus on designing a multi-modal solution that integrates both modalities, harnessing their complementary benefits.
Training is usually conducted in simulated environments to ensure safety and enable extensive exploration. However, the project will also emphasize real-world evaluation of the proposed solution to validate its performance in practical scenarios [4].
Candidate profile:
- The candidate is pursuing the final year of a Master's or engineering degree. The balance between research and development will be determined based on the candidate's profile.
- A strong level of Python programming is required.
- An interest in deep learning frameworks is also required.
- Good oral and written communication skills in English.
How to apply:
Interested candidates are encouraged to send their application (detailed CV and transcripts) as soon as possible to the following address: hajer.fradi@imt-atlantique.fr
References:
[1] LEASARD project, https://project.inria.fr/leasard/
[2] Souissi, A., Fradi, H., & Papadakis, P. (2024). Leveraging Event Streams with Deep Reinforcement Learning for End-to-End UAV Tracking. arXiv preprint arXiv:2410.14685.
[3] Gallego, G., Delbrück, T., Orchard, G., Bartolozzi, C., Taba, B., Censi, A., & Scaramuzza, D. (2020). Event-based vision: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(1), 154-180.
[4] Loquercio, A., Kaufmann, E., Ranftl, R., Dosovitskiy, A., Koltun, V., & Scaramuzza, D. (2019). Deep drone racing: From simulation to reality with domain randomization. IEEE Transactions on Robotics, 36(1), 1-14.
(c) GdR IASIS - CNRS - 2024.