General information:
Position: End-of-study project
Duration: 4 to 6 months, starting in February 2025 (flexible)
Location: IMT Atlantique, Brest (France)
Affiliation: RAMBO team, Lab-STICC
Supervisors: Hajer Fradi and Panagiotis Papadakis
Context:
This project will be conducted in the context of the LEASARD project [1], which aims to enhance the sensing and processing capabilities of drones through the use of event cameras and of appropriate deep neural networks to process the data these cameras produce.
Description and objectives:
This project focuses on action recognition for human-centered applications using event cameras. While action recognition with standard videos has been extensively studied in fields like surveillance, healthcare, and abnormal activity detection [2], RGB-based video analysis raises significant privacy concerns, especially indoors, as frame-based images expose users’ appearance and identity. These challenges hinder the deployment of traditional action recognition methods in real-world applications.
To address these issues, we aim to leverage bio-inspired Dynamic Vision Sensors (DVS), also known as event cameras [3]. Unlike conventional cameras, DVS sensors capture motion with microsecond-level latency, do not record the color and texture features that reveal personal identity, and require less energy and fewer computational resources. Additionally, these sensors have a high dynamic range (140 dB vs. 60 dB for RGB cameras), enabling reliable performance in challenging lighting or weather conditions. Although event cameras are already used in autonomous driving and drone navigation, their potential in human-centered applications remains underexplored.
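For illustration only, the short Python/NumPy sketch below shows one common way of turning an asynchronous event stream into a frame-like voxel grid that can be fed to a neural network. It assumes events arrive as an (N, 4) array of (timestamp, x, y, polarity) tuples; this layout and the function name are assumptions made for the example, not a format imposed by any particular sensor or dataset. Note that the result encodes only motion edges, with no appearance information.

import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    # Accumulate an event stream into a voxel grid of shape (num_bins, H, W).
    # `events` is assumed to be an (N, 4) float array of (t, x, y, polarity),
    # with polarity in {-1, +1} and (x, y) already within the sensor resolution.
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]
    # Normalize timestamps to [0, num_bins) and assign each event to a temporal bin.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1e-6)
    bins = t_norm.astype(int)
    # Signed accumulation of polarities per (bin, pixel) cell.
    np.add.at(voxel, (bins, y, x), p)
    return voxel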
This internship will explore DVS sensors for action recognition by developing an end-to-end architecture that learns a relevant event representation and incorporates temporal information for improved recognition. The candidate will work with the recently published event-based action recognition benchmark THUE-ACT-50, which includes 50 action categories [4].
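As a purely illustrative sketch (not the architecture to be developed during the internship), the PyTorch snippet below shows one possible end-to-end baseline on top of such voxel grids: a small per-bin CNN encoder, a GRU that aggregates temporal information across bins, and a 50-way classifier matching the number of THUE-ACT-50 categories. The layer sizes, input resolution, and class name are assumptions made for the example.

import torch
import torch.nn as nn

class EventActionNet(nn.Module):
    # Illustrative baseline: per-bin CNN features aggregated by a GRU.
    # Input: voxel grids of shape (batch, num_bins, H, W). Each temporal bin is
    # encoded independently, then the sequence of bin features is summarized by
    # a recurrent layer before classification into `num_classes` actions.
    def __init__(self, num_classes=50, feat_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        self.temporal = nn.GRU(feat_dim, feat_dim, batch_first=True)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, voxel):                      # voxel: (B, T, H, W)
        b, t, h, w = voxel.shape
        feats = self.encoder(voxel.reshape(b * t, 1, h, w)).reshape(b, t, -1)
        _, hidden = self.temporal(feats)           # hidden: (1, B, feat_dim)
        return self.classifier(hidden[-1])         # logits: (B, num_classes)

# Example: a batch of 8 voxel grids with 10 temporal bins at 128x128 resolution.
logits = EventActionNet()(torch.randn(8, 10, 128, 128))
print(logits.shape)  # torch.Size([8, 50])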
Candidate profile:
- The candidate is in the final year of a Master's or engineering degree. The balance between research and development will be determined based on the candidate's profile.
- A strong level of Python programming is required.
- An interest in deep learning frameworks (PyTorch) is also required.
- Good oral and written communication skills in English.
How to apply:
Interested candidates are encouraged to send their applications (detailed CV and transcripts) as soon as possible to the following address: hajer.fradi@imt-atlantique.fr
References:
[1] LEASARD project, https://project.inria.fr/leasard/
[2] Gu, F., Chung, M. H., Chignell, M., Valaee, S., Zhou, B., & Liu, X. (2021). A survey on deep learning for human activity recognition. ACM Computing Surveys (CSUR), 54(8), 1-34.
[3] Gallego, G., Delbrück, T., Orchard, G., Bartolozzi, C., Taba, B., Censi, A., ... & Scaramuzza, D. (2020). Event-based vision: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(1), 154-180.
[4] Gao, Y., Lu, J., Li, S., Ma, N., Du, S., Li, Y., & Dai, Q. (2023). Action recognition and benchmark using event cameras. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(12), 14081-14097.