Event-based Optical Flow Prediction with Spiking Neural Networks
Event-based Cameras (EBCs), also known as dynamic vision sensors or neuromorphic cameras, are imaging sensors that operate differently from traditional frame-based cameras. Inspired by the exceptional motion perception abilities of winged insects, these cameras respond asynchronously to brightness or intensity changes with microsecond resolution. For each brightness change, the EBC outputs a discrete packet of information called an "event". EBCs offer significant advantages over traditional frame-based cameras: no motion blur, high dynamic range, high temporal resolution, and low latency. These unique properties make the EBC an ideal sensor for analyzing dynamic scenes characterized by fast motion and rapid changes in lighting conditions. As such, these bio-inspired sensors are the subject of thorough research in the fields of machine learning and robotics, offering fast, efficient, and robust processing for various applications: optical flow, visual odometry, feature tracking, object recognition, and even 3D reconstruction.
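As a minimal sketch of the event representation described above, an event can be modeled as a tuple of timestamp, pixel coordinates, and polarity, and a stream of events can be accumulated into a 2D frame before being fed to a conventional network. The field names, sensor resolution, and sample events below are illustrative assumptions, not the format of any particular camera:

```python
import numpy as np

# Each event is (t, x, y, p): timestamp in microseconds, pixel
# coordinates, and polarity (+1 brightness increase, -1 decrease).
# These sample events and the 16x32 resolution are illustrative.
events = np.array(
    [(100, 12, 7, 1), (150, 12, 8, -1), (420, 30, 5, 1)],
    dtype=[("t", np.int64), ("x", np.int16), ("y", np.int16), ("p", np.int8)],
)

def events_to_frame(events, height, width):
    """Accumulate signed polarities into a 2D frame, one common
    way to present asynchronous events to a frame-based network."""
    frame = np.zeros((height, width), dtype=np.float32)
    # np.add.at handles repeated (y, x) indices correctly,
    # summing all events that fall on the same pixel.
    np.add.at(frame, (events["y"], events["x"]), events["p"])
    return frame

frame = events_to_frame(events, height=16, width=32)
print(frame[7, 12], frame[8, 12])  # 1.0 -1.0
```

Richer representations (e.g., time-binned voxel grids) preserve more of the temporal resolution, but the accumulation idea is the same.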
The ultimate goal of this thesis is to lay the foundation for a well-integrated pipeline that operates EBCs with Spiking Neural Networks (SNNs). We show that EBCs can perform well with neural networks (NNs), more specifically Convolutional Neural Networks (CNNs) and CNN-SNN hybrids that predict optical flow. Ultimately, we want to implement these SNN models on neuromorphic hardware, taking advantage of the asynchronous nature of EBCs and reducing power consumption to a minimum. Moreover, this thesis provides insight into how NN models can operate on event-based data, laying the groundwork for future endeavors in the field of deep learning with EBCs.