Event sensors output a stream of asynchronous brightness changes (called “events”) at a very high temporal rate. Previous works on recovering the lost intensity information from event sensor data have relied heavily on the event stream alone, which makes the reconstructed images nonphotorealistic and susceptible to noise in the event stream. A method is proposed to reconstruct photorealistic intensity images from a hybrid sensor consisting of a low-frame-rate conventional camera and an event sensor. The texture-rich information from the traditional image sensor is combined with the motion-rich information from the event sensor to produce photorealistic high-frame-rate videos. To accomplish this, the low-frame-rate intensity images are warped to the temporally dense locations of the event data. The results obtained from the proposed algorithm are more photorealistic than those of previous state-of-the-art algorithms. The algorithm’s robustness to abrupt camera motion and to noise in the event sensor data is also demonstrated.
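The core step described above, warping a low-frame-rate intensity frame to a temporally dense event timestamp, can be sketched as a flow-based image warp. The sketch below is a generic illustration only: the abstract does not specify the paper's warping model, and the dense flow field, the `warp_image` helper, and the bilinear sampling scheme are all assumptions for the example.

```python
import numpy as np

def warp_image(img, flow):
    """Warp an intensity frame by a dense per-pixel flow field
    (hypothetical sketch; the paper's actual warping model may differ).

    img:  (H, W) float array, one low-frame-rate intensity frame
    flow: (H, W, 2) float array, per-pixel displacement (dx, dy)
          toward the target event timestamp
    """
    h, w = img.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Source coordinates each output pixel samples from, clamped to the image.
    sx = np.clip(xs + flow[..., 0], 0, w - 1)
    sy = np.clip(ys + flow[..., 1], 0, h - 1)
    x0 = np.floor(sx).astype(int)
    y0 = np.floor(sy).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    y1 = np.minimum(y0 + 1, h - 1)
    ax, ay = sx - x0, sy - y0
    # Bilinear interpolation of the four nearest neighbours.
    return ((1 - ax) * (1 - ay) * img[y0, x0]
            + ax * (1 - ay) * img[y0, x1]
            + (1 - ax) * ay * img[y1, x0]
            + ax * ay * img[y1, x1])
```

In a hybrid pipeline of this kind, a flow field estimated from the events between two conventional frames would drive such a warp at each intermediate event timestamp, yielding a high-frame-rate video from the low-frame-rate input.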
CITATIONS
Cited by 31 scholarly publications.
KEYWORDS: Sensors, Image sensors, Image restoration, Reconstruction algorithms, Cameras, Video, Optical flow