20 Billion FPS camera? HOLY MOSES!
The system devised by PhD student Genevieve Gariepy and colleagues at Heriot-Watt University is doing the same basic thing, but in air and much, much faster. Air scatters far less light than fog or smoke would, so the camera used to capture the video below needed to be extremely sensitive. It's composed of a 32×32 pixel grid of single-photon avalanche diode (SPAD) detectors. This type of CMOS sensor was chosen because it has very high temporal resolution, meaning the detectors can time-stamp each arriving photon precisely enough that the image data can be accurately matched to the pulse's actual position in flight.
So how accurate is the SPAD camera system? It has a temporal resolution of 67 picoseconds, or 0.000000000067 seconds. Looking at it another way, it can capture about 20 billion frames per second. That's good enough to pinpoint a single photon to within a few centimeters. When you're talking about light, which is literally the fastest thing in the universe, that's fantastically impressive. The camera is also synchronized with the laser emitter, so it knows the exact moment each pulse is produced.
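The "few centimeters" figure follows directly from the timing: in one 67-picosecond resolution window, light only travels about two centimeters. A quick back-of-the-envelope check (assuming the usual approximation c ≈ 3×10⁸ m/s):

```python
# Sanity check on the article's numbers; c is the standard approximate
# value for the speed of light, not a figure from the camera itself.
C = 3.0e8                # speed of light, m/s (approximate)
resolution_s = 67e-12    # the SPAD camera's 67 ps temporal resolution

# How far does light travel during one resolution window?
distance_m = C * resolution_s
print(f"{distance_m * 100:.1f} cm per 67 ps window")  # ≈ 2.0 cm
```

So a photon's position along the beam really can be nailed down to roughly the width of a golf ball, frame by frame.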
The video of the laser bouncing shows a bit of haze around the beam, but that's a consequence of the method used to capture it. Even though the SPAD camera is extremely sensitive, each pulse only scatters a few photons toward the sensor. The video is actually a composite of 2 million pulses fired over the course of 10 minutes: the team accumulated all that image data and subtracted the background. The result is the video you see above.
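The accumulate-and-subtract idea is straightforward to illustrate. The toy sketch below is not the team's actual pipeline; it just shows how a faint signal of a few hundredths of a photon per pixel per pulse, invisible in any single exposure, emerges once you sum enough pulses and remove the mean background. All the numbers (beam brightness, background level, pulse count) are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 32x32 scene: a faint diagonal "beam" buried in background counts.
# Values are illustrative, not measured from the real experiment.
beam = np.zeros((32, 32))
np.fill_diagonal(beam, 0.05)   # ~0.05 scattered photons/pixel/pulse on the beam
background = 1.0               # mean background counts per pixel per pulse

n_pulses = 200_000

# The sum of n independent Poisson exposures is Poisson with n times the
# mean, so one draw here is statistically equivalent to accumulating
# n_pulses individual frames.
accumulated = rng.poisson((beam + background) * n_pulses)

# Average per pulse, then subtract the known mean background.
estimate = accumulated / n_pulses - background
print(estimate.diagonal().mean())   # close to 0.05: the beam stands out
```

In a single pulse the beam is 20x dimmer than the background noise; after 200,000 pulses the shot noise averages down and the diagonal pops out cleanly.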