23 July 2023

Interesting Tech

It turns out that animals, including to a lesser extent humans, don't constantly use all of the light sensors at the back of the eye.

Specifically, the rods and cones (or whatever insects have in their eyes) only send a signal when there is a change in brightness, which is why dragonflies, with their tiny brains and even smaller eyes, do so well at catching mosquitoes.

People are developing neuromorphic cameras that maximize performance while using a smaller number of light sensors. (Paid subscription required)

The idea is to put a tiny amount of intelligence in each pixel, which then sends data only when something changes:
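Roughly, in code — a minimal sketch of my own, not the actual sensor logic: each pixel remembers the last brightness it reported and fires an "event" only when the change crosses a threshold.

```python
import numpy as np

def events_from_frames(frames, threshold=0.2):
    """Emulate an event-camera pixel array on ordinary video frames.

    frames: iterable of 2-D numpy arrays of positive intensities.
    Yields (frame_index, x, y, polarity) only for pixels whose
    log-brightness changed by more than `threshold`.
    """
    reference = None
    for t, frame in enumerate(frames):
        log_frame = np.log(frame + 1e-6)  # real event pixels respond to log intensity
        if reference is None:
            reference = log_frame.copy()
            continue
        diff = log_frame - reference
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else -1  # brighter = ON, darker = OFF
            yield (t, x, y, polarity)
            reference[y, x] = log_frame[y, x]  # a pixel resets only when it fires
```

A completely static scene produces no events at all, which is exactly the bandwidth win described below.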

When a dragonfly hunts a mosquito, it does not chase the insect; the four-winged arthropod calculates where the bloodsucker is going and intercepts it. Dragonflies have a catch rate of more than 95%—one of the highest in the animal kingdom.

“It does that with a tiny brain, terrible visual system and a microwatt of power,” says Gregory Cohen, deputy director of the International Center for Neuromorphic Systems at Western Sydney University (WSU) in Australia. “That’s a problem we really struggled to deal with using full supercomputers, graphics processing units and all this stuff we have right now. So clearly the dragonfly is doing it the easy way.”

Dragonflies and other living organisms, including humans, are the inspiration for the U.S. Air Force Research Laboratory’s (AFRL) Falcon Neuro, a pair of experimental neuromorphic cameras that have been flying mounted on the outside of the International Space Station (ISS) since 2022. The laboratory calls the mission the first demonstration of neuromorphic cameras in space.

Whereas a conventional video camera continuously records video using all its pixels, a neuromorphic camera—also known as an event camera—records data only from pixels that sense a change in light. The method echoes how human eyes transmit more signals to the brain in response to changes but fewer for static scenery.

………
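The bandwidth payoff is easy to see with rough numbers (mine, not the article's; the 240x180 resolution matches the DAVIS-class sensor mentioned below, and the event rate is just a guess for a mostly static scene):

```python
# Full frames vs. sparse events, back of the envelope:
width, height, fps = 240, 180, 1000      # DAVIS 240C-class sensor at a high-speed rate
pixels_per_sec = width * height * fps    # 43,200,000 values/s as conventional video

events_per_sec = 50_000                  # assumed rate for a mostly static scene
print(pixels_per_sec / events_per_sec)   # ~864x fewer values to read out
```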

Falcon Neuro’s cameras are based on Inivation’s Davis 240C, customized by WSU and controlled by electronics developed by U.S. Air Force Academy cadets. From their perch on the ISS, the neuromorphic cameras’ primary mission has been recording lightning and sprites streaking across the planet’s atmosphere.

“If you happen to have a normal camera that takes frames, you have to be really lucky to catch [a photo of lightning],” Cohen says. “Our camera sensor can be really fast, [with] really low power and a really low data rate.”

The Falcon Neuro camera captures events that occur in fractions of a second, such as a lightning flash, across relatively long periods of time.

“Exactly how fast you can render the data depends on how much contrast there is between the target and the background,” says Matthew McHarg, principal investigator for Falcon Neuro and director of the Air Force Academy’s Space Physics and Atmospheric Research Center. “Lightning has a lot of contrast, allowing Neuro to render at 1,000 [frames per second].”

A comparable high-speed camera might capture only seconds of footage before running out of memory and could cost hundreds of thousands of dollars, Cohen says. “You get the benefit of a high-speed camera without all the costs,” he explains.
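Since every event carries its own timestamp, "rendering at 1,000 fps" amounts to choosing a bin width after the fact and accumulating events into frames. A hypothetical sketch (my own binning code, not the Falcon Neuro pipeline):

```python
import numpy as np

def render_frames(events, width, height, fps=1000):
    """Turn an event stream into frames after the fact.

    events: iterable of (t_seconds, x, y, polarity) tuples.
    Returns an array of shape (n_frames, height, width) where each
    frame sums the events that fell into its 1/fps time bin.
    """
    events = sorted(events)  # order by timestamp
    if not events:
        return np.zeros((0, height, width), dtype=np.int32)
    n_frames = int(events[-1][0] * fps) + 1
    frames = np.zeros((n_frames, height, width), dtype=np.int32)
    for t, x, y, polarity in events:
        frames[int(t * fps), y, x] += polarity  # accumulate into this time bin
    return frames
```

The same stream can be re-binned later at 100 fps or 10,000 fps, which is what makes the "high-speed camera without all the costs" claim work.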

This is a fascinating, and very low-cost, application of parallelism: every pixel makes its own trivially simple decision about whether to report.
