DARPA Taps Teams to Develop Neuromorphic Imagers

DARPA has selected three research teams, led by Raytheon, BAE Systems, and Northrop Grumman, to develop event-based infrared camera technologies under the Fast Event-Based Neuromorphic Camera and Electronics (FENCE) program.

Event-based, or neuromorphic, cameras are an emerging class of sensors with demonstrated advantages relative to traditional imagers. The systems operate asynchronously and only transmit information about pixels that have changed, meaning that they produce significantly less data and operate with much lower latency and power.
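The data savings are easy to see in a toy model. The Python sketch below is purely illustrative (the logarithmic-contrast threshold, frame size, and pixel values are assumptions, not FENCE specifications): it compares two frames and emits events only for pixels whose brightness change crosses a threshold, so a largely static scene produces a handful of events rather than a full frame of data.

```python
import numpy as np

def frames_to_events(prev_frame, curr_frame, threshold=0.15):
    """Emit events only for pixels whose log-intensity change exceeds a threshold.

    Returns a list of (row, col, polarity) tuples -- a sparse representation
    of scene change, rather than a full frame of pixel values.
    """
    eps = 1e-6
    # Event sensors respond to relative brightness change, so compare log intensities.
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)

    on = np.argwhere(delta > threshold)    # brightness increased
    off = np.argwhere(delta < -threshold)  # brightness decreased

    return [(r, c, +1) for r, c in on] + [(r, c, -1) for r, c in off]

# Example: a mostly static scene yields very few events.
rng = np.random.default_rng(0)
prev = rng.random((240, 320))
curr = prev.copy()
curr[100:110, 150:160] *= 1.5          # a small bright object appears in one patch

events = frames_to_events(prev, curr)
print(f"{len(events)} events vs. {prev.size} pixels per full frame")
```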

“Neuromorphic refers to silicon circuits that mimic brain operation — they offer sparse output, low latency, and high energy efficiency,” said Whitney Mason, program manager leading the FENCE program. “Event-based cameras operate under these same principles when dealing with sparse scenes, but currently lack advanced ‘intelligence’ to perform more difficult perception and control tasks.”

Today’s state-of-the-art event-based cameras work well when imaging scenes with relatively few changes to track. However, the systems have a difficult time with cluttered and dynamic scenes, which limits their use in military applications.

The FENCE project seeks to address those shortcomings by developing and demonstrating a low-latency, low-power, event-based IR focal plane array (FPA) and a new class of digital signal processing and learning algorithms to enable intelligent sensors that can handle more dynamic and complex scenes. The three teams will work to develop an asynchronous read-out integrated circuit (ROIC) with low latency, as well as a processing layer that integrates with the ROIC to identify relevant spatial and temporal signals. Together the ROIC and processing layer will enable an integrated FENCE sensor to operate on less than 1.5 W.

“The goal is to develop a ‘smart’ sensor that can intelligently reduce the amount of information that is transmitted from the camera, narrowing down the data for consideration to only the most relevant pixels,” Mason said.
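A rough sense of what that narrowing might look like is given by the sketch below. It is a hypothetical density filter, not an algorithm DARPA or the performers have described, and the neighborhood radius and neighbor-count threshold are arbitrary assumptions. It keeps only event pixels surrounded by other recent activity and drops isolated firings before they would be transmitted off the sensor.

```python
import numpy as np

def keep_relevant_events(events, shape, min_neighbors=4, radius=2):
    """Discard isolated events, keeping only pixels near clusters of recent activity.

    `events` is a list of (row, col) positions from one readout window.
    This density filter stands in for the on-chip processing layer described in
    the article; actual FENCE designs would use learned, more capable filters.
    """
    activity = np.zeros(shape, dtype=np.int32)
    for r, c in events:
        activity[r, c] += 1

    kept = []
    for r, c in events:
        r0, r1 = max(0, r - radius), min(shape[0], r + radius + 1)
        c0, c1 = max(0, c - radius), min(shape[1], c + radius + 1)
        # Count neighboring events; isolated events are likely sensor noise.
        if activity[r0:r1, c0:c1].sum() - 1 >= min_neighbors:
            kept.append((r, c))
    return kept

# Example: a compact 3x3 blob of activity is kept; scattered single events are dropped.
blob = [(50 + dr, 60 + dc) for dr in range(3) for dc in range(3)]
noise = [(10, 10), (200, 300), (120, 5)]
kept = keep_relevant_events(blob + noise, shape=(240, 320))
print(f"kept {len(kept)} of {len(blob) + len(noise)} events")
```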

The technology could aid military applications such as autonomous vehicles, robotics, IR search and tracking, and others. To ensure broad applicability, the researchers will also focus on developing a single solution that is flexible and adaptable so it can be used across various mission spaces.


Photonics Spectra
Sep 2021
GLOSSARY
machine vision
Interpretation of an image of an object or scene through the use of optical noncontact sensing mechanisms for the purpose of obtaining information and/or controlling machines or processes.
artificial intelligence
The ability of a machine to perform certain complex functions normally associated with human intelligence, such as judgment, pattern recognition, understanding, learning, planning and problem solving.
