

Bioengineers Aim for ‘Visual Cortex on Silicon’

A multi-institutional group aims to create a machine vision system that approaches the cognitive abilities of the human brain. Such a system would enable computers to not only record images, but also to understand visual content at up to a thousand times the efficiency of current technologies.

The human vision system understands and interprets complex scenes for a wide range of visual tasks in real time while consuming less than 20 W of power. Smart machine vision systems that understand and interact with their environments could have a profound impact on society, including applications in aids for the visually impaired, driver assistance in automobiles and augmented reality systems.


The human visual cortex will inspire the design of smart cameras that can aid the visually impaired, support augmented reality and assist drivers. The team draws from eight institutions and experts in multiple fields of computing. Courtesy of Vijaykrishnan Narayanan, Pennsylvania State University.


While several machine vision systems today can successfully perform one or a few human tasks — such as detecting human faces in point-and-shoot cameras — they are still limited in their ability to perform a wide range of visual tasks, operate in complex, cluttered environments, and provide reasoning for their decisions. In contrast, the visual cortex in mammals excels in a broad variety of goal-oriented cognitive tasks, and is at least three orders of magnitude more energy-efficient than customized state-of-the-art machine vision systems.

The five-year, $10 million National Science Foundation (NSF) project Visual Cortex on Silicon is one of two awards announced by the NSF and funded through its Expeditions in Computing program.


Bioengineering professor Gert Cauwenberghs’ laboratory developed a neuromorphic circuit array that models computation and communication across large-scale networks in the visual cortex in order to understand how the brain receives and processes visual information. Each chip in the array mimics the activity of 65,000 neurons whose 65 million synaptic connections are stored in memory. As a result, the research team can change these connections simply by editing entries in the memory tables, allowing great detail and flexibility in studying the circuit dynamics of cortical vision. Courtesy of UCSD Jacobs School of Engineering.
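
The chip itself is custom neuromorphic hardware, but the underlying idea of keeping synaptic connectivity in an editable memory table can be sketched in ordinary software. The short Python example below is a hypothetical illustration, not the UCSD design: a weight table indexed by presynaptic and postsynaptic neuron stands in for the chip's on-chip memory, the neuron count is scaled down from 65,000, and the threshold rule is an arbitrary stand-in for the chip's neuron dynamics.

import numpy as np

# Hypothetical sketch: synaptic connectivity held as an editable table in memory.
# Row = presynaptic neuron, column = postsynaptic neuron.
N_NEURONS = 1024  # the actual chips model 65,000 neurons each

rng = np.random.default_rng(0)
weights = np.zeros((N_NEURONS, N_NEURONS), dtype=np.float32)

def connect(pre, post, weight):
    """'Rewire' the network by writing a new entry into the table."""
    weights[pre, post] = weight

def disconnect(pre, post):
    """Remove a synapse by clearing its table entry."""
    weights[pre, post] = 0.0

def step(activity):
    """One update: propagate activity through the current table,
    then apply a simple threshold nonlinearity."""
    drive = activity @ weights
    return (drive > 1.0).astype(np.float32)

# Example: randomly wire a small fraction of possible synapses, then run one step.
pre_idx = rng.integers(0, N_NEURONS, size=10_000)
post_idx = rng.integers(0, N_NEURONS, size=10_000)
weights[pre_idx, post_idx] = rng.uniform(0.0, 0.5, size=10_000)

activity = (rng.random(N_NEURONS) < 0.05).astype(np.float32)
print(int(step(activity).sum()), "neurons active after one step")

In this toy version, changing an entry of the weight table plays the role of editing a memory-table entry on the chip: a different connectivity pattern can be explored without any change to the underlying hardware.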


Vijaykrishnan Narayanan, professor of computer science and engineering and electrical engineering at Pennsylvania State University, is the project’s lead principal investigator. Collaborating institutions include the University of California, San Diego (UCSD); the University of Southern California (USC); Stanford University; York College of Pennsylvania; UCLA; the University of Pittsburgh; and MIT.

“We have already been collaborating with colleagues at USC and MIT in developing smart camera systems for the past five years and demonstrated vision systems that operate with two to three orders of magnitude better energy efficiency than existing approaches,” Narayanan said. “With this expedition, we are aiming to leapfrog the intelligence of these vision systems to approach human cognitive capabilities, while being extremely energy-efficient and user friendly.”
 
The expedition seeks to understand the fundamental mechanisms used in the visual cortex, with the hope of enabling the design of new vision algorithms and hardware fabrics that can improve power, speed, flexibility and recognition accuracies relative to existing machine vision systems. The interdisciplinary effort covers several domains, including neuroscience, computer vision, hardware design, new device technology, human-computer interface, data analytics and privacy.


A micrograph of a computer chip designed in the laboratory of professor Gert Cauwenberghs that emulates how the brain processes visual information. Courtesy of UCSD Jacobs School of Engineering.


Bioengineering professor Gert Cauwenberghs of the UCSD Jacobs School of Engineering said the project offers a unique collaborative opportunity with global experts in neuroscience, computer science, nanoengineering and physics. Cauwenberghs and his team are currently developing computer chips that emulate how the brain processes visual information.

“The brain is the gold standard for computing,” Cauwenberghs said, adding that computers work completely differently from the brain, acting as passive processors that work through information and problems using sequential logic. The human brain, by comparison, processes information by sorting through complex input from the world and extracting knowledge without direction.

By developing chips that can function more like the human brain, Cauwenberghs believes researchers can achieve a number of significant breakthroughs in our understanding of brain function, from the work of single neurons all the way up to a more holistic view of the brain as a system. For example, building chips that model different aspects of brain function, such as how the brain processes visual information, gives researchers a more robust tool to understand where problems arise that contribute to disease or neurological disorders.

For more information, visit www.cse.psu.edu/research/visualcortexonsilicon.expedition or www.jacobsschool.ucsd.edu


