Vision sensors that can mimic the human eye are a game changer for future autonomous vehicles, industries

HONG KONG

Future autonomous vehicles and industrial cameras might have human-like vision: researchers at The Hong Kong Polytechnic University (PolyU) and Yonsei University in Seoul have developed vision sensors that can emulate, and even surpass, the human retina’s ability to adapt to varying light levels.

The bioinspired vision sensors developed by Dr Chai’s team can adapt to varying brightness with an effective range of up to 199 dB. The human retina, by comparison, can adapt to environments from sunlight to starlight, a range of about 160 dB.

Machine vision systems are cameras and computers that capture and process images for tasks such as facial recognition.

These cameras and computers need to be able to “see” objects in a wide range of lighting conditions, which demands intricate circuitry and complex algorithms. Such systems are rarely efficient enough to process a large volume of visual information in real time—unlike the human brain.

The new bioinspired sensors may offer a solution by adapting to different light intensities directly at the sensor, instead of relying on back-end computation.

The human eye adapts to different levels of illumination, from very dark to very bright and vice versa, allowing objects to be identified accurately under a range of lighting conditions. The new sensors are designed to mimic this adaptability.

“The human pupil may help adjust the amount of light entering the eye,” explains Dr Chai.

“But the main adaptation to brightness is performed by retina cells.”

Natural light intensity spans a vast range of 280 dB. Impressively, the new sensors have an effective range of up to 199 dB, compared with only 70 dB for conventional silicon-based sensors.
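To put these decibel figures in perspective, they can be converted into linear brightness ratios. This is a minimal sketch assuming the common image-sensor convention DR(dB) = 20 · log10(Imax / Imin); the researchers may use a different definition:

```python
def db_to_ratio(db: float) -> float:
    """Convert a dynamic range in decibels to a linear brightness ratio,
    assuming the image-sensor convention DR(dB) = 20 * log10(Imax / Imin)."""
    return 10 ** (db / 20)

# Illustrative comparison of the ranges quoted in the article:
for label, db in [("natural light", 280),
                  ("bioinspired sensor", 199),
                  ("human retina", 160),
                  ("conventional silicon sensor", 70)]:
    print(f"{label:28s} {db:3d} dB = {db_to_ratio(db):.1e} : 1")
```

Under this convention, 160 dB corresponds to a brightest-to-dimmest ratio of 100 million to one, while 70 dB is only about 3,200 to one, which illustrates how much wider the new sensors' adaptation range is than that of conventional silicon devices.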

To achieve this, the research team developed light detectors, called phototransistors, using a dual layer of atomic-level ultrathin molybdenum disulphide, a semiconductor with unique electrical and optical properties.

The researchers then introduced “charge trap states”—impurities or imperfections in a solid’s crystalline structure that restrict the movement of charge—to the dual layer.

“These trap states enable the storage of light information and dynamically modulate the optoelectronic properties of the device at the pixel level,” the researchers said.

By controlling the movement of electrons, the trap states enabled the researchers to precisely adjust the amount of electricity conducted by the phototransistors. This in turn allowed them to control the device’s photosensitivity, or its ability to detect light.

Each of the new vision sensors is made up of arrays of such phototransistors mimicking the rod and cone cells of the human eye, which are respectively responsible for detecting dim and bright light.

As a result, the sensors can detect objects in differently lit environments as well as switch between, and adapt to, varying levels of brightness—with an even greater range than the human eye.

“The sensors reduce hardware complexity and greatly increase the image contrast under different lighting conditions,” said Dr Chai, “thus delivering high image recognition efficiency.”

These novel bioinspired sensors could usher in the next generation of artificial-vision systems used in autonomous vehicles and manufacturing, as well as find exciting new applications in edge computing and the Internet of Things. (Robert Greene/Media OutReach)

The research was published in Nature Electronics.
