Through the eyes of a cat – biomimicry of feline eyes may revolutionize robotic vision

News-Medical

Inspired by the remarkable vision of cats, researchers have created a new artificial vision system that allows robots to detect and track objects even in challenging environments, breaking new ground in robotics and autonomous systems.

In a recent study published in the journal Science Advances, researchers leveraged crucial aspects of feline eyes, particularly their tapetum lucidum and vertically elongated pupils (VP), to develop a monocular artificial vision system capable of hardware-level object detection, recognition, and camouflage-breaking. While software-aided implementations of object recognition and tracking have been attempted, they impose substantial energy and computational demands, necessitating hardware-level innovations.

Cats have a tapetum lucidum, a reflective layer behind the retina that enhances their ability to see in low light, enabling them to hunt at dawn and dusk.

The present vision system uses a custom slit-like elliptical aperture, inspired by cats' VPs, to sharpen focus on the target while creating an asymmetric depth of field that improves contrast between the target object and its background. An additional tapetum lucidum-inspired silicon photodiode array with patterned metal reflectors enhances low-light vision. Together, these advancements open the door to a new generation of mobile robots that can detect, recognize, and track targets with significantly improved accuracy, even in dynamically changing environments with variable lighting conditions.
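
To make the asymmetric depth-of-field idea concrete, the sketch below applies a textbook thin-lens defocus formula separately along each aperture axis. The focal length, aperture widths, and distances are illustrative assumptions, not values from the study.

```python
# Minimal sketch (not the paper's model): how a slit-like elliptical aperture
# yields an asymmetric depth of field compared with a circular aperture.
# Blur-spot size on the sensor is estimated with the thin-lens defocus formula,
# applied independently along each aperture axis. All numbers are illustrative.

def blur_width_mm(aperture_mm, focal_mm, focus_m, object_m):
    """Approximate defocus blur width on the sensor (mm) along one aperture axis."""
    magnification = focal_mm / (focus_m * 1000 - focal_mm)  # magnification at the focus distance
    return aperture_mm * magnification * abs(object_m - focus_m) / object_m

focal_mm = 25.0        # hypothetical lens focal length
focus_m = 2.0          # camera focused on a target 2 m away
background_m = 6.0     # camouflaged background 6 m away

# Circular pupil (CP): one aperture width for both axes.
cp = blur_width_mm(8.0, focal_mm, focus_m, background_m)

# Vertical-slit pupil (VP): wide vertically, narrow horizontally.
vp_vertical = blur_width_mm(8.0, focal_mm, focus_m, background_m)
vp_horizontal = blur_width_mm(1.5, focal_mm, focus_m, background_m)

print(f"CP background blur:             {cp:.3f} mm (isotropic)")
print(f"VP background blur, vertical:   {vp_vertical:.3f} mm")
print(f"VP background blur, horizontal: {vp_horizontal:.3f} mm")
# The wide vertical axis keeps the background strongly defocused (helping separate
# it from the in-focus target), while the narrow axis preserves a deep depth of field.
```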

Background

The 21st century has witnessed unprecedented advancements in robotics and automation, driving a gradual influx of robots into scientific, medical, industrial, and military applications. While software-based machine learning (ML) and artificial intelligence (AI) deployments have revolutionized robotic automation, hardware-level progress remains constrained by conventional design and fabrication choices.

Vision-based operation strategies are a prime example. Conventional image-capturing devices (e.g., cameras) are optimized to record image data (e.g., light intensity, color, and object shape) but require user input to adjust aperture size and exposure duration to keep target objects in focus under dynamically changing lighting. Modern robotics applications, particularly those used for surveillance, cannot rely on passive image data acquisition alone. Instead, they need to extract and analyze image data in real time and use this information to guide their subsequent motion.
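
As a rough illustration of that adjustment burden, the toy feedback loop below shows how exposure time must be continually corrected as scene brightness changes. This is not code from the study; the target level, gain, and exposure limits are arbitrary assumptions.

```python
# Illustrative sketch of the exposure correction a conventional camera pipeline
# needs under changing light (either by hand or in software). Exposure time is
# nudged so the mean pixel value stays near a mid-grey target.
import numpy as np

def auto_exposure_step(frame, exposure_ms, target=0.45, gain=0.5,
                       min_ms=0.1, max_ms=50.0):
    """Return an updated exposure time based on the current frame's mean brightness."""
    mean_level = float(np.mean(frame))                 # frame normalized to [0, 1]
    error = target - mean_level
    new_exposure = exposure_ms * (1.0 + gain * error)  # proportional correction
    return float(np.clip(new_exposure, min_ms, max_ms))

# Example: a suddenly bright scene (mean ~0.9) shortens the exposure.
bright_frame = np.full((480, 640), 0.9)
print(auto_exposure_step(bright_frame, exposure_ms=10.0))  # -> ~7.75 ms
```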

Software-based computer vision technologies, including high dynamic range (HDR) imaging, binocular vision-based camouflage-breaking, and AI-assisted post-processing, have partially addressed the hardware limitations of today's robotics implementations. Unfortunately, these technologies demand substantial computation and power, increasing the size and running costs of the resulting robotic systems. The future of robotic automation therefore depends on hardware capable of unassisted object identification, camouflage-breaking, and reliable performance across a wide range of lighting conditions.

About the study

In the present study, researchers developed and tested an artificial vision system that mimics the feline eye. The system comprises two main components: a custom-made optical lens whose aperture can vary from a narrow elliptical slit to a fully open circle, and a novel hemispherical silicon photodiode array with patterned metal (silver) reflectors (HPA-AgR).

The photodiode array was fabricated by spin-coating a polyamic acid solution onto a silicon dioxide (SiO₂) wafer to form an ultrathin polyimide (PI) layer, onto which patterned silver reflectors (100 nm Ag) were defined by wet etching. The structured reflectors were designed to mimic the light-reflecting properties of the tapetum lucidum, enhancing light absorption under dim lighting conditions. The performance of the resulting photodiodes was measured using a halogen lamp (3,100 K color temperature), a probe station (image sensor array plus semiconductor device analyzer), and a data acquisition (DAQ) board.

Monte Carlo-based ray tracing was used to evaluate the camouflage-breaking performance of the feline-inspired vision system against that of conventional optical systems (circular pupil [CP]) under variable lighting conditions of 0 to 500 lumens.
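
The study's simulator is not reproduced here, but a minimal Monte Carlo ray-tracing sketch of the same comparison might look like the following: rays from a defocused background point are sampled through a circular versus a slit-like aperture, and their spread on the sensor plane is measured. All optical parameters are illustrative assumptions under a simple thin-lens model.

```python
# Minimal Monte Carlo ray-tracing sketch (illustrative, not the study's simulator):
# rays from a defocused on-axis background point are sampled through a circular
# pupil (CP) and a vertical-slit elliptical pupil (VP), and their spread on the
# sensor plane is compared. A larger spread means a more strongly blurred background.
import numpy as np

rng = np.random.default_rng(0)

def sensor_spread(aperture_x_mm, aperture_y_mm, focal_mm=25.0,
                  focus_m=2.0, object_m=6.0, n_rays=100_000):
    """RMS x/y landing spread (mm) on the sensor for rays from a point at object_m."""
    # Rejection-sample ray origins uniformly inside the elliptical aperture.
    x = rng.uniform(-aperture_x_mm / 2, aperture_x_mm / 2, n_rays)
    y = rng.uniform(-aperture_y_mm / 2, aperture_y_mm / 2, n_rays)
    inside = (x / (aperture_x_mm / 2)) ** 2 + (y / (aperture_y_mm / 2)) ** 2 <= 1.0
    x, y = x[inside], y[inside]

    # Thin-lens image distances for the focus plane and for the background point.
    d_sensor = 1.0 / (1.0 / focal_mm - 1.0 / (focus_m * 1000))
    d_image = 1.0 / (1.0 / focal_mm - 1.0 / (object_m * 1000))

    # Each ray converges toward the background point's image; evaluate it at the sensor.
    scale = 1.0 - d_sensor / d_image
    return float(np.std(x * scale)), float(np.std(y * scale))

print("CP (8 mm circle):     x/y spread =", sensor_spread(8.0, 8.0))
print("VP (1.5 x 8 mm slit): x/y spread =", sensor_spread(1.5, 8.0))
```

In this toy setup, the slit aperture produces the same vertical blur as the circle but far less horizontal blur, which is the asymmetry the study exploits.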

Study findings

While monocular CP systems (including human eyes) struggle to differentiate between a target object and its background in extremely bright scenarios (owing to pixel saturation), the asymmetrical aperture of feline eyes (the feline VP), and by extension the current vision system, can adjust focus between the tangential and sagittal planes, substantially offsetting the excess light intensity and enabling camouflage-breaking.

Potential applications include unmanned vehicles, surveillance robots, and military drones, where the ability to detect and track targets under variable lighting could greatly improve operational success in dynamic and unpredictable environments.

The design also allows for improved focus on objects at different distances, further reducing optical noise from background elements. Comparisons between the current VP-inspired system and conventional CP-like systems highlight the latter's lack of camouflage-breaking ability, especially in bright-light conditions. The VP system's advantage stems largely from the astigmatism its elliptical aperture introduces between the tangential and sagittal planes, which blurs the background while keeping the target in focus. As a result, the VP system could readily distinguish the target from its background irrespective of ambient light intensity. Furthermore, while 'locked on' to a target, the vision system's design blurs out the target's background, reducing uninformative noise and thereby lowering the computational burden of real-time analysis.
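
A toy example of how optical background blur translates into computational savings is sketched below; the sharpness measure, patch size, and threshold are illustrative choices, not taken from the paper.

```python
# Illustrative sketch (not from the paper) of why an optically blurred background
# lowers downstream compute: a simple sharpness map keeps only in-focus patches,
# so later recognition stages touch far fewer pixels.
import numpy as np

def sharp_fraction(image, patch=16, threshold=1e-3):
    """Fraction of patches whose Laplacian variance exceeds a sharpness threshold."""
    # Discrete Laplacian as a crude focus measure.
    lap = (-4 * image[1:-1, 1:-1] + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    h, w = lap.shape
    kept = total = 0
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            total += 1
            if lap[i:i + patch, j:j + patch].var() > threshold:
                kept += 1
    return kept / total

# Example: a sharp, textured target on an optically blurred (smooth) background
# leaves only a small fraction of patches for downstream recognition.
rng = np.random.default_rng(1)
frame = np.full((256, 256), 0.5)                     # blurred, featureless background
frame[96:160, 96:160] = rng.uniform(0, 1, (64, 64))  # in-focus, textured target region
print(f"Patches passed to recognition: {sharp_fraction(frame):.0%}")
```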

Similarly, while monocular CP systems achieve strong camouflage-breaking performance in low-light conditions (with wide-open pupils), they often suffer from low photocurrent in dark scenarios. Feline eyes, and the current artificial optics, circumvent this limitation by not only fully dilating their VPs but also using the tapetum lucidum (or, in the artificial case, the patterned metal reflectors) to reflect unabsorbed light back onto the photoreceptors (or photodiodes), further enhancing low-light target acquisition. Notably, comparisons between conventional CP optics and the current VP-inspired system revealed that the novel design is 52-58% more efficient at photoabsorption than traditional technologies.
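
As a back-of-the-envelope illustration of why a back reflector helps, the sketch below uses a simple two-pass Beer-Lambert model with made-up absorption and thickness values; it is not the study's device model and does not reproduce the reported 52-58% figure.

```python
# Toy model (not the paper's): light that is not absorbed on the first pass through
# a thin photodiode is reflected by a back mirror and gets a second chance to be
# absorbed. Higher-order bounces and interface losses are ignored.
import math

def absorption(alpha_per_um, thickness_um, back_reflectance=0.0):
    """Fraction of incident light absorbed, with an optional single reflected pass."""
    transmitted = math.exp(-alpha_per_um * thickness_um)
    first_pass = 1.0 - transmitted
    second_pass = back_reflectance * transmitted * first_pass
    return first_pass + second_pass

alpha = 0.1      # hypothetical absorption coefficient (1/um)
thickness = 3.0  # hypothetical thin silicon layer (um)

plain = absorption(alpha, thickness)
with_ag = absorption(alpha, thickness, back_reflectance=0.95)  # silver mirror behind the diode
print(f"Without reflector: {plain:.1%} absorbed")
print(f"With Ag reflector: {with_ag:.1%} absorbed "
      f"({(with_ag / plain - 1):.0%} relative gain)")
```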

Despite these advances, the researchers noted one primary limitation of their system: its narrow field of view (FoV). Innovations in optic system movement (possibly inspired by the movements of cat heads) will be needed before these systems can be integrated into autonomous robotics.

Conclusions

The present study reports the development and validation of a novel, feline eye-inspired vision system. The system pairs a variable-aperture lens with a hemispherical silicon photodiode array backed by patterned silver reflectors to achieve unprecedented hardware-level object tracking and camouflage-breaking irrespective of ambient light intensity. While the vision system currently suffers from a narrow FoV, advances in robotic movement may allow its integration into autonomous robots, enabling a new generation of unmanned surveillance and tracking systems.

Journal reference: