
Researchers create the first artificial vision system for both land and water

Written by  Friday, 05 August 2022 10:32
Boston MA (SPX) Aug 05, 2022

Giving our hardware sight has empowered a host of applications in self-driving cars, object detection, and crop monitoring. But unlike animals, synthetic vision systems can't simply evolve in natural habitats. Dynamic visual systems that can navigate both land and water, therefore, have yet to power our machines - leading researchers from MIT, the Gwangju Institute of Science and Technology (GIST), and Seoul National University in Korea to develop a novel artificial vision system that closely replicates the vision of the fiddler crab and can tackle both terrains.

The semi-terrestrial species - known affectionately as the calling crab, as it appears to be beckoning with its huge claws - has amphibious imaging ability and an extremely wide field of view; current artificial systems, by contrast, are limited to a hemispherical view. The new artificial eye, resembling a small, largely nondescript black sphere, makes sense of its inputs through a mixture of materials that process and understand light.

The scientists combined an array of flat microlenses with a graded refractive index profile, and a flexible photodiode array with comb-shaped patterns, all wrapped on the 3D spherical structure. This configuration meant that light rays from multiple sources would always converge at the same spot on the image sensor, regardless of the refractive index of its surroundings.
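The payoff of the flat, graded-index design can be sketched with thin-lens math. A conventional curved lens focuses by surface refraction, so its focal length shifts when the surrounding medium changes from air to water; a flat graded-index (GRIN) element instead bends rays inside the material, so, to first order for paraxial rays, its focal length is set by the internal gradient alone. The sketch below is illustrative only - the numbers (lens index 1.52, radius 5 mm, GRIN parameters n0 and g) are assumptions, not values from the paper:

```python
import math

def plano_convex_focal_length(n_lens, n_medium, radius):
    """Thin-lens focal length of a plano-convex lens (lensmaker's
    equation) immersed in a medium of refractive index n_medium."""
    return radius / (n_lens / n_medium - 1.0)

# A conventional curved lens defocuses when moved from air to water,
# because its focusing power depends on the index contrast at the surface.
f_air = plano_convex_focal_length(1.52, 1.000, radius=5e-3)    # in air
f_water = plano_convex_focal_length(1.52, 1.333, radius=5e-3)  # in water
print(f"curved lens: f = {f_air*1e3:.1f} mm in air, {f_water*1e3:.1f} mm in water")

def grin_focal_length(n0, g, length):
    """Paraxial effective focal length of a GRIN rod lens with axial
    index n0, gradient constant g (1/m^2), and thickness `length`.
    The flat entrance face has no surface power, so this value does
    not depend (to first order) on the external medium."""
    return 1.0 / (n0 * math.sqrt(g) * math.sin(math.sqrt(g) * length))

f_grin = grin_focal_length(n0=1.6, g=1.0e4, length=2e-3)
print(f"GRIN lens:   f = {f_grin*1e3:.1f} mm, in air or underwater")
```

Running the sketch shows the curved lens's focal length stretching by roughly a factor of 3.7 between air and water, while the flat GRIN element's stays put - the same insensitivity the comb-patterned photodiode array exploits on the spherical sensor.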

A paper on this system, co-authored by Fredo Durand, an MIT professor of electrical engineering and computer science and affiliate of the Computer Science and Artificial Intelligence Laboratory (CSAIL), and 15 others, appears in the July issue of the journal Nature Electronics.

Both the amphibious and panoramic imaging capabilities were tested in in-air and in-water experiments by imaging five objects at different distances and in different directions, and the system delivered consistent image quality and an almost 360-degree field of view in both terrestrial and aquatic environments. Meaning: it could see both underwater and on land, where previous systems have been limited to a single domain.

There's more than meets the eye when it comes to fiddler crabs. Behind their massive claws exists a powerful, unique vision system that evolved from living both underwater and on land. The creatures' flat corneas, combined with a graded refractive index, counter defocusing effects arising from changes in the external environment - an overwhelming limit for other compound eyes.

The crabs also have a 3D omnidirectional field of view, from an ellipsoidal and stalk-eye structure. They've evolved to look at almost everything at once to avoid attacks on wide-open tidal flats, and to communicate and interact with mates.

To be sure, biomimetic cameras aren't new. In 2013, a wide field of view (FoV) camera that mimicked the compound eyes of an insect was reported in Nature, and in 2020, a wide FoV camera mimicking a fish eye emerged. While these cameras can capture large areas at once, it is structurally difficult for them to exceed a 180-degree view; more recently, commercial products with a 360-degree FoV have come into play.

These can be clunky, though, since they have to merge images taken from two or more cameras, and to enlarge the field of view, you need an optical system with a complex configuration, which causes image distortion. It's also challenging to sustain focusing capability when the surrounding environment changes, such as in air and underwater - hence the impetus to look to the calling crab.

The crab proved a worthy muse. During tests, five cutesy objects (a dolphin, airplane, submarine, fish, and ship) at different distances were projected onto the artificial vision system from different angles. The team performed multi-laser spot imaging experiments, and the artificial images matched the simulation. To go deep, they immersed the device halfway in water in a container.

A logical extension of the work includes looking at biologically inspired light-adaptation schemes in the quest for higher resolution and superior image-processing techniques.

"This is a spectacular piece of optical engineering and non-planar imaging, combining aspects of bio-inspired design and advanced flexible electronics to achieve unique capabilities unavailable in conventional cameras," says John A. Rogers, the Louis Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering, and Neurological Surgery at Northwestern University, who was not involved in the work. "Potential uses span from population surveillance to environmental monitoring."

This research was supported by the Institute for Basic Science, the National Research Foundation of Korea, and the GIST-MIT Research Collaboration grant funded by the GIST in 2022.

Research Report: "An amphibious artificial vision system with a panoramic visual field"


Related Links
Computer Science and Artificial Intelligence Laboratory (CSAIL)




