Butterfly Vision: Revolutionary Optical Innovations Transform Camera Technology

Butterflies possess a remarkable visual capability that allows them to see more of the spectrum than humans can, including more colors and the polarization of light. This unique vision helps them navigate accurately, gather food, and communicate with each other. Other creatures, such as mantis shrimp, can detect an even broader range of light, including the circular polarization of light waves. They utilize this ability to create a ‘love code’ to find and attract mates.

Motivated by these extraordinary abilities observed in nature, a group of researchers from the Penn State College of Engineering created a thin optical component called a metasurface, which can be attached to a standard camera to capture both spectral and polarization information through tiny, antenna-like structures that tune the properties of light. The team also developed a machine learning system that decodes this multidimensional visual data in real time on a typical laptop.

The team shared their findings today (Sept. 4) in Science Advances.

“The animal world demonstrates that aspects of light we cannot see contain valuable information for various applications,” said Xingjie Ni, an associate professor of electrical engineering and the paper’s lead author. “We essentially converted a regular camera into a compact, lightweight hyperspectro-polarimetric camera by integrating our metasurface.”

Unlike traditional hyperspectral or polarimetric cameras, which tend to be large and costly and capture only one type of data at a time, the metasurface measures just three millimeters on each side, is inexpensive to produce, and captures spectral and polarization imaging data simultaneously when placed between the camera's lens and sensor.

The collected images must be decoded to reveal their spectral and polarization details. For that task, Bofeng Liu, a doctoral student in electrical engineering and co-author of the study, developed a machine learning framework trained on 1.8 million images with the help of data augmentation techniques.
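The paper's decoding pipeline is not reproduced here, but a framework of this general kind can be sketched in PyTorch. The architecture, channel counts, and augmentation choices below are illustrative assumptions rather than the authors' implementation: a small convolutional network maps a metasurface-encoded camera frame to per-pixel spectral and polarization channels, with a joint random flip and added sensor noise standing in for the data augmentation Liu describes.

```python
# Illustrative sketch only: the architecture, channel counts, and augmentations
# are assumptions, not the published model.
import torch
import torch.nn as nn

N_SPECTRAL = 16   # assumed number of recovered spectral bands
N_STOKES = 4      # assumed polarization channels (Stokes parameters)

class Decoder(nn.Module):
    """Maps a single-channel, metasurface-encoded frame to spectral + polarization maps."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, N_SPECTRAL + N_STOKES, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def augment(frame, target):
    """Data augmentation stand-in: joint random horizontal flip plus sensor noise on the input."""
    if torch.rand(1).item() < 0.5:
        frame = torch.flip(frame, dims=[-1])
        target = torch.flip(target, dims=[-1])
    return frame + 0.01 * torch.randn_like(frame), target

# Minimal training step on synthetic tensors standing in for the 1.8 million-image set.
model = Decoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

frames = torch.rand(8, 1, 64, 64)                       # encoded camera frames
targets = torch.rand(8, N_SPECTRAL + N_STOKES, 64, 64)  # ground-truth spectra + polarization

x, y = augment(frames, targets)
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```

In a real system, the spectral and polarization outputs would be supervised with calibrated ground-truth measurements rather than the random tensors used here.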

“Working at 28 frames per second, mainly limited by our camera’s speed, we can quickly recover both spectral and polarization information through our neural network,” Liu noted. “This capability allows for real-time capture and viewing of image data.”
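A real-time loop of the kind Liu describes might look like the following sketch, which reads frames from an ordinary camera and runs a decoder on each one. The camera index, frame handling, and stand-in model are assumptions for illustration; the 28 frames-per-second figure above is a property of the authors' camera, not of this code.

```python
# Real-time decoding loop sketch. The camera index, preprocessing, and model are
# placeholders; only the idea of per-frame inference comes from the article.
import time
import cv2
import torch
import torch.nn as nn

model = nn.Conv2d(1, 20, 3, padding=1)  # stand-in for the trained decoding network
model.eval()

cap = cv2.VideoCapture(0)  # assumed camera index
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        x = torch.from_numpy(gray).float().unsqueeze(0).unsqueeze(0) / 255.0
        start = time.perf_counter()
        with torch.no_grad():
            cube = model(x)  # recovered spectral + polarization channels
        fps = 1.0 / (time.perf_counter() - start)
        print(f"decoded {tuple(cube.shape[1:])} channels, {fps:.1f} fps (inference only)")
except KeyboardInterrupt:
    pass
finally:
    cap.release()
```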

The researchers evaluated the metasurface and neural network by recording video of transparent “PSU” letters under various laser illuminations. They also imaged the striking scarab beetle, which is known for reflecting circularly polarized light that other members of its species can see.

Ready access to hyperspectro-polarimetric information about everyday objects could benefit consumers if the technology becomes commercially available, Ni said.

“Imagine taking our camera to the grocery store to snap pictures and evaluate the freshness of fruits and vegetables before purchasing,” Ni explained. “This advanced camera gives us insight into a hidden world.”

Furthermore, in biomedical fields, hyperspectro-polarimetric data could be instrumental in distinguishing the material and structural properties of tissues in the body, potentially assisting in the identification of cancerous cells.

This research builds on Ni’s earlier work developing metasurfaces, including one that emulates the processing capabilities of the human eye, and metalenses, such as one designed to image distant subjects like the moon.

In addition to Ni and Liu, other co-authors include Zhiwen Liu, co-corresponding author and electrical engineering professor at Penn State, as well as Hyun-Ju Ahn, a postdoctoral researcher, and graduate students Lidan Zhang, Chen Zhou, Yiming Ding, Shengyuan Chang, Yao Duan, Md Tarek Rathman, Tunan Xia, and Xi Chen, all from electrical engineering.

The research was supported by the U.S. National Science Foundation and the National Eye Institute of the U.S. National Institutes of Health.