

Qi Adds Vol Expertise to Earth-wide Monitoring Endeavor

A toolkit for creating a real-time 3D image of the entire Earth should include a few good satellites with high-resolution remote imaging capabilities. Hollywood might have us believe these are a dime a dozen. For monitoring and recognizing both man-made and natural changes on a global scale, though, it turns out that viable options are limited, and choosing which to use for the best results can be tricky.

Hairong Qi, Gonzalez Family Professor in EECS, has navigated this challenge over the last three years while contributing her expertise in remote sensing techniques to world-mapping projects supported by the Intelligence Advanced Research Projects Activity (IARPA).

“My area is computer vision and machine learning,” said Qi. “In my case, remote sensing becomes one of the application areas where I could apply machine learning and computer vision algorithms.”

Because the Earth’s surface spans such diverse biomes, a trained model is difficult to generalize: the training data available for these algorithms represent only a small and far-from-holistic slice of that surface. The IARPA projects seek more advanced machine learning and artificial intelligence approaches to overcome this shortage of representative data.

The work spans two contracts. CORE 3D (Creation of Operationally Realistic 3D Environment), which ran from 2017 to 2020, developed a methodology for creating accurate, realistic 3D models for the situational awareness essential to military, humanitarian, and intelligence mission planning.

The more recent contract, SMART (Space-Based Machine Automated Recognition Technique), builds on CORE 3D to automate the analysis of broad-area satellite imagery, detecting and assessing changes on the Earth’s surface and providing timely discovery and robust monitoring of both human-made and natural changes.

Qi’s area of specialty is key to meeting the challenge of capturing such a holistic picture of changes over time, enabling real-time situational awareness fueled by advanced computer vision and machine learning.

“Analyzing the entire Earth surface and providing real-time assessment in just a few hours is unthinkable,” she said. “This would require your models to be light without sacrificing accuracy.”

The other big challenge is the system’s capacity for generalization. A model that recognizes the architecture of North America will not recognize the materials and building designs of regions like South America or Asia, and a model that recognizes a material in summer might not work for the same material in winter. Beyond the mapping of relatively static architecture that was the focus of CORE 3D, the ever-changing nature of planet-scale events adds to the challenge of detecting surface changes over time, the goal of the SMART phase.

“We ran into a lot of false positives,” said Qi. “You have a lot of interference from the atmosphere, from clouds, from different weather conditions, and different seasons. All of those cause problems to your detection.”
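To illustrate how that kind of interference might be filtered out before change detection, here is a minimal Python sketch. The arrays, function name, and masking rule are illustrative assumptions, not the project's actual pipeline: it assumes two hypothetical co-registered multispectral images of the same area plus a per-date cloud mask, and simply refuses to score any pixel that is obscured on either date.

```python
import numpy as np

def change_score(before, after, clouds_before, clouds_after):
    """Per-pixel spectral change score between two co-registered images.

    before, after: float arrays of shape (bands, height, width)
    clouds_before, clouds_after: boolean (height, width) masks, True where
        clouds or other interference were detected on that date.
    """
    # Euclidean distance between the spectral vectors at each pixel.
    score = np.sqrt(((after - before) ** 2).sum(axis=0))
    # Drop any pixel that is unusable on either date so clouds and seasonal
    # artifacts cannot register as spurious "changes."
    score[clouds_before | clouds_after] = np.nan
    return score
```

Excluding masked pixels rather than comparing them is one simple way to keep atmospheric artifacts from showing up as false positives.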

That’s why multiple types of satellite imagery come into play, allowing the data to be cross-checked across multiple spectral bands. Each type of imaging device offers a different balance of spectral, spatial, and temporal resolution.

“We use three kinds: Landsat, Sentinel, and Worldview,” said Qi. “With Landsat, we have more frequent access to the images: on a daily basis. But because the temporal resolution is very high, the spatial resolution is very poor. Each pixel is around 30 meters of spatial resolution. Sentinel, on the other hand, has a spatial resolution of roughly 10 meters, but we only have a new image every two weeks. Worldview has the best spatial resolution, which is one meter—very accurate. But unfortunately, we have just a handful of those images.”
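Comparing sources with such different pixel sizes means first putting them on a common grid. The sketch below, assuming single-band arrays and a nearest-neighbour scheme (both illustrative choices, not the teams' actual resampling method), aligns a coarse 30-meter grid with a finer 10-meter one:

```python
import numpy as np

def upsample_nearest(coarse, factor):
    """Nearest-neighbour upsampling of a single-band image.

    With factor=3, a 30 m-per-pixel grid is replicated onto a 10 m grid so
    that coarse and fine sources can be compared pixel for pixel.
    """
    return np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)

# Hypothetical 100 x 100 coarse tile (30 m pixels) aligned to a 10 m grid.
coarse_tile = np.random.rand(100, 100)
fine_aligned = upsample_nearest(coarse_tile, factor=3)
assert fine_aligned.shape == (300, 300)
```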

Project partners, research teams at multiple universities coordinated by the international company Accenture, drill down from Landsat’s coarser-resolution coverage of the Earth, running a quick broad-area screening to mark potential areas that might undergo changes of interest.
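A rough idea of what such a broad-area screening step could look like: the hypothetical sketch below splits a coarse change-score map into tiles and flags those whose average score crosses a threshold. The tile size and threshold are arbitrary assumptions, not values from the project.

```python
import numpy as np

def flag_candidate_tiles(score_map, tile_size=64, threshold=0.2):
    """Flag tiles of a broad-area change-score map for closer inspection.

    score_map: 2D array of per-pixel change scores from coarse imagery,
        with NaN where no usable observation exists.
    Returns (row, col) offsets of tiles whose mean score exceeds the
    threshold; finer-resolution imagery or heavier models would then be
    applied only to these areas.
    """
    candidates = []
    height, width = score_map.shape
    for row in range(0, height, tile_size):
        for col in range(0, width, tile_size):
            tile = score_map[row:row + tile_size, col:col + tile_size]
            if np.nanmean(tile) > threshold:
                candidates.append((row, col))
    return candidates
```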

“We come in and specify if this is a true change, and if it indeed is a change, at which phase is the change,” said Qi. “Our processing algorithm has already been integrated into Accenture’s pipeline.”
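One way to picture the “which phase” decision is as a simple labeling of each flagged site’s score history. The sketch below is purely illustrative, with made-up thresholds and labels rather than the integrated algorithm Qi describes:

```python
import numpy as np

def label_phase(score_series, onset=0.2, completion=0.8):
    """Assign a rough phase label to one flagged site.

    score_series: 1D array of change scores ordered by observation date,
        where 0 means no detectable change and 1 means a fully realized one.
    """
    peak = np.nanmax(score_series)
    if peak < onset:
        return "false positive"   # never rose above the noise floor
    if peak < completion:
        return "in progress"
    return "completed"
```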

Qi thanks Paul Montgomery, director for government-industry partnerships in UT’s Office of Research, Innovation, and Economic Development, for reaching out and helping her connect to the project. These kinds of projects have given her graduate students great opportunities to work with world-renowned experts in computer vision, remote sensing, and machine learning.

“I’m proud of our graduate students: Maofeng Tang, Fanqi Wang, Razieh Baghbaderani, Konstantinos Georgiou, and post-doc researcher, Weisheng Tang,” said Qi. “I have never come across a problem that they cannot solve! They are well trained to handle these real-world problems, and then be able to publish their work at top conferences.”

Projects like these give Engineering Vols a chance to see their contributions in the big picture of international collaboration—a satellite view for the Big Orange experience.