For years, satellite sensors have transmitted vast quantities of imagery to the ground, where analysts then discarded views obscured by clouds, identified natural features or objects and shared pictures with customers. The entire process took days.
Recently, companies have cut the timeline to hours — or even less — through frequent contact with ground stations, intersatellite communications links and automated analysis. And now NASA experts are eyeing a reality in which satellites could spot an erupting volcano or the convective core of a powerful storm system and share the location in minutes.
NASA’s Jet Propulsion Laboratory took an important step toward that goal in mid-July by demonstrating Dynamic Targeting, a research experiment in automated Earth observation. For the first time, a commercial cubesat looked ahead along its orbital path, analyzed the imagery and determined where to point its instrument without human intervention.
“It’s a big deal because a lot of things have to come together to make it work,” Steve Chien, JPL principal investigator for Dynamic Targeting, said in an interview. “The difference between 10 years ago and now is that we now can do all of the pieces reliably and fast.”
Speedy insights
Researchers at JPL have worked on Dynamic Targeting for more than a decade because it promises smarter satellites. Rather than have satellites gather pictures of everything they pass over or carry out a series of assigned tasks, researchers want spacecraft to respond to their surroundings the way a person would.
“If you’re sitting in a car looking out the window and you see something interesting, you stare at it,” Chien said. “That’s exactly what we’re trying to do with Dynamic Targeting.”
Making satellites operate that way isn’t easy.
The Dynamic Targeting experiment pairs a briefcase-size cubesat from U.K. startup Open Cosmos with JPL’s machine-learning algorithm and a commercial AI processor from Irish startup Ubotica Technologies.
“The focus is very much on latency, using AI to extract value from images directly onboard and get that insight down to the end user as quickly as possible,” Aubrey Dunne, Ubotica co-founder and chief technology officer, said at the SmallSat 2025 Conference in Salt Lake City in August.
In ongoing tests, Open Cosmos’ CogniSat-6 turns its hyperspectral sensor forward to scan the horizon for clouds. The onboard processor analyzes the imagery and then tells the satellite how to turn the sensor to acquire the best cloud-free images. The whole operation takes less than 90 seconds.
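In rough terms, the loop CogniSat-6 runs might be sketched as follows. Everything in this sketch, from the class names to the cloud threshold, is an illustrative stand-in for the actual JPL and Ubotica software, which the article does not detail:

```python
import time

LOOP_BUDGET_S = 90  # the article reports the full scan-analyze-point cycle runs in under 90 seconds

class StubSensor:
    """Stand-in for CogniSat-6's hyperspectral instrument; this API is hypothetical."""

    def scan_ahead(self):
        # Forward-looking preview along the orbital path. Here we fake a cloud
        # fraction per candidate ground cell instead of real hyperspectral data.
        return [0.9, 0.1, 0.7, 0.05]

    def slew_and_capture(self, cell):
        # Point the single sensor down and to the side, then image the cell.
        print(f"imaging clear cell {cell}")

class StubProcessor:
    """Stand-in for the onboard ML cloud classifier on the edge processor."""

    CLOUD_THRESHOLD = 0.2  # assumed cutoff; the real system uses a trained model

    def select_clear_cells(self, preview):
        return [i for i, frac in enumerate(preview) if frac < self.CLOUD_THRESHOLD]

def dynamic_targeting_cycle(sensor, processor):
    start = time.monotonic()
    preview = sensor.scan_ahead()                    # 1. look ahead
    targets = processor.select_clear_cells(preview)  # 2. analyze imagery onboard
    for cell in targets:                             # 3. retarget and image
        sensor.slew_and_capture(cell)
    assert time.monotonic() - start < LOOP_BUDGET_S  # article: whole operation < 90 s

dynamic_targeting_cycle(StubSensor(), StubProcessor())
```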
Seconds count
Mars rovers perform similar feats, surveying their surroundings and relying on JPL algorithms to pick the best targets. That’s harder to do in a 500-kilometer low Earth orbit where a satellite races over the ground at about 7.5 kilometers a second. The Dynamic Targeting algorithm, trained to spot clouds, must also take into account Earth’s rotation and curvature.
“Then you have about 50 seconds to interpret the data and figure out if you’re going to take an image, how you’re going to point your instrument,” Chien said.
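Putting the article's numbers together gives a feel for the geometry: at roughly 7.5 kilometers per second of ground speed, a 50-second decision window corresponds to a strip of ground about 375 kilometers long, which in turn sets how far forward the sensor must look. A back-of-the-envelope check, in which the flat-Earth tilt estimate is a simplification for illustration:

```python
import math

GROUND_SPEED_KM_S = 7.5  # approximate ground-track speed in a 500 km orbit (from the article)
DECISION_WINDOW_S = 50   # time to interpret data and choose pointing (from the article)
ALTITUDE_KM = 500

# Ground distance the satellite covers while the onboard software decides.
decision_strip_km = GROUND_SPEED_KM_S * DECISION_WINDOW_S
print(f"ground covered during the decision window: {decision_strip_km:.0f} km")  # ~375 km

# For that window to exist, the sensor must preview terrain at least that far
# ahead. Minimum forward tilt from straight down, in a flat-Earth approximation
# (the real algorithm must also correct for Earth's rotation and curvature):
tilt_deg = math.degrees(math.atan2(decision_strip_km, ALTITUDE_KM))
print(f"minimum look-ahead tilt: {tilt_deg:.0f} degrees")  # ~37 degrees
```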
JPL’s initial concept for Dynamic Targeting called for a satellite with two sensors: one to look ahead and another pointing down. But researchers couldn’t find a satellite with a dedicated look-ahead sensor. So the CogniSat-6 cubesat points its camera ahead to look for clouds, then turns it down and to the left or right to get the best view.
“That’s not ideal, but we’re just trying to demonstrate the technology,” Chien said.
In the future, two or more satellites could share the location of targets of interest. For instance, a satellite with a wide-field-of-view sensor that detects smoke could send instructions to another spacecraft to gather imagery with a higher-resolution instrument.
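Such a cue could be as simple as a small structured message passed over an intersatellite link. A hypothetical sketch, with fields that are assumptions rather than any real interface:

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class TargetCue:
    """Hypothetical tip-and-cue message from a wide-field-of-view satellite to a
    higher-resolution one. The fields are illustrative, not an actual standard."""
    lat_deg: float
    lon_deg: float
    phenomenon: str     # e.g. "smoke", "eruption", "convective core"
    confidence: float   # detector confidence, 0 to 1
    detected_at_utc: str

# A wide-field satellite spots smoke and cues a partner spacecraft.
cue = TargetCue(lat_deg=34.20, lon_deg=-118.17, phenomenon="smoke",
                confidence=0.87, detected_at_utc="2025-07-15T18:04:00Z")

# Serialize for the intersatellite link; the receiver would schedule a
# high-resolution collection over (lat_deg, lon_deg).
packet = json.dumps(asdict(cue)).encode("utf-8")
print(packet)
```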
Beyond clouds
Clouds were the initial target for JPL’s Dynamic Targeting algorithm because they obscure roughly two-thirds of the surface at Earth’s mid-latitudes, making them a significant problem for optical sensors. Human analysis conducted after the recent tests confirmed that the CogniSat-6 sensor succeeded in gathering cloud-free imagery.
“CogniSat-6 is the first of a whole bunch of missions that we hope would use this technology,” Chien said. “Commercial entities are very interested in this. And a number of NASA science mission concepts could benefit from an agile instrument that is actively choosing targets.”
Dunne calls Dynamic Targeting “a paradigm shift in efficiency.” With traditional Earth-observation sensors, only a small fraction of the pixels acquired and downlinked proves useful.
“This is about trying to get more valuable data down,” he said in an interview.
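The article's figures make the efficiency argument easy to quantify. A back-of-the-envelope comparison, in which the onboard screening accuracy is an assumed value purely for illustration:

```python
CLEAR_FRACTION = 1 / 3   # clouds obscure ~two-thirds of mid-latitude Earth (from the article)
SCREEN_ACCURACY = 0.9    # assumed rate at which onboard screening truly finds clear ground

blind_useful = CLEAR_FRACTION      # imaging blindly, ~1 in 3 optical scenes is usable
targeted_useful = SCREEN_ACCURACY  # imaging only cells screened as clear

print(f"useful scenes, blind collection:  {blind_useful:.0%}")       # ~33%
print(f"useful scenes, dynamic targeting: {targeted_useful:.0%}")    # 90% (assumed)
print(f"improvement factor: {targeted_useful / blind_useful:.1f}x")  # ~2.7x
```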
Look-ahead sensors could detect severe storms and volcanic eruptions as they happen. The next challenge is rapidly sharing that information with people on the ground.
It’s a problem industry is racing to solve.
“The beautiful thing about people wanting internet access from space is that companies are providing more downlink capabilities, both in terms of data volume and reducing the latency,” Chien said. “You just have to get your data to their network.”
Defining autonomy
Ask what satellite autonomy means, and the short answer is that it means different things to different people.
Widespread consensus at the SmallSat 2025 Conference was that satellites should be equipped to handle routine maintenance and communications without help from ground controllers.
“There’s no reason to do a normal care and feeding contact in a nonautomatic fashion,” said Col. Owen Stephens, contracting director at the U.S. Space Force’s Space Rapid Capabilities Office. “You shouldn’t need a human to do that. In fact, the machine will do it better than a human could, because humans will screw up their command entry.”
Many speakers also called for sensors paired with machine-learning algorithms running on edge processors to handle complex tasks like rendezvous and proximity operations or automated collision avoidance.
“In a situation like that, autonomy could be very valuable,” said Benjamin Bahney, Lawrence Livermore National Laboratory space program leader.
Similarly, remote-sensing operations near the moon or Mars will require extensive autonomy because of limited communications bandwidth.
“Communicating in those regimes is very challenging, so there’s tremendous benefit if you can operate autonomously,” Bahney said.
This article first appeared in the September 2025 issue of SpaceNews Magazine with the title “When images are available in minutes.”