Using 2D cameras and advanced space robotics algorithms, aerospace engineers have developed a navigation system that can guide multiple satellites using only visual information. This innovative system was recently tested in space for the first time.
In the future, instead of relying on costly and large standalone satellites, groups of smaller satellites, often called a “swarm” by researchers, will collaborate to provide enhanced precision, agility, and independence in operations. Among the teams working towards this vision are scientists at Stanford University’s Space Rendezvous Lab, who have just completed the inaugural in-orbit trial of a prototype that can guide a swarm of satellites using exclusively visual data obtained through a wireless network.
“This is a groundbreaking achievement and represents the culmination of 11 years of work in my lab, which was established with the mission to advance the current capabilities in autonomous systems in space,” stated Simone D’Amico, an associate professor of aeronautics and astronautics and the lead author of the study. “Starling is the first ever demonstration of a fully autonomous swarm of satellites.”
This experiment, referred to as the Starling Formation-Flying Optical Experiment (StarFOX), successfully navigated four small satellites flying together, relying solely on visual data captured by onboard cameras to determine their trajectories. Research findings from the initial StarFOX test were shared with experts at the Small Satellite Conference in Logan, Utah.
Strength in numbers
D’Amico emphasized that the challenge has fueled his team’s endeavor for over a decade. “Since the lab’s beginning, we have promoted distributed space systems; now it’s widely recognized in the field. NASA, the Department of Defense, the U.S. Space Force—everyone has grasped the benefits of coordinating multiple assets to achieve goals that would be unfeasible or extremely challenging with a single spacecraft,” he explained. “The benefits include enhanced precision, broader coverage, flexibility, resilience, and the potential to explore new objectives yet to be imagined.”
A robust navigation system for the swarm poses significant technological hurdles. Current navigation methods depend on the Global Navigation Satellite System (GNSS), which necessitates frequent communication with systems on Earth. In deep space, the Deep Space Network exists, but it is relatively slow and difficult to scale for future projects. Additionally, neither option helps satellites avoid "non-cooperative objects," such as space debris that could disrupt their operations, as D'Amico pointed out.
The swarm requires a self-sufficient navigation system that permits a high level of autonomy and resilience. D’Amico noted that these systems are made more appealing by the minimal technical demands and reduced costs of today’s compact cameras and hardware. The cameras utilized in the StarFOX test are cost-effective, reliable 2D cameras known as star-trackers, commonly found on contemporary satellites.
“Essentially, navigation based on angles only requires no additional hardware, even when applied to small, inexpensive spacecraft,” D’Amico remarked. “Furthermore, sharing visual information among swarm members introduces a novel distributed optical navigation capability.”
Written in the stars
StarFOX operates by using visual measurements from individual cameras attached to each satellite in the swarm. Much like ancient mariners who used sextants to navigate the seas, this method involves using familiar stars in the background as references to calculate bearing angles towards the satellites. These angles are subsequently processed on board using precise physics-based force models to estimate the satellites’ positions and velocities in relation to a planet—Earth in this scenario, although it could extend to the Moon, Mars, or other celestial bodies.
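The core of this approach is converting a target's position in a star-tracker image into a bearing (line-of-sight) direction in an inertial frame, using the camera's attitude as fixed by the background stars. The sketch below illustrates that single geometric step under simplified assumptions (a pinhole camera model with the boresight along +z); the function and parameter names are hypothetical, not part of StarFOX.

```python
import numpy as np

def pixel_to_bearing(pixel_xy, focal_px, principal_point, R_cam_to_eci):
    """Illustrative sketch: turn a target's pixel coordinates in a
    star-tracker image into a unit line-of-sight (bearing) vector in an
    Earth-centered inertial frame.

    R_cam_to_eci is the camera attitude, which in practice comes from
    matching the background stars against a star catalog (the classical
    star-tracker attitude solution mentioned in the article).
    """
    u, v = pixel_xy
    cx, cy = principal_point
    # Line of sight in the camera frame (pinhole model, boresight along +z).
    los_cam = np.array([(u - cx) / focal_px, (v - cy) / focal_px, 1.0])
    los_cam /= np.linalg.norm(los_cam)
    # Rotate into the inertial frame using the star-derived attitude.
    return R_cam_to_eci @ los_cam

# Example (made-up numbers): a target imaged 50 px right of the boresight
# with a 3000 px focal length and an identity camera attitude.
los = pixel_to_bearing((1074.0, 1024.0), 3000.0, (1024.0, 1024.0), np.eye(3))
```

A sequence of such bearing vectors, taken over time, is what the onboard force models then turn into position and velocity estimates.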
This system utilizes the Space Rendezvous Lab's angles-only Absolute and Relative Trajectory Measurement System (shortened to ARTMS), which incorporates three novel space robotics algorithms. One algorithm focuses on image processing to detect and track multiple targets in the images, calculating their bearing angles—the directions from the observing camera to each target. The Batch Orbit Determination algorithm then computes a coarse orbit for each satellite from those angles. Lastly, the Sequential Orbit Determination algorithm refines the swarm's trajectory estimates as new images arrive over time, potentially feeding autonomous navigation, control, and collision avoidance systems on board.
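The three-stage flow described above can be sketched in miniature. This is only an illustration of the detect → batch-fit → sequentially-refine structure, not the actual ARTMS algorithms (which fit full orbital dynamics); here the batch fit is a simple mean and the sequential refinement is a scalar Kalman update, the simplest sequential estimator.

```python
import numpy as np

def detect_targets(image_blobs, star_blobs):
    """Stage 1 (image processing), simplified: any detected blob not
    matched to a catalog star is treated as a candidate target."""
    return [b for b in image_blobs if b not in star_blobs]

def batch_orbit_determination(measurements):
    """Stage 2, simplified: fit a coarse state to a batch of measurements.
    A mean stands in here for the real least-squares orbit fit."""
    return float(np.mean(measurements))

def sequential_update(estimate, variance, measurement, meas_var):
    """Stage 3, simplified: refine the estimate as each new measurement
    arrives (a scalar Kalman update)."""
    gain = variance / (variance + meas_var)
    estimate = estimate + gain * (measurement - estimate)
    variance = (1.0 - gain) * variance
    return estimate, variance

# Toy usage: coarse batch estimate, then refinement with new data.
targets = detect_targets([0.9, 1.1, 5.0], [5.0])   # 5.0 matches a "star"
coarse = batch_orbit_determination(targets)        # coarse fit from batch
est, var = sequential_update(coarse, 1.0, 1.05, 0.5)
```

The key design point mirrored here is that the batch stage only needs to be roughly right: the sequential stage shrinks the uncertainty with every new image.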
Communication among satellites occurs via an inter-satellite link (wireless network), allowing for the calculation of both absolute and relative positions and velocities with remarkable accuracy without dependence on GNSS. Under very challenging conditions, using just a single observer satellite, StarFOX managed to determine relative positions (the locations of each satellite in relation to one another) within 0.5% of their distance. With the inclusion of more observer satellites, the error rate impressively decreased to just 0.1%.
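To put those percentages in concrete terms, the arithmetic below assumes a 100 km inter-satellite separation (a made-up figure for illustration; the article does not state the actual separations) and converts the reported relative error rates into meters.

```python
# Back-of-the-envelope illustration of the reported accuracies: relative
# position error of 0.5% (single observer) or 0.1% (multiple observers)
# of the inter-satellite distance.
separation_m = 100_000.0  # assumed 100 km separation, not from the article

single_observer_err_m = 0.005 * separation_m  # 0.5% of distance
multi_observer_err_m = 0.001 * separation_m   # 0.1% of distance

print(f"Single observer: {single_observer_err_m:.0f} m")
print(f"Multiple observers: {multi_observer_err_m:.0f} m")
```

At that assumed separation, a single observer localizes its neighbors to within a few hundred meters, and adding observers tightens this by a factor of five.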
Due to the encouraging outcomes of the Starling test, NASA has expanded the initiative, now dubbed StarFOX+, through 2025, to further investigate these enhanced capabilities and pave the way for future technologies in space awareness and positioning.