US drone team finds, identifies and follows targets

Two US companies have partnered to demonstrate a new type of drone intelligence collection: multiple drones working together with different types of sensors to find, identify and follow a target with minimal human supervision.

What makes this impressive is that it is not a laboratory experiment with specially built equipment but hardware already in service with US forces. Effective and capable, the software requires minimal processing power. The capability is just an update away – and will become more powerful as new drones and sensors are added to the mix.

“It’s an exciting milestone on the way to great things,” Matt Vogt of Palladyne told me.

Hardware meets software

The demonstration involved Palladyne’s Pilot AI software running on multiple Teal drones made by Red Cat, performing what the makers term “multi-platform, multi-sensor real-time data fusion”. What this means is shifting a large amount of the workload from the operator to the machines, a necessity when one person is operating several drones.

“It is absolutely necessary to reduce the cognitive load on a single operator using a six-inch screen with nine drones,” says Red Cat’s Geoff Hitchcock. “The AI has to help with that.”

The demonstration involved drones working together to cooperatively find and follow a vehicle on the ground. They continued to track it even when it was out of camera view, following it with sensors that detected its radio signal.

This feat required advances on several fronts.

One is the “multi-sensor” aspect. Drones carry different types of sensors, which can include video cameras, thermal imagers, radio-frequency detectors and radar. The software stitches these together to give a higher-fidelity picture than any single sensor can provide. “Data fusion” means combining data so that, for example, thermal and visual imagery are merged to give a positive identification of a specific type of vehicle.
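
To make the idea concrete, here is a minimal sketch of one common way to fuse independent sensor confidences into a single identification score. This is purely illustrative – a naive-Bayes-style product rule, not Palladyne’s actual algorithm – and the sensor names and probabilities are invented.

```python
# Illustrative sketch only: fusing per-sensor detection confidences for one track.
# Assumes sensors are independent; combines them with a naive-Bayes product rule.

def fuse_confidences(detections: dict[str, float]) -> float:
    """detections maps sensor name -> probability the object is the target."""
    p_yes = 1.0
    p_no = 1.0
    for sensor, p in detections.items():
        p_yes *= p          # evidence that it IS the target
        p_no *= (1.0 - p)   # evidence that it is NOT
    return p_yes / (p_yes + p_no)  # normalized fused probability

# Two moderately confident sensors yield a more confident fused result:
fused = fuse_confidences({"visual": 0.7, "thermal": 0.8})  # ~0.90
```

The point of the sketch is that two imperfect sensors agreeing push the fused confidence above either one alone, which is why combining thermal and visual data can yield a positive identification neither could provide by itself.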

“Multi-platform” means the software operates across multiple drones at the same time, so data from all of them is combined. Pilot AI can interface with a drone’s autopilot to maneuver it into position and ensure it keeps the target in view, or take over eyes on a target spotted by another drone. Multiple video images from different angles also make identification more reliable.
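
A toy version of the “take over eyes on a target” decision might look like the following. This is a hypothetical sketch, not the actual handoff logic: it simply picks the drone best placed to keep the target in view, using invented names and a flat 2D geometry.

```python
# Hypothetical sketch: choosing which drone should keep eyes on a shared track.
import math

def best_tracker(drones, target):
    """Pick the drone closest to the target that has it within camera range.

    drones: list of (name, x, y, camera_range) tuples; target: (x, y).
    Returns the chosen drone's name, or None if no drone can see the target.
    """
    tx, ty = target
    candidates = [
        (math.hypot(x - tx, y - ty), name)
        for name, x, y, rng in drones
        if math.hypot(x - tx, y - ty) <= rng
    ]
    return min(candidates)[1] if candidates else None
```

In a real system this choice would weigh sensor type, battery, occlusion and tasking, but the sketch shows the basic idea of the swarm reassigning the tracking job rather than the operator doing it.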

The human operator then does not fly individual drones or scan numerous feeds to find objects of interest. They see data fused from all the drone sensors, and the software is able to recognize and highlight specific objects. For example, the system can tell the operator that a truck with a heavy machine gun is on the move, and flag it. The operator’s role is to decide how to act on that information.

“The magic of the software is that it autonomously enables improved situational awareness,” Vogt explains.

Communication by negotiation

Another key aspect of Pilot AI is that it uses minimal bandwidth. In standard operations, each drone streams video back to the operator. In Iraq and Afghanistan this resulted in tens of thousands of hours of high-resolution imagery going into archives without ever being examined, as there was simply too much for human analysts to handle.

With Pilot AI, the drones pass only the data that is needed, a technique refined over an extended period.

“Our CTO is really passionate about our Pilot AI software,” says Vogt. “He has worked on the algorithms for many years, and they are based on reinforcement learning, sensor fusion and game theory. If a drone needs information to help it solve the problem, it reaches out to the others and requests them to send information.”

This allows each of the drones to get the data it needs to build a full picture, without flooding the airwaves with excess or useless data, such as video streams of empty landscapes.
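
The “pull, don’t stream” pattern described above can be sketched in a few lines. This is an illustrative model only – the class, track IDs and observation fields are all invented – showing a drone requesting a compact observation from peers instead of every drone broadcasting full video.

```python
# Illustrative sketch: request-based data sharing between drones.
# Peers reply only when they actually hold data on the requested track,
# so the network carries compact observations rather than raw video.

class Drone:
    def __init__(self, name, observations):
        self.name = name
        self.observations = observations  # track_id -> compact observation dict
        self.peers = []

    def request(self, track_id):
        """Ask peers for any observation of track_id; collect compact replies."""
        replies = {}
        for peer in self.peers:
            obs = peer.observations.get(track_id)
            if obs is not None:  # silent peers send nothing at all
                replies[peer.name] = obs
        return replies

# Usage with invented data: "alpha" has lost the track, "bravo" still holds
# a radio-frequency bearing on it.
alpha = Drone("alpha", {})
bravo = Drone("bravo", {"track_42": {"rf_bearing_deg": 217}})
alpha.peers = [bravo]
replies = alpha.request("track_42")
```

A drone with nothing to report transmits nothing, which is how this style of protocol avoids the firehose of unwatched video the article describes.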

Vogt says the communication process has reinforcement learning built in, so it gradually becomes more efficient as the system learns to handle each environment and adapts to conditions such as weather and terrain.
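The flavor of such learning can be sketched with the simplest reinforcement-learning device, an epsilon-greedy bandit. This is an assumption-laden illustration – the strategy names and reward bookkeeping are invented, and Palladyne’s actual method is certainly more sophisticated – but it shows how a system can converge on the transmission strategy that pays off best in a given environment.

```python
# Illustrative sketch: epsilon-greedy choice among communication strategies.
# rewards/counts hold per-strategy running totals (invented bookkeeping).
import random

def pick_strategy(rewards, counts, epsilon=0.1, rng=random):
    """Mostly exploit the best average reward; occasionally explore."""
    if rng.random() < epsilon:
        return rng.choice(list(rewards))           # explore at random
    return max(rewards, key=lambda s: rewards[s] / max(counts[s], 1))

def update(rewards, counts, strategy, reward):
    """Record the observed payoff (e.g. tracking quality per byte sent)."""
    rewards[strategy] += reward
    counts[strategy] += 1
```

Over many updates the system would drift toward, say, sending compact summaries rather than full video whenever conditions reward that choice.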

Minimal processing demands

The demonstration was carried out on Red Cat’s second-generation Teal drones. What makes it really impressive is that no additional hardware was involved. Pilot AI ran on the existing drone hardware without interfering with other functions. This is because the software is optimized to run on minimal processing power.

“The onboard compute requirement is very light compared to others, due to the language format,” says Hitchcock. “It takes about a thousand times less computing power than comparable systems.”

Most importantly, Pilot AI is platform-agnostic, which means it should be able to run on almost any drone. Palladyne is now working to bring it to Red Cat’s latest generation, the Arachnid family, and especially the Black Widow scout drone. But it could work on many other platforms, which might include medium and large surveillance drones, and even Reapers equipped with radar and other larger platforms. The system could also take in other assets such as the drone-tracking cameras that watch roads in Ukraine. By tackling the hardest challenge first – running on small drones – the developers should have made the rest of the process easy.

This would enable a common operating picture, in which many drones find, locate, identify and track many targets at the same time in real time, giving commanders an unmatched view of the battlefield.

Both Hitchcock and Vogt were US military drone operators in previous lives, working with far more basic systems. In those days drones were dumb, little more than radio-controlled planes. Hitchcock remembers launching Pointer drones – literally with giant rubber bands, from the back of a moving truck – in the early 2000s, and the contrast with today’s drones.

“It’s really funny how smart the technology is getting now,” says Hitchcock. “There are things I never thought I would see in my lifetime.”

The drones, and the software that enables them to be much smarter, all exist. The next stage is bringing the pieces together and making the military aware of the current state of the art, so they can decide how best to make use of it.

“We will finish integration with the Black Widow soon,” says Vogt. “After that we will be able to go out on the road and demonstrate the full initial capability for our customers.”
