Stanford computer scientists have developed an artificial intelligence system that enables robotic helicopters to teach themselves to fly difficult stunts by watching other helicopters perform the same maneuvers.
The result is an autonomous helicopter that can perform a complete airshow of complex tricks on its own.
The stunts are "by far the most difficult aerobatic maneuvers flown by any computer controlled helicopter," said Andrew Ng, the professor directing the research of graduate students Pieter Abbeel, Adam Coates, Timothy Hunter and Morgan Quigley.
The dazzling airshow is an important demonstration of "apprenticeship learning," in which robots learn by observing an expert, rather than by having software engineers peck away at their keyboards in an attempt to write instructions from scratch.
Stanford's artificial intelligence system learned how to fly by "watching" the four-foot-long helicopters flown by expert radio control pilot Garett Oku. "Garett can pick up any helicopter, even ones he's never seen, and go fly amazing aerobatics. So the question for us is always, why can't computers do things like this?" Coates said.
Computers can, it turns out. On a recent morning in an empty field at the edge of campus, Abbeel and Coates sent up one of their helicopters to demonstrate autonomous flight. The aircraft, brightly painted Stanford red, is an off-the-shelf radio control helicopter, with instrumentation added by the researchers.
For five minutes, the chopper, on its own, ran through a dizzying series of stunts beyond the capabilities of a full-scale piloted helicopter and other autonomous remote control helicopters. The artificial-intelligence helicopter performed a smorgasbord of difficult maneuvers: traveling flips, rolls, loops with pirouettes, stall-turns with pirouettes, a knife-edge, an Immelmann, a slapper, an inverted tail slide and a hurricane, described as a "fast backward funnel."
The pièce de résistance may have been the "tic toc," in which the helicopter, while pointed straight up, hovers with a side-to-side motion as if it were the pendulum of an upside down clock.
"I think the range of maneuvers they can do is by far the largest" in the autonomous helicopter field, said Eric Feron, a Georgia Tech aeronautics and astronautics professor who worked on autonomous helicopters while at MIT. "But what's more impressive is the technology that underlies this work. In a way, the machine teaches itself how to do this by watching an expert pilot fly. This is amazing."
Writing software for robotic helicopters is a daunting task, in part because the craft itself, unlike an airplane, is inherently unstable. "The helicopter doesn't want to fly. It always wants to just tip over and crash," said Oku, the pilot.
To scientists, a helicopter in flight is an "unstable system" that comes unglued without constant input. Abbeel compares flying a helicopter to balancing a long pole in the palm of your hand: "If you don't provide feedback, it will crash."
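The pole analogy can be made concrete in a few lines of code. The toy simulation below is an invented model, not the dynamics or controller of the Stanford helicopters: it shows a system whose small errors grow on their own, so that left alone it tips over, while a simple feedback rule that pushes back against the measured error keeps it upright.

```python
# Toy illustration of why an unstable system needs feedback.
# The model and gains below are invented for illustration only;
# they are not the dynamics or the controller used on the Stanford helicopters.

DT = 0.05            # simulation time step (seconds)
INSTABILITY = 1.5    # positive value: small tip angles grow on their own

def step(angle, rate, control):
    """Advance a crude unstable 'tipping' model by one time step."""
    accel = INSTABILITY * angle + control   # tipping accelerates the tip
    rate += accel * DT
    angle += rate * DT
    return angle, rate

def simulate(feedback, steps=200):
    angle, rate = 0.05, 0.0                 # small initial disturbance
    for _ in range(steps):
        control = -8.0 * angle - 4.0 * rate if feedback else 0.0
        angle, rate = step(angle, rate, control)
    return angle

print("no feedback, final angle:   %.2f" % simulate(False))   # blows up
print("with feedback, final angle: %.5f" % simulate(True))    # stays near zero
```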
Early in their research, Abbeel and Coates attempted to write computer code that would specify the commands for the desired trajectory of a helicopter flying a specific maneuver. While this hand-coded approach succeeded with novice-level flips and rolls, it flopped with the complex tic toc.
It might seem that an autonomous helicopter could fly stunts by simply replaying the exact finger movements of an expert pilot using the joysticks on the helicopter's remote controller. That approach, however, is doomed to failure because of uncontrollable variables such as gusting winds.
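The same kind of toy model illustrates why blind replay fails. In the sketch below, again an invented model rather than the team's software, the same recorded "hover" commands are replayed while random gusts nudge the craft: without feedback the errors compound, while a closed-loop controller that reacts to the measured state stays on track.

```python
import random

# Toy illustration of why replaying a pilot's recorded commands fails.
# The "gust" disturbance and the model are invented for illustration only.

DT = 0.05
INSTABILITY = 1.5

def fly(commands, closed_loop):
    random.seed(0)                                    # same gusts for both runs
    angle, rate, worst = 0.0, 0.0, 0.0
    for recorded in commands:
        gust = random.gauss(0.0, 0.3)                 # unpredictable wind
        cmd = -8.0 * angle - 4.0 * rate if closed_loop else recorded
        accel = INSTABILITY * angle + cmd + gust
        rate += accel * DT
        angle += rate * DT
        worst = max(worst, abs(angle))
    return worst

recorded_commands = [0.0] * 200                       # recorded "hover" command log
print("open-loop replay, worst error:    %.2f" % fly(recorded_commands, False))
print("closed-loop control, worst error: %.4f" % fly(recorded_commands, True))
```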
When the Stanford researchers decided their autonomous helicopter should be capable of flying airshow stunts, they realized that even defining their goal was difficult. What's the formal specification for "flying well"? The answer, it turned out, was that "flying well" is whatever an expert radio control pilot does at an airshow.
So the researchers had Oku and other pilots fly entire airshow routines while every movement of the helicopter was recorded. As Oku repeated a maneuver several times, the trajectory of the helicopter inevitably varied slightly with each flight. But the learning algorithms created by Ng's team were able to discern the ideal trajectory the pilot was seeking. Thus the autonomous helicopter learned to fly the routine better—and more consistently—than Oku himself.
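The intuition behind extracting that ideal trajectory can be sketched simply: if each demonstration is treated as a noisy observation of the same intended path, then averaging the demonstrations already gives a better estimate of that path than any single flight. The team's learning algorithms are considerably more sophisticated, also aligning the flights in time and learning a model of the helicopter's dynamics, but the hypothetical sketch below conveys the basic idea.

```python
import math
import random

# Toy sketch: several noisy "demonstrations" of the same maneuver, averaged to
# estimate the trajectory the pilot intended. This only illustrates the
# intuition; it is not the Stanford algorithm.

random.seed(1)

def ideal_position(t):
    """The (unknown) trajectory the expert is trying to fly."""
    return math.sin(t)

def demonstrate(noise=0.2, steps=100):
    """One imperfect flight: the ideal path plus pilot and wind noise."""
    return [ideal_position(i * 0.1) + random.gauss(0.0, noise)
            for i in range(steps)]

demos = [demonstrate() for _ in range(10)]

# Estimate the intended trajectory as the pointwise mean of the demonstrations.
estimate = [sum(d[i] for d in demos) / len(demos) for i in range(len(demos[0]))]

def rms_error(traj):
    return math.sqrt(sum((traj[i] - ideal_position(i * 0.1)) ** 2
                         for i in range(len(traj))) / len(traj))

print("error of a single demonstration: %.3f" % rms_error(demos[0]))
print("error of the averaged estimate:  %.3f" % rms_error(estimate))
```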
During a flight, some of the necessary instrumentation is mounted on the helicopter, some on the ground. Together, they continuously monitor the position, direction, orientation, velocity, acceleration and spin of the helicopter in several dimensions. A ground-based computer crunches the data, makes quick calculations and beams new flight directions to the helicopter via radio 20 times per second.
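In outline, such a loop is simple: at each 50-millisecond tick it reads the latest state estimate, computes new commands and transmits them. The sketch below is only a schematic of that pacing; the function names are placeholders, not the actual ground-station software.

```python
import time

# Schematic of a fixed-rate (20 Hz) control loop of the kind described above:
# read the latest state estimate, compute new commands, send them by radio.
# read_state_estimate(), compute_commands() and send_over_radio() are
# placeholders supplied by the caller, not a real API.

CONTROL_RATE_HZ = 20
PERIOD = 1.0 / CONTROL_RATE_HZ

def control_loop(read_state_estimate, compute_commands, send_over_radio):
    next_tick = time.monotonic()
    while True:
        state = read_state_estimate()       # position, attitude, velocity, ...
        commands = compute_commands(state)  # new flight directions
        send_over_radio(commands)
        next_tick += PERIOD
        # Sleep until the next 50 ms tick so the loop runs 20 times per second.
        time.sleep(max(0.0, next_tick - time.monotonic()))
```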
The helicopter carries accelerometers, gyroscopes and magnetometers, the latter of which use the Earth's magnetic field to figure out which way the helicopter is pointed. The exact location of the craft is tracked either by a GPS receiver on the helicopter or by cameras on the ground. (With a larger helicopter, the entire navigation package could be airborne.)
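The magnetometer's role can be illustrated with a small calculation: when the craft is level, the horizontal components of the Earth's magnetic field yield the heading directly through an arctangent. The sketch below ignores the tilt compensation and magnetic-declination corrections a real system would apply.

```python
import math

# Minimal sketch of how a magnetometer gives heading: with the craft level,
# the horizontal field components point (roughly) toward magnetic north, so
# heading falls out of an arctangent. Real systems also tilt-compensate using
# the accelerometers and correct for local magnetic declination.

def heading_degrees(mag_x, mag_y):
    """Heading from level-frame magnetometer readings (x forward, y right)."""
    return math.degrees(math.atan2(-mag_y, mag_x)) % 360.0

# Example: field mostly along the body x-axis -> pointing close to north.
print(heading_degrees(0.48, -0.05))   # roughly 6 degrees east of magnetic north
```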
There is interest in using autonomous helicopters to search for land mines in war-torn areas or to map out the hot spots of California wildfires in real time, allowing firefighters to quickly move toward or away from them. Firefighters now must often act on information that is several hours old, Abbeel said.
"In order for us to trust helicopters in these sort of mission-critical applications, it's important that we have very robust, very reliable helicopter controllers that can fly maybe as well as the best human pilots in the world can," Ng said. Stanford's autonomous helicopters have taken a large step in that direction, he said.