Relative State Estimation using Event-Based Propeller Sensing

TL;DR

Relative state estimation using event-based propeller sensing with error under 3%.

cs.RO · 2026-04-20
Ravi Kumar Thakur, Luis Granados Segura, Jan Klivan, Radim Špetlík, Tobiáš Vinklárek, Matouš Vrba, Martin Saska
Event Camera · UAV Relative Localization · Frequency Estimation · Multi-Robot Systems

Key Findings

Methodology

This paper proposes a framework that uses event cameras for quadrotor relative state estimation. The method detects propellers in the event stream to extract regions of interest, then processes these regions to estimate per-propeller frequencies. These frequency measurements drive a kinematic state estimation module as a thrust input, while camera-derived position measurements provide the update step. Additionally, geometric primitives derived from the event stream are used to estimate the quadrotor's orientation: an ellipse is fitted over a propeller and backprojected to recover the body-frame tilt axis.
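The estimator structure described above (frequency-derived thrust in the prediction step, camera position in the update step) can be sketched as a one-dimensional Kalman filter along the vertical axis. Everything concrete here, the thrust coefficient `K_F`, mass, noise levels, and the `predict`/`update` helpers, is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

# Illustrative constants (NOT values from the paper).
K_F = 3.0e-4   # assumed thrust coefficient [N / (rev/s)^2], chosen so 4 props at ~90 Hz hover
MASS = 1.0     # assumed quadrotor mass [kg]
G = 9.81       # gravity [m/s^2]
DT = 0.01      # filter step [s]

def predict(x, P, prop_freqs_hz, Q=None):
    """Propagate state [position, velocity] using propeller frequencies as thrust input."""
    if Q is None:
        Q = 1e-3 * np.eye(2)
    thrust = K_F * np.sum(np.square(prop_freqs_hz))   # F = k_f * sum_i f_i^2
    accel = thrust / MASS - G                         # net vertical acceleration
    F = np.array([[1.0, DT], [0.0, 1.0]])             # constant-acceleration kinematics
    B = np.array([0.5 * DT**2, DT])                   # input matrix for accel
    return F @ x + B * accel, F @ P @ F.T + Q

def update(x, P, z_pos, R=1e-2):
    """Correct the prediction with a camera-derived position measurement."""
    H = np.array([[1.0, 0.0]])                        # we observe position only
    y = z_pos - H @ x                                 # innovation
    S = H @ P @ H.T + R                               # innovation covariance (1x1)
    K = P @ H.T / S                                   # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# One predict/update cycle with 4 propellers near hover.
x, P = np.zeros(2), np.eye(2)
x, P = predict(x, P, prop_freqs_hz=np.full(4, 90.0))
x, P = update(x, P, z_pos=np.array([0.05]))
```

The same split generalizes to the full 3-D case: the frequency-derived thrust vector enters the process model, while camera detections supply the measurement model.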

Key Results

  • On a test dataset of five real-world outdoor flight sequences, our approach estimates propeller frequency with an error under 3%.
  • Compared to traditional frame-based camera methods, event cameras offer significant advantages in dynamic range and temporal resolution, especially in visually challenging environments.
  • By utilizing event cameras for decentralized relative localization, the method supports swarm flight in multi-robot systems while reducing reliance on inter-robot communication.
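Per-propeller frequency estimation from an event stream can be sketched generically: bin the timestamps of events inside a propeller's region of interest into a rate signal, then take the strongest FFT peak. This is a plain spectral sketch under assumed parameters (chunk length, bin width, synthetic data), not the paper's chunk-processing method:

```python
import numpy as np

def dominant_frequency(event_ts, chunk_len=0.1, bin_dt=1e-4):
    """Estimate the dominant periodic rate in one temporal chunk of event timestamps.

    Histograms events into a rate signal and returns the strongest FFT peak.
    Note: a two-blade propeller produces events at twice its rotation frequency,
    so a blade-count correction would be applied downstream.
    """
    t0 = event_ts[0]
    ts = event_ts[(event_ts >= t0) & (event_ts < t0 + chunk_len)]
    n_bins = int(round(chunk_len / bin_dt))
    rate, _ = np.histogram(ts - t0, bins=n_bins, range=(0.0, chunk_len))
    spectrum = np.abs(np.fft.rfft(rate - rate.mean()))   # remove DC before peak-picking
    freqs = np.fft.rfftfreq(n_bins, d=bin_dt)
    return freqs[np.argmax(spectrum)]

# Synthetic check: bursts of events recurring at a 180 Hz blade-pass rate.
rng = np.random.default_rng(0)
true_hz = 180.0
spike_times = np.arange(0.0, 0.1, 1.0 / true_hz)
events = np.sort(np.concatenate(
    [spike_times + 1e-3 * rng.random(spike_times.size) for _ in range(20)]))
est = dominant_frequency(events)   # est lands within one 10 Hz bin of 180 Hz
```

With a 0.1 s chunk the frequency resolution is 10 Hz; shorter chunks trade resolution for latency, which is the knob a real-time estimator would tune.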

Significance

This research is significant for relative state estimation in UAV swarm flight. Traditional frame-based camera methods often perform poorly in dynamic environments, suffering from scale ambiguity and visual challenges. Event cameras offer the advantage of low latency and high dynamic range, excelling in visually challenging conditions. This method provides a decentralized approach to relative localization in multi-robot systems, reducing reliance on communication and enhancing system robustness.

Technical Contribution

The technical contribution of this paper lies in applying event cameras to quadrotor relative state estimation through a method that combines propeller frequencies with geometric primitives. Unlike existing methods, this approach links propeller frequencies directly to a physical dynamics model for state estimation.

Novelty

This study is the first to apply event cameras for quadrotor relative state estimation, particularly in detecting and utilizing propeller frequencies. Unlike previous methods based on simulated flight sequences, our approach is validated in real-world flight sequences.

Limitations

  • During fast lateral motion, propellers may move out of the camera's field of view, leading to detection failures.
  • Event cameras are sensitive to background noise, which may affect the accuracy of frequency estimation.
  • In complex environments, the system's real-time performance may be limited.

Future Work

Future research directions include extending event-based vision to multi-UAV scenarios and extracting features from periodic events and geometric primitives to estimate relative states.

AI Executive Summary

Accurate and fast relative state estimation is crucial for UAV swarm flight. However, traditional monocular frame-based camera methods often perform poorly in dynamic environments, suffering from scale ambiguity and visual challenges; event cameras offer new possibilities for addressing these issues. This paper proposes a framework that uses event cameras for quadrotor relative state estimation. The method detects propellers in the event stream to extract regions of interest, then processes these regions to estimate per-propeller frequencies. These frequency measurements drive a kinematic state estimation module as a thrust input, while camera-derived position measurements provide the update step. Additionally, geometric primitives derived from the event stream estimate the quadrotor's orientation: an ellipse is fitted over a propeller and backprojected to recover the body-frame tilt axis. On a test dataset of five real-world outdoor flight sequences, the approach estimates propeller frequency with under 3% error. Compared to traditional frame-based cameras, event cameras offer significant advantages in dynamic range and temporal resolution, especially in visually challenging environments, and enable decentralized relative localization for multi-robot systems. This research provides a new approach to relative state estimation in UAV swarm flight, reducing reliance on communication and enhancing system robustness. Future directions include extending event-based vision to multi-UAV scenarios and extracting features from periodic events and geometric primitives to estimate relative states.

Deep Analysis

Background

UAV swarm flight has significant applications in fields such as humanitarian assistance, disaster relief, and space exploration. To achieve these applications, UAVs need to accurately estimate each other's states for coordinated flight and collision avoidance. Traditional methods rely on frame-based RGB cameras, which often perform poorly in dynamic environments due to low temporal resolution, high latency, and limited dynamic range. Event cameras offer new possibilities for addressing these issues: they are low-power sensors that detect per-pixel brightness changes at microsecond-level temporal resolution, generating asynchronous event streams. Due to these advantages, event cameras are widely used in autonomous navigation tasks with dynamic obstacles.

Core Problem

UAV swarm flight requires accurate and fast relative state estimation for coordinated flight and collision avoidance. Traditional frame-based camera methods often perform poorly in dynamic environments, suffering from scale ambiguity and visual challenges. The advent of event cameras offers new possibilities for addressing these issues. Event cameras detect changes in brightness in the scene at microsecond-level temporal resolution, generating asynchronous event streams.

Innovation

This paper proposes a framework using event cameras for quadrotor relative state estimation:

  • Propellers are detected in the event stream to extract regions of interest, which are processed to estimate per-propeller frequencies.
  • Geometric primitives derived from the event stream estimate the quadrotor's orientation: an ellipse is fitted over a propeller and backprojected to recover the body-frame tilt axis.
  • Unlike existing methods, propeller frequencies are linked directly to a physical dynamics model for state estimation, combining frequency and geometric cues.

Methodology

The framework processes the event stream in three stages:

  • Detection: propellers are detected in the event stream to extract regions of interest.
  • Frequency estimation: the event streams in these regions are processed in temporal chunks to estimate per-propeller frequencies, which drive the kinematic state estimation module as a thrust input.
  • Orientation: an ellipse is fitted over a propeller and backprojected to recover the body-frame tilt axis, linking the frequency and geometric cues to a physical dynamics model for state estimation.
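The orientation step can be illustrated with a small geometric sketch: fit a conic to points sampled from the propeller's projected disc, then read the tilt angle and image-plane tilt-axis direction off the ellipse's axes. The fit below is a generic SVD conic solve (a stand-in, not the Fitzgibbon ellipse-specific method) and assumes a weak-perspective view, under which the minor/major axis ratio equals the cosine of the tilt:

```python
import numpy as np

def fit_ellipse(x, y):
    """Least-squares conic fit via SVD; assumes the points already form an ellipse."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)
    conic = Vt[-1]                      # [A, B, C, D, E, F], up to scale and sign
    if conic[0] + conic[2] < 0:         # normalize sign so the quadratic
        conic = -conic                  # part is positive definite
    return conic

def tilt_from_conic(conic):
    """Tilt angle (deg) and image-plane tilt-axis direction from the fitted conic."""
    A, B, C = conic[:3]
    M = np.array([[A, B / 2.0], [B / 2.0, C]])   # quadratic part (translation-invariant)
    evals, evecs = np.linalg.eigh(M)             # ascending eigenvalues
    ratio = np.sqrt(evals[0] / evals[1])         # minor/major axis ratio = cos(tilt)
    tilt_deg = np.degrees(np.arccos(np.clip(ratio, 0.0, 1.0)))
    major_axis = evecs[:, 0]                     # major axis lies along the tilt axis
    return tilt_deg, major_axis

# Synthetic check: a unit circle tilted 30 deg about the x-axis, viewed
# orthographically, projects to an ellipse with axis ratio cos(30 deg).
phi = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
pts_x = np.cos(phi)
pts_y = np.sin(phi) * np.cos(np.radians(30.0))
tilt, axis_dir = tilt_from_conic(fit_ellipse(pts_x, pts_y))
# tilt is close to 30.0; axis_dir is close to ±[1, 0]
```

The paper backprojects the full ellipse through the camera model rather than using this weak-perspective shortcut, but the recovered quantity, the body-frame tilt axis, is the same.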

Experiments

The experiments were conducted outdoors using two quadrotors in a leader-follower formation. The follower was equipped with a downward-facing event camera to capture the target UAV's propellers. A total of six flight sequences were recorded in the outdoor environment. The observer moved at nearly fixed altitude while the target performed various maneuvers beneath it. In some trials, the target performed aggressive lateral motions. In all flight sequences, the target's four propellers were visible, except for a few instances where they went out of the frame.

Results

On a test dataset of five real-world outdoor flight sequences, our approach estimates propeller frequency with an error under 3%. Compared to traditional frame-based camera methods, event cameras offer significant advantages in dynamic range and temporal resolution, especially in visually challenging environments. By utilizing event cameras for decentralized relative localization, our method achieves more efficient swarm flight in multi-robot systems.

Applications

This method has significant applications in UAV swarm flight, such as humanitarian assistance, disaster relief, and space exploration. By utilizing event cameras for decentralized relative localization, our method achieves more efficient swarm flight in multi-robot systems.

Limitations & Outlook

During fast lateral motion, propellers may move out of the camera's field of view, leading to detection failures. Event cameras are sensitive to background noise, which may affect the accuracy of frequency estimation. In complex environments, the system's real-time performance may be limited.

Plain Language (accessible to non-experts)

Imagine you're in a busy kitchen where chefs need to coordinate quickly to ensure every dish is served on time. Traditional cameras are like a photographer who takes a picture every second, capturing each chef's actions, but in a fast-changing environment, this method might miss a lot of details. An event camera, on the other hand, is like a keen observer who records every time there's movement. This way, even during the busiest moments in the kitchen, it captures every key action. Through this approach, event cameras help UAVs coordinate better in complex environments, just like chefs perfectly coordinating in a kitchen.

ELI14 (explained like you're 14)

Imagine you're playing a team-based video game where each player needs to know their teammates' positions and actions to complete missions together. Traditional cameras are like a map that updates every so often, which might make you miss some important information. An event camera, however, is like a real-time updating map that notifies you every time there's a change. This way, you can better track your teammates' actions, ensuring smooth teamwork. That's what event cameras do in UAV flight, helping drones coordinate better, just like you and your teammates perfectly coordinating in a game.

Glossary

Event Camera

A sensor that detects changes in brightness in the scene, generating asynchronous event streams. Unlike traditional cameras, event cameras operate at microsecond-level temporal resolution, suitable for dynamic environments.

Used for UAV relative state estimation, providing low-latency and high dynamic range visual information.

Quadrotor UAV

A type of UAV equipped with four rotors, commonly used in research and applications. Quadrotors offer good stability and maneuverability, suitable for various tasks.

Used as the research subject to validate the application of event cameras in relative state estimation.

Relative State Estimation

In multi-robot systems, estimating the position and orientation of one robot relative to others. Relative state estimation is crucial for coordinated flight and collision avoidance.

In this paper, achieved for UAVs by combining event-camera position measurements with propeller-frequency detection.

Frequency Estimation

Determining the frequency of a signal by analyzing its periodic changes. In this paper, frequency estimation is used to detect the rotation speed of propellers.

In this paper, propeller frequencies estimated from the event stream drive the kinematic state estimation module.

Decentralized

A system architecture where components can operate independently without relying on central control. Decentralized systems are typically more robust and flexible.

In this paper, event cameras enable decentralized relative localization between UAVs, reducing reliance on communication.

Kinematic State Estimation

Estimating an object's motion state, including position, velocity, and acceleration. Kinematic state estimation is crucial for precise navigation and control.

In this paper, propeller frequencies serve as the thrust input that drives the kinematic state estimation module.

Geometric Primitives

In computer vision, geometric primitives refer to simple geometric shapes like points, lines, and planes used to describe complex objects' shapes.

Used to estimate quadrotor orientation by fitting an ellipse.

Ellipse Fitting

A mathematical method for finding the best-fitting ellipse for a given set of points. In computer vision, ellipse fitting is often used for shape recognition and orientation estimation.

Used to fit the propeller's projected disc in the event stream as part of orientation estimation.

Backprojection

A method for mapping points on an image plane back to 3D space. In computer vision, backprojection is used to recover the 3D structure of objects.

Used to recover the quadrotor's body-frame tilt axis from the ellipse fitted over a propeller.
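A minimal pinhole-camera backprojection looks like this; the intrinsic matrix `K` below uses illustrative values, not a calibration from the paper:

```python
import numpy as np

# Illustrative pinhole intrinsics (focal lengths fx = fy = 600 px,
# principal point at 320, 240). NOT the paper's calibration.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

def backproject(u, v, K=K):
    """Return the unit 3-D viewing ray through pixel (u, v) in the camera frame."""
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))  # K^{-1} [u, v, 1]^T
    return ray / np.linalg.norm(ray)

r = backproject(320.0, 240.0)   # principal point maps to the optical axis [0, 0, 1]
```

Backprojecting the fitted ellipse point-by-point in this way yields a cone of rays whose geometry constrains the propeller disc's 3-D orientation.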

Dynamic Range

The ratio of the maximum to minimum signal intensity a sensor can detect. High dynamic range sensors can provide clear images in both bright and dark conditions.

Event cameras provide high dynamic range, suitable for visually challenging environments.

Open Questions (unanswered questions from this research)

  • The performance of event cameras in high-background-noise scenarios requires further research: current methods may be affected by background noise, leading to inaccurate frequency estimation, and more robust algorithms are needed.
  • How to effectively coordinate relative state estimation among multiple UAVs remains open; current methods focus on estimating the state of a single observed UAV, and future work needs to extend to multi-UAV systems.
  • Real-time performance in complex, rapidly changing environments may be limited; more efficient algorithms are needed.
  • The accuracy of propeller frequency estimation may degrade with viewing angle and occlusion; improving accuracy under these conditions requires further study.
  • How best to exploit event cameras for decentralized relative localization in swarm flight still requires research; current methods reduce communication reliance, but further optimization is needed.

Applications

Immediate Applications

UAV Formation Flight

By using event cameras for decentralized relative localization, UAVs can achieve more efficient formation flight in complex environments, reducing communication reliance.

Disaster Relief

In disaster scenarios, UAVs can use event cameras for rapid localization and navigation, providing real-time spatial information to support rescue operations.

Space Exploration

In space exploration missions, UAVs can use event cameras for autonomous navigation, reducing reliance on orbiter spacecraft and increasing mission flexibility.

Long-term Vision

Smart City Surveillance

In smart cities, UAVs can use event cameras for real-time monitoring and data collection, supporting city management and security.

Agricultural Monitoring

In agriculture, UAVs can use event cameras for crop monitoring and data analysis, improving agricultural production efficiency and precision.

Abstract

Autonomous swarms of multiple Unmanned Aerial Vehicles (UAVs) require accurate and fast relative state estimation. Although monocular frame-based camera methods perform well in ideal conditions, they are slow, suffer from scale ambiguity, and often struggle in visually challenging conditions. Event cameras address these challenges by providing low latency, high dynamic range, and microsecond-level temporal resolution. This paper proposes a framework for relative state estimation of quadrotors using event-based propeller sensing. Propellers in the event stream are tracked by detection to extract regions of interest. The event streams in these regions are processed in temporal chunks to estimate per-propeller frequencies. These frequency measurements drive a kinematic state estimation module as a thrust input, while camera-derived position measurements provide the update step. Additionally, we use geometric primitives derived from event streams to estimate the orientation of the quadrotor by fitting an ellipse over a propeller and backprojecting it to recover the body-frame tilt axis. Existing event-based approaches to quadrotor state estimation use propeller frequency only in simulated flight sequences. Our approach estimates propeller frequency with under 3% error on a test dataset of five real-world outdoor flight sequences, providing a method for decentralized relative localization in multi-robot systems using event cameras.

Categories: cs.RO · cs.CV · eess.SY
