Computer Vision · Small-Object Detection · Spatiotemporal Modeling

STARD-Net

STARD-Net: SpatioTemporal Attention for Robust Detection of Tiny Airborne Objects from Moving Drones

ACM Transactions on Spatial Algorithms and Systems · 2026

Overview

STARD-Net addresses the detection of tiny airborne objects from moving UAV cameras, where targets are small, visually faint, and frequently degraded by background clutter, camera motion, camouflage, and partial occlusion.

Problem Statement

Detecting airborne objects from drone-mounted cameras is difficult: the target may occupy only a few pixels, background clutter can dominate the frame, and motion of both the camera and the target makes the visual evidence unstable from frame to frame.

Motivation

Frame-level detection can fail when the object is faint or partially occluded. STARD-Net uses temporal context and attention-driven feature modeling to improve weak target recovery.

Methodology

The method combines attention-based feature refinement with residual and dilated convolutional feature extraction, and aggregates spatiotemporal context across frames to recover targets whose per-frame evidence is too weak for reliable single-frame detection.
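The paper's implementation is not reproduced here, but the named ingredients can be illustrated in miniature. The pure-Python sketch below uses 1D signals in place of image tensors, and all function names (`dilated_conv1d`, `residual_dilated_block`, `temporal_attention`) are hypothetical simplifications, not the paper's API. It shows how a dilated filter widens the receptive field without extra parameters, how a residual shortcut refines features without discarding the weak input signal, and how softmax-weighted fusion lets frames with stronger target evidence dominate the temporal aggregate.

```python
import math

def dilated_conv1d(x, w, dilation):
    # 'same'-length 1D convolution whose kernel taps are spaced by
    # `dilation`, enlarging the receptive field at no parameter cost.
    k = len(w)
    pad = dilation * (k - 1) // 2
    xp = [0.0] * pad + list(x) + [0.0] * pad
    return [sum(w[j] * xp[i + j * dilation] for j in range(k))
            for i in range(len(x))]

def residual_dilated_block(x, w, dilation):
    # Residual shortcut around dilated conv + ReLU: the refined response
    # is added to the input, so faint evidence is never thrown away.
    y = dilated_conv1d(x, w, dilation)
    return [xi + max(yi, 0.0) for xi, yi in zip(x, y)]

def temporal_attention(frames):
    # Softmax-weighted fusion of per-frame feature vectors: frames whose
    # features carry more energy (stronger target response) get more weight.
    scores = [sum(f_i * f_i for f_i in f) for f in frames]
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    z = sum(weights)
    weights = [w_ / z for w_ in weights]
    fused = [0.0] * len(frames[0])
    for w_, f in zip(weights, frames):
        for i, f_i in enumerate(f):
            fused[i] += w_ * f_i
    return fused

# Toy usage: refine a weak 1D peak, then fuse it with a noisier frame.
frame_a = residual_dilated_block([0.0, 0.2, 1.0, 0.2, 0.0],
                                 [0.25, 0.5, 0.25], dilation=2)
frame_b = [0.1, 0.1, 0.3, 0.1, 0.1]
fused = temporal_attention([frame_a, frame_b])
```

In the full model these operations act on 2D feature maps across a video clip, but the flow is the same: per-frame refinement first, then attention-weighted aggregation over time.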

Key Contributions

  • Spatiotemporal attention for tiny airborne object detection from moving drones.
  • Feature refinement for weak and cluttered visual evidence.
  • Temporal context modeling for improved robustness under motion and occlusion.
  • Evaluation in a UAV-based perception setting with small targets and challenging backgrounds.

Experimental Setup

Input modality: UAV camera imagery and video sequences. Task: tiny airborne object detection under moving-camera conditions. See the paper for dataset details, baselines, metrics, and full evaluation protocol.

Quantitative Results

See the paper for the full quantitative evaluation.

My Role

My contributions include problem formulation, model development, implementation, experimental evaluation, result analysis, and manuscript preparation.

Related Publication

M. H. Rahman and S. Madria. "STARD-Net: SpatioTemporal Attention for Robust Detection of Tiny Airborne Objects from Moving Drones." ACM Transactions on Spatial Algorithms and Systems, 12(1), 1–48, 2026.