Malakhi Hopkins1, Alice Li1, Shobhita Kramadhati1, Jackson Arnold2, Akhila Mallavarapu1, Chavez Lawrence1, Varun Murali1, Sanjeev J Koppal2, Cherie R Kagan1, Vijay Kumar1
1GRASP Laboratory, University of Pennsylvania, Pennsylvania, USA
2University of Florida, Florida, USA
3Amazon Robotics
Sanjeev J. Koppal holds concurrent appointments as an Associate Professor of ECE at the University of Florida and as an Amazon Scholar at Amazon Robotics. This paper describes work performed at the University of Florida and is not associated with Amazon.
Common remote sensing modalities (RGB, multi-spectral, or hyperspectral imaging, and LiDAR) measure crop health only indirectly and do not capture plant stress indicators directly. Commercially available direct leaf sensors are bulky, powered electronics that are expensive and interfere with crop growth. In contrast, low-cost, passive, and biodegradable leaf sensors offer an opportunity to advance real-time monitoring, as they interface directly with the crop surface without interfering with crop growth. To this end, we co-design a sensor-detector system for precision agriculture, in which the sensor is a passive colorimetric leaf sensor that directly measures crop health, and the detector autonomously obtains optical signals from these leaf sensors. The detector comprises a low size, weight, and power (SWaP) mobile ground robot with an onboard monocular RGB camera and object detector to localize each leaf sensor, as well as a hyperspectral camera with a motorized mirror and a halogen light to acquire hyperspectral images. The sensor's crop-health-dependent optical signals can be extracted from these hyperspectral images. The proof-of-concept system is demonstrated in row-crop environments, both indoors and outdoors, where it autonomously navigates, locates, and obtains a hyperspectral image of every leaf sensor present, acquiring interpretable spectral resonances with 80% accuracy within the required retrieval distance from the sensor.
The proposed system consists of three key components. First, a colorimetric leaf sensor, capable of producing a measurable spectral resonance, is affixed directly to the surface of a plant. Second, a ground robot autonomously navigates through the row-crop environment, using an onboard RGB camera and object detector to locate the sensor. Finally, an onboard hyperspectral imaging system, integrating a motorized mirror and halogen illumination, captures detailed spectral data at the sensor’s location, enabling accurate retrieval of the sensor’s optical signature.
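The last step, recovering the sensor's optical signature from a hyperspectral image, can be illustrated with a minimal sketch. The function and data layout below are assumptions for illustration only (the paper does not specify its processing pipeline): given a hyperspectral cube and a detected bounding box, average the spectra over the sensor's pixels and report the wavelength of the strongest reflectance peak.

```python
import numpy as np

def extract_resonance(cube, wavelengths, bbox):
    """Average the spectrum over a detected sensor's bounding box and
    return the wavelength of the strongest reflectance peak.

    cube: (H, W, B) hyperspectral image; bbox: (x0, y0, x1, y1).
    Names and data layout are illustrative, not the paper's actual API.
    """
    x0, y0, x1, y1 = bbox
    roi = cube[y0:y1, x0:x1, :]                    # pixels covering the sensor
    spectrum = roi.reshape(-1, cube.shape[2]).mean(axis=0)
    peak_band = int(np.argmax(spectrum))           # band with maximum response
    return wavelengths[peak_band], spectrum

# Synthetic example: a Gaussian resonance centered at 650 nm
wavelengths = np.linspace(400, 1000, 121)          # 5 nm band spacing
resonance = np.exp(-((wavelengths - 650.0) ** 2) / (2 * 20.0 ** 2))
cube = np.tile(resonance, (32, 32, 1))
peak_nm, spectrum = extract_resonance(cube, wavelengths, (8, 8, 24, 24))
```

On the synthetic cube above, the recovered peak is 650 nm; on real data, spatial averaging over the bounding box helps suppress per-pixel sensor noise before peak finding.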
Experiments were conducted across three distinct environments: a structured indoor environment, an unstructured outdoor environment, and a structured outdoor environment.
An object detector was trained to reliably identify the colorimetric leaf sensors across all experimental environments.
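One practical detail of using such a detector is post-processing its raw outputs before aiming the imaging system. A minimal, hypothetical sketch (the detection tuple format and threshold are assumptions, not the paper's implementation): discard low-confidence detections and keep the single most confident leaf-sensor box.

```python
def best_detection(detections, min_conf=0.5):
    """Return the highest-confidence detection above a threshold, or None.

    detections: list of (x0, y0, x1, y1, confidence) tuples; this format
    is illustrative, not the paper's actual detector output.
    """
    kept = [d for d in detections if d[4] >= min_conf]
    if not kept:
        return None
    return max(kept, key=lambda d: d[4])

# Example: two candidate boxes, one weak and one strong
dets = [(10, 12, 40, 44, 0.31), (100, 80, 130, 110, 0.92)]
best = best_detection(dets)
```

The center of the selected box could then serve as the target for steering the motorized mirror toward the sensor.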
@misc{hopkins2025roboticmonitoringcolorimetricleaf,
  title={Robotic Monitoring of Colorimetric Leaf Sensors for Precision Agriculture},
  author={Malakhi Hopkins and Alice Kate Li and Shobhita Kramadhati and Jackson Arnold and Akhila Mallavarapu and Chavez Lawrence and Varun Murali and Sanjeev J. Koppal and Cherie Kagan and Vijay Kumar},
  year={2025},
  eprint={2505.13916},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2505.13916},
}