Modern agricultural systems produce high-resolution data from remote sensing platforms, in-field sensors, and augmented machinery. However, these datasets often lack contextual information, which hinders their utility in decision support systems and limits their suitability for AI-based modeling. Digital metadata (the who, what, where, when, and how of field operations) are essential for transforming layers of raw data into actionable, interoperable agricultural knowledge. This paper presents Meta Ag, a smartphone-based metadata collection framework designed to improve the accuracy, completeness, and contextual richness of agricultural field records. The Android app integrates automated geofence-based event detection, operator identification, temporal logging, and structured input through dynamic interface elements and data validation. Its modular architecture supports authentication, automatic context generation, real-time validation, and centralized cloud storage. Meta Ag facilitates interoperability by exporting records in CSV, JSON, and RDF (Resource Description Framework) formats. Field evaluations show that the duration captured by Meta Ag differed from the actual recorded duration with a root mean squared error (RMSE) of 24.7 s (range 0 s to 61 s), and that Meta Ag consistently detected all field access events via geofence triggers. These results highlight its effectiveness as a deployable, efficient solution for agricultural metadata collection. By reducing human error and supporting standardized, high-integrity recordkeeping, the Meta Ag framework enables the production of AI-ready metadata critical for digital agriculture applications.
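The geofence-based field access detection summarized above can be illustrated with a minimal sketch: a circular geofence around a field that emits enter/exit events as GPS fixes arrive, from which an operation's duration can be logged. The coordinates, radius, and class names here are illustrative assumptions, not the Meta Ag implementation.

```python
import math
from datetime import datetime, timedelta

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class FieldGeofence:
    """Circular geofence: update() returns ('enter'|'exit', timestamp) on a
    boundary crossing, or None when the inside/outside state is unchanged."""

    def __init__(self, lat, lon, radius_m):
        self.lat, self.lon, self.radius_m = lat, lon, radius_m
        self.inside = False

    def update(self, lat, lon, ts):
        now_inside = haversine_m(self.lat, self.lon, lat, lon) <= self.radius_m
        event = None
        if now_inside and not self.inside:
            event = ("enter", ts)
        elif not now_inside and self.inside:
            event = ("exit", ts)
        self.inside = now_inside
        return event

# Illustrative track: one fix per minute approaching, working in, and leaving
# a hypothetical 150 m field geofence centred at (40.0, -85.0).
fence = FieldGeofence(40.0, -85.0, 150.0)
t0 = datetime(2024, 5, 1, 9, 0, 0)
track = [(40.0100, -85.0), (40.0005, -85.0), (40.0008, -85.0), (40.0020, -85.0)]
events = []
for i, (la, lo) in enumerate(track):
    ev = fence.update(la, lo, t0 + timedelta(minutes=i))
    if ev:
        events.append(ev)
# events now holds one 'enter' and one 'exit'; their timestamp difference
# is the field access duration a metadata record would store.
```

Pairing each detected enter with the next exit yields the event duration that the field evaluation compares against ground truth when reporting the 24.7 s RMSE.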
