Thrust 3, with its focus on agricultural response systems, keeps the end goal of improved decision making (both strategic and tactical) at the forefront while dealing realistically with the many data challenges facing agriculture. Agricultural data span a wide range of spatial and temporal scales and modalities. Sensor data come from assorted hardware and software platforms, some commercial and some “research innovations”. These data will not share a common native resolution (i.e., samples will be collected at different locations and time scales) and will carry varying spatial, temporal, and measurement uncertainty. An early challenge is to make these data interoperable with minimal manual intervention, both with other data sources and with biophysical and simulation models.
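One way to keep such heterogeneous data interoperable without forcing them onto a common grid is to carry explicit resolution and uncertainty metadata with every sample. The sketch below is a minimal illustration in plain Python; the `Observation` fields (footprint, timestamp, standard deviation) are assumptions for exposition, not a proposed schema.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One sensor sample kept at its native resolution.

    Rather than resampling every stream onto a shared grid at ingest,
    each observation carries its own spatial footprint, timestamp, and
    uncertainty, so downstream models (or biophysical simulations) can
    consume the data on its own terms.
    """
    value: float        # measured quantity (sensor units)
    x: float            # easting of footprint center (m)
    y: float            # northing of footprint center (m)
    footprint_m: float  # ground footprint width (m)
    time_s: float       # acquisition time (Unix seconds)
    sigma: float        # measurement standard deviation (sensor units)

# Two streams with very different native resolutions and uncertainties:
satellite = Observation(value=0.42, x=500.0, y=250.0,
                        footprint_m=10.0, time_s=1_700_000_000.0, sigma=0.05)
ground_probe = Observation(value=0.47, x=503.2, y=251.1,
                           footprint_m=0.1, time_s=1_700_000_900.0, sigma=0.01)
```

Because the footprint and uncertainty travel with the sample, a fusion method can decide at analysis time how much each observation should contribute, instead of inheriting a fixed rasterization made at ingest.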
Standard approaches to fusing sensor information generally require co-registering all data on a shared rasterized grid. Both image registration and rasterization introduce inaccuracies, which can significantly impede the performance of downstream machine learning methods. To address this issue, we will develop multiple-instance, multiple-resolution sensor fusion techniques that account for spatial and registration uncertainty during analysis and automate alignment across data of varying resolutions and formats.
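The multiple-instance idea can be sketched as follows: rather than committing to one (possibly mis-registered) alignment between a coarse-resolution pixel and the fine-resolution samples beneath it, treat all plausible fine samples as a weighted "bag" and aggregate over the bag. The NumPy example below is a toy sketch under assumed Gaussian registration error; the function names and the simple soft-membership weighting are illustrative, not the project's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def candidate_bag(coarse_xy, fine_xy, fine_values, reg_sigma_m):
    """Collect fine-resolution samples that could plausibly underlie one
    coarse observation, weighting each by a Gaussian model of how likely
    it falls within the coarse footprint given registration uncertainty.
    Returns the retained samples and their normalized weights."""
    d = np.linalg.norm(fine_xy - coarse_xy, axis=1)
    w = np.exp(-0.5 * (d / reg_sigma_m) ** 2)  # soft bag membership
    keep = w > 1e-3
    w = w[keep]
    return fine_values[keep], w / w.sum()

def fuse_bag(values, weights):
    """Multiple-instance style aggregation: pool over the whole bag
    instead of picking a single hard correspondence."""
    mean = float(np.dot(weights, values))
    # within-bag spread serves as a crude alignment-uncertainty estimate
    var = float(np.dot(weights, (values - mean) ** 2))
    return mean, var

# Toy example: one coarse observation over densely sampled fine data,
# with ~1.5 m of assumed registration uncertainty.
fine_xy = rng.uniform(0.0, 10.0, size=(200, 2))
fine_vals = np.sin(fine_xy[:, 0]) + 0.1 * rng.standard_normal(200)
vals, w = candidate_bag(np.array([5.0, 5.0]), fine_xy, fine_vals,
                        reg_sigma_m=1.5)
estimate, uncertainty = fuse_bag(vals, w)
```

A learned variant would replace the fixed Gaussian weights with an attention mechanism trained end to end, but the structural point is the same: registration error enters the model as bag-membership uncertainty rather than as silent error baked into a resampled raster.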