A phenology-guided deep learning model for early crop mapping at the field level
Topics: Remote Sensing, Geographic Information Science and Systems, Agricultural Geography
Keywords: remote sensing, deep learning, crop mapping, crop phenology
Session Type: Virtual Paper Abstract
Day: Friday
Session Start / End Time: 2/25/2022 11:20 AM (Eastern Time (US & Canada)) - 2/25/2022 12:40 PM (Eastern Time (US & Canada))
Room: Virtual 38
Authors:
Zijun Yang, University of Illinois at Urbana-Champaign
Chunyuan Diao, University of Illinois at Urbana-Champaign
Abstract
Timely and accurate crop type mapping is critical for a variety of agricultural applications, especially near-real-time applications such as crop yield forecasting. Yet spatial and temporal variations in crop growth patterns pose significant challenges to in-season crop type mapping models that rely on historical data for training. In this study, we propose an innovative modeling strategy that incorporates early in-season crop phenology information, extracted from time-series satellite observations, to construct a robust and scalable model for early crop mapping. The within-season emergence (WISE) algorithm is employed to detect crop phenological change patterns early in the current growing season. The phenology information is then used to normalize crop growth patterns across years and regions. A hybrid deep learning spatiotemporal fusion model is utilized to generate daily satellite images at a 30-m spatial resolution, from which field-level crop growth patterns are extracted. A variety of machine learning and deep learning methods are tested in different years and over different regions for prediction accuracy and model scalability. The phenology-guided modeling design yields promising results for early prediction of crop types over space and time.
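To illustrate the phenology-based normalization idea described in the abstract: a minimal sketch (not the authors' implementation) of re-indexing a per-field vegetation-index time series so that day 0 aligns with a detected emergence date, which removes calendar offsets between years and regions. The function name, the fixed observation window, and the pre-computed emergence day are all assumptions for illustration; in the study, emergence would come from the WISE algorithm.

```python
import numpy as np

def normalize_by_emergence(vi, dates, emergence_day, window=120):
    """Re-index a vegetation-index series onto a phenology-relative
    daily grid, with day 0 at the detected emergence date.
    vi, dates: observed values and their days of year (same length).
    Returns `window` daily values, linearly interpolated."""
    rel = np.asarray(dates) - emergence_day          # days since emergence
    mask = (rel >= 0) & (rel < window)               # keep post-emergence window
    grid = np.arange(window)                         # daily phenology-relative axis
    return np.interp(grid, rel[mask], np.asarray(vi)[mask])

# Toy example: two fields with the same growth curve shape but
# different calendar emergence dates (day 40 vs. day 70).
dates = np.arange(0, 200, 5)                         # 5-day revisit (toy)
vi_a = np.clip((dates - 40) / 60, 0, 1)              # field A emerges ~day 40
vi_b = np.clip((dates - 70) / 60, 0, 1)              # field B emerges ~day 70
a = normalize_by_emergence(vi_a, dates, 40)
b = normalize_by_emergence(vi_b, dates, 70)
# After normalization the two curves coincide on the phenology axis.
```

Aligning curves this way lets a classifier trained on one season's (normalized) growth patterns transfer to another season whose crops emerged earlier or later.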