Natural Resources, School of

 

ORCID IDs

Ayse Kilic

Date of this Version

2011

Citation

Transactions of the ASABE, Vol. 54(1): 67-80

Abstract

We evaluated the performance of four models for estimating soil heat flux density (G) in maize (Zea mays L.) and soybean (Glycine max L.) fields under different irrigation methods (center-pivot irrigated fields at Mead, Nebraska, and a subsurface drip irrigated field at Clay Center, Nebraska) and under rainfed conditions at Mead. Model estimates were compared against measurements made during the growing seasons of 2003, 2004, and 2005 at Mead and of 2005, 2006, and 2007 at Clay Center. We observed a strong relationship between the ratio of G to net radiation (Rn), G/Rn, and the normalized difference vegetation index (NDVI). When a significant portion of the ground was bare soil, G/Rn ranged from 0.15 to 0.30, and it decreased with increasing NDVI. Thus, as the crop grew and developed, the G/Rn ratio declined while NDVI rose. The G/Rn ratio for the subsurface drip irrigated crops was smaller than for the center-pivot irrigated crops. The seasonal average G was 13.1%, 15.2%, 10.9%, and 12.8% of Rn for irrigated maize, rainfed maize, irrigated soybean, and rainfed soybean, respectively. Statistical analyses showed wide variation among the four models: the root mean square error (RMSE) of the predictions ranged from 15 to 81.3 W m-2. Given this wide range of RMSE, we recommend that the models be locally calibrated before they are used for remote estimation of soil heat flux.
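The sketch below illustrates the kind of calculation the abstract describes: estimating G as an NDVI-dependent fraction of Rn and scoring the estimates against plate measurements with RMSE. It is a minimal illustration only; the linear G/Rn form, its coefficients, and the sample values are hypothetical placeholders, not the four models or the data evaluated in the article.

```python
import numpy as np

def estimate_g(rn, ndvi, a=0.30, b=0.25):
    """Estimate soil heat flux G (W m-2) as an NDVI-dependent fraction of Rn.

    Illustrative linear form G/Rn = a - b * NDVI, reflecting the reported
    decrease of G/Rn with increasing NDVI. The coefficients a and b are
    placeholders, not calibrations from the article.
    """
    rn = np.asarray(rn, dtype=float)
    ndvi = np.asarray(ndvi, dtype=float)
    ratio = np.clip(a - b * ndvi, 0.0, None)  # keep the G/Rn ratio non-negative
    return ratio * rn

def rmse(predicted, measured):
    """Root mean square error (W m-2) between modeled and measured G."""
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(np.sqrt(np.mean((predicted - measured) ** 2)))

if __name__ == "__main__":
    # Hypothetical midday values for a developing canopy (not measured data).
    rn = np.array([450.0, 520.0, 560.0, 600.0])       # net radiation, W m-2
    ndvi = np.array([0.25, 0.45, 0.65, 0.85])          # canopy greenness
    g_measured = np.array([110.0, 85.0, 60.0, 40.0])   # soil heat flux plates, W m-2

    g_modeled = estimate_g(rn, ndvi)
    print("Modeled G (W m-2):", np.round(g_modeled, 1))
    print("RMSE (W m-2):", round(rmse(g_modeled, g_measured), 1))
```

In practice, the coefficients of any such G/Rn relationship would be fit to locally measured G and Rn, which is the local calibration the abstract recommends before applying the models for remote estimation of soil heat flux.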
