Department of Biological Systems Engineering


Document Type

Article

Date of this Version

2001

Comments

Published in Transactions of the ASAE 44:1 (2001), pp. 45–52. Copyright 2001 American Society of Agricultural Engineers. Used by permission.

Abstract

Heavy reliance on chemical weed control in field crops of South Central Nebraska has resulted in atrazine concentrations exceeding established drinking-water standards. Our objective was to evaluate best management practices for reducing atrazine runoff under the tillage and herbicide management practices common to the region under study. Field experiments were performed to measure edge-of-field atrazine and water loss from disk-till, ridge-till, and slot plant (no-till) management systems. Results indicated less water runoff from no-till (34% less) and ridge-till (36% less) than from disk-till. Atrazine loss was similarly lower: 24% less for no-till and 17% less for ridge-till than for disk-till. GLEAMS (Groundwater Loading Effects of Agricultural Management Systems) simulations were calibrated using field-measured inputs and verified against observed data from two independent sites. Fifteen combinations of herbicide application and tillage practices were simulated using 50 years of rainfall data. Compared to pre-emergent broadcast + post application on corn with disk-till, annual reductions in simulated atrazine mass loss for the alternative practices ranged from 17% to 77%. The percentage of atrazine lost annually ranged from 0.57% to 1.2%. During the 50-year simulation, the losses from 7 to 10 individual years accounted for more than 50% of the cumulative 50-year loss for broadcast and banded applications. Based on recurrence interval evaluation, pre-emergent incorporation and pre-emergent banding were most effective at reducing long-term atrazine losses.
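The recurrence-interval evaluation of the 50 simulated annual losses can be illustrated with a minimal sketch. The Python snippet below is an assumption for illustration only (the placeholder loss data, the Weibull plotting position T = (n + 1) / m, and all variable names are not taken from the paper); it ranks annual losses and shows how a small number of high-loss years can dominate the cumulative total.

```python
import numpy as np

# Hypothetical 50 years of simulated annual atrazine losses (placeholder data,
# not results from the paper).
rng = np.random.default_rng(0)
annual_loss = rng.lognormal(mean=-3.0, sigma=1.0, size=50)

# Weibull plotting position: recurrence interval T = (n + 1) / m,
# where m is the rank of a year when losses are sorted largest to smallest.
order = np.argsort(annual_loss)[::-1]
ranks = np.empty_like(order)
ranks[order] = np.arange(1, annual_loss.size + 1)
recurrence_interval = (annual_loss.size + 1) / ranks  # in years

# How many of the largest-loss years account for >50% of the cumulative loss.
sorted_losses = np.sort(annual_loss)[::-1]
cumulative_share = np.cumsum(sorted_losses) / sorted_losses.sum()
years_to_half = int(np.searchsorted(cumulative_share, 0.5)) + 1
print(f"{years_to_half} years account for >50% of the cumulative 50-year loss")
```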
