Using Hands-on Activities to Teach Land Application of Manure
Johnson, L. J. 2019. Using Hands‐On Activities to Teach Land Application of Manure. In Proceedings of Waste to Worth 2019. April 22–26, 2019. Minneapolis, MN. https://lpelc.org/using-hands-on-activities-to-teach-land-application-of-manure
Many states have regulations requiring education for livestock producers and manure applicators. Adults who must attend these programs are often there solely to fulfill requirements and are not willing learners. While regulations may specify topics that must be addressed, most do not spell out teaching methods for these educational programs. It is well known that active learning promotes better retention of material. In Nebraska, however, these programs have traditionally combined pre-recorded and live PowerPoint presentations, which are easier to develop and easier for educators who may not be manure experts to host. In recent years, the Nebraska Animal Manure Management team has been working to make its manure training program more interactive. This workshop highlights hands-on activities related to odor management, stockpiling and transporting manure, and equipment calibration. Audience members are encouraged to bring examples of hands-on activities they are using to share with others.
Purpose

Figure: One example of an interactive teaching tool. Shoebox calibration kits allow participants to simulate a manure calibration in the classroom.
The objective of this workshop is to encourage idea-sharing and collaboration in developing activities and teaching techniques that improve manure-related programming across state lines.
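As a sketch of the arithmetic behind a calibration activity like the shoebox kits, the tarp method common in extension programming weighs manure caught on a tarp of known area and scales it to an application rate. The function name and example numbers below are illustrative assumptions, not taken from the workshop materials:

```python
def application_rate_tons_per_acre(collected_lb, tarp_area_ft2):
    """Convert manure collected on a tarp of known area to tons per acre.

    Standard tarp-calibration arithmetic: pounds per square foot scaled
    up to an acre (43,560 ft^2), then pounds converted to tons (2,000 lb).
    """
    lb_per_ft2 = collected_lb / tarp_area_ft2
    return lb_per_ft2 * 43_560 / 2_000

# Illustrative example: 12 lb collected on a 10 ft x 10 ft tarp
rate = application_rate_tons_per_acre(12, 100)  # about 2.6 tons/acre
```

In a classroom kit, the "tarp" and "spreader pass" are miniaturized, but participants work through the same unit conversions, which is the point of the hands-on exercise.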
Lessons Learned in Nebraska in 2019

Hands-on activities have enhanced our programming in Nebraska by increasing participation during training events. Participants can no longer sit back and watch videos (or pretend to watch them). While we do not require testing to receive certification, the evaluation results suggest real improvement in the program. We received more written feedback in the "comments" section of the evaluation and often received praise for the instructors, which we had never gotten before. For most of the activities we changed substantially, about 20% more attendees selected moderately high to significant knowledge improvement (3 or 4 on a scale of 0 to 4) compared with the previous year's evaluation results. Because we also saw an average 13% improvement for activities that were not drastically changed, this result may be skewed, but it is still an interesting change. The data raise the question of whether the increased interaction between and among participants and instructors led to higher marks overall because participants were generally more satisfied with the program, even with the parts that were not changed.