George, D.A., Keogh, D.U., Buckley, D. and Mavi, H. (2003) Theoretical framework for applied climate education: 3. Evaluating applied climate research, development and extension processes and their outcomes in agriculture. (QI03040). pp. 1-26.
Australian research, development and extension (RD&E) programs and projects are increasingly being asked to justify their activities based on the outcomes of dollars spent (Collins 1996). Evaluating only outputs, using indicators such as the number of workshops held, the number of attendees and the papers written, is no longer seen as sufficient or relevant, because such measures provide little indication of RD&E impact to funding bodies, managers, extension agents or project staff (White 2000; Robertson and White 1996). What impact has disseminated applied climate research had on target-audience management and decision-making? Has it been useful? What areas of applied climate research, development or extension need improvement or new directions?
Evaluating applied climate RD&E in Australia is a relatively new philosophy and management tool. Benefits of the process can include enhancing program or project success and client focus, developing new ideas, and gathering evidence of the impact and application of research to real-world decisions. Deciding what, who, how and when to evaluate in order to collect reliable and accurate information requires effort and creative planning, and may involve assessing outcomes in terms of changes in producer knowledge, attitudes, aspirations, skills, practices and end results. For example, what information is important to producer decisions, how do producers want to access it and how often, and what scientific concepts do they need to understand to interpret and apply climate-based information? Do we know what prerequisite skills and knowledge are necessary to use applied climate information effectively? Evaluation activities may range from a simple approach, such as a 'five question, one-page' workshop evaluation sheet or presenting a range of resource management research activities on a laptop computer at farmers' kitchen tables (Carroll et al. 2001), to rigorous surveys using random sampling (Keogh et al. 2002b) or a census of climate workshop participants (Colmar Brunton 1999a, 1999b). These are some general considerations to bear in mind when evaluating climate RD&E.
This paper provides background on current applied climate RD&E evaluation. It presents a selection of recent evaluation case studies in Australia and general guidelines for designing internal RD&E evaluations to help improve program and project performance or gather evidence of impact. An approach to consider when developing an evaluation is also included. The paper assumes the reader already agrees with the proposition that evaluation is essential. A survey template used to evaluate climate workshops, based on Bennett's hierarchy (Bennett 1976), is presented in Appendix 1.
This is an edited version of the paper presented at the International Conference on Applying Seasonal Climate Forecasts in Agriculture, 24-26 September 2002, Tamil Nadu Agricultural University, Tamil Nadu, India.
Additional Information: CD © The State of Queensland, Department of Primary Industries & Fisheries. Copyright protects this publication. Except for purposes permitted by the Copyright Act 1968, reproduction by whatever means is prohibited without prior written permission of the Department of Primary Industries & Fisheries, Queensland.

Keywords: Climate research; extension; evaluation.

Subjects: Agriculture > Agriculture (General) > Agricultural meteorology. Crops and climate
Agriculture > Agriculture (General) > Agricultural education > Agricultural extension work

Deposited On: 11 Dec 2003
Last Modified: 15 Sep 2010 02:03