The Program Evaluation KnowledgeBase is an online resource that helps education professionals understand the basics of program evaluation so they can properly assess projects and programs. It is organized around three elements to assist educators with their program evaluation.
Purpose: Planning how to conduct a program evaluation is the essential first step. The preparatory thinking involves understanding the program being evaluated, organizing an evaluation team, and determining how to conduct the evaluation. Element 1 outlines the pre-planning tasks.
Purpose: Conducting the evaluation involves designing data collection so that the analysis and interpretation will answer the questions the evaluation sets out to resolve. When developing and implementing the evaluation design, be flexible enough to collect and analyze data from many perspectives. The data collected should be aligned with the evaluation questions. Element 2 outlines the tasks associated with implementing the evaluation.
Guideline: Once the evaluation is completed, the next task is to compile the findings and recommendations report. Doing so is necessary to have a document of record to communicate with constituency groups and to monitor performance. The evaluation team might assign the task to a subgroup or designate one individual to create a draft document for review. Because many constituencies may view the report, it is helpful to use an easy-to-follow format and minimize the use of technical language and buzzwords.
This tool from The Community Toolbox is an example of the type of content found in a typical evaluation report.
Edward Tufte, author of The Visual Display of Quantitative Information, suggests eight practices to consider to ensure the accurate representation of visual data.
This resource describes, in "plain English, some basic concepts in statistics that every writer should know."
The Fog Index was developed by Robert Gunning to measure how hard a piece of writing is to read. His Fog Index, presented in The Technique of Clear Writing (McGraw-Hill), is considered the most reliable formula for testing your writing. It is not an index of how good your writing is, but of how easy it is to understand. Using the index, grant administrators can test the communications they send to constituent groups.
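The Fog Index is computed as 0.4 × (average sentence length + percentage of words with three or more syllables). A minimal Python sketch is below; note that the syllable counter here is a rough vowel-group heuristic for illustration, not Gunning's exact counting rules, so scores may differ slightly from hand-computed values.

```python
import re

def count_syllables(word):
    """Rough syllable estimate: count vowel groups (heuristic, not exact)."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    # Drop a likely-silent trailing 'e' (e.g. "make"), but keep "-le"/"-ee" endings.
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def fog_index(text):
    """Gunning Fog Index: 0.4 * (avg sentence length + % of complex words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if count_syllables(w) >= 3]
    avg_sentence_len = len(words) / len(sentences)
    pct_complex = 100 * len(complex_words) / len(words)
    return 0.4 * (avg_sentence_len + pct_complex)

sample = "The evaluation team compiled the findings. The report was clear."
print(round(fog_index(sample), 1))
```

The result roughly corresponds to the years of schooling a reader needs to follow the text on first reading, which is why shorter sentences and simpler words lower the score.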
This six-page document is one of eighteen lectures based on the book Educational Research: Quantitative, Qualitative, and Mixed Approaches. The lectures were written by the author as a supplement to the text. This lecture describes the major parts of a research report and offers advice on writing the report itself.
This link leads to the companion website for the book Just Plain Data Analysis, authored by Gary Klass, Department of Politics and Government, Illinois State University. The website addresses the research design, data collection, data analysis, and data presentation employed in empirical social science research.
Purpose: The evaluation's findings and recommendations have limited value unless they are shared with the stakeholders and used to improve the evaluated program. Using the results to improve the evaluated program and communicating with constituencies are activities that occur in parallel. Element 3 outlines the tasks associated with using the results.