Multisite Evaluation Practice: Lessons and Reflections From Four Cases

Multisite Evaluation Practice: Lessons and Reflections From Four Cases

Multisite evaluation settings differ from the single-site settings common to research on evaluation use. In addition to the primary intended users, there is another important group of potential evaluation users in settings where government agencies or large national or international foundations fund multisite projects: project leaders and local evaluators. If each project site is expected to take part in or support the overall program evaluation, then these individuals frequently serve as links between their projects and the larger cross-project evaluation of the funded program. The field has not, until now, addressed how being asked or required to participate in such evaluations affects these people, who play a critical role in multisite evaluations. This issue does so in two ways. The first six chapters present data and related analyses from research on four multisite evaluations, documenting the patterns of involvement in these evaluation projects and the extent to which different levels of involvement in program evaluations resulted in different patterns of evaluation use and influence. The remaining chapters offer reflections on the results of the cases or their implications, some by people who were part of the original research and some by those who were not. The goal is to encourage readers to think actively about ways to improve multisite evaluation practice. This is the 129th volume of the Jossey-Bass quarterly report series New Directions for Evaluation, an official publication of the American Evaluation Association.
Interactive Evaluation Practice

You've taken your introduction to evaluation course and are about to do your first evaluation project. Where do you begin? Interactive Evaluation Practice: Managing the Interpersonal Dynamics of Program Evaluation helps bridge the gap between the theory of evaluation and its practice, giving students the specific skills they need to use in different evaluation settings. Jean A. King and Laurie Stevahn present readers with three organizing frameworks (derived from social interdependence theory from social psychology, evaluation use research, and the evaluation capacity building literature) for thinking about evaluation practice. These frameworks help readers track the various skills or strategies to use for distinctive evaluation situations. In addition, the authors provide explicit advice about how to solve specific evaluation problems. Numerous examples throughout the text bring interactive practice to life in a variety of settings.
Handbook of Practical Program Evaluation

Author: Kathryn E. Newcomer
Language: en
Publisher: John Wiley & Sons
Release Date: 2015-08-06
The leading program evaluation reference, updated with the latest tools and techniques. The Handbook of Practical Program Evaluation provides tools for managers and evaluators to address questions about the performance of public and nonprofit programs. Neatly integrating authoritative, high-level information with practicality and readability, this guide gives you the tools and processes you need to analyze your program's operations and outcomes more accurately. This new fourth edition has been thoroughly updated and revised, with new coverage of the latest evaluation methods, including:

- Culturally responsive evaluation
- Adopting designs and tools to evaluate multi-service community change programs
- Using role playing to collect data
- Using cognitive interviewing to pre-test surveys
- Coding qualitative data

You'll discover robust analysis methods that produce a more accurate picture of program results, and learn how to trace causality back to the source to see how much of the outcome can be directly attributed to the program. Written by award-winning experts at the top of the field, this book also contains contributions from the leading evaluation authorities among academics and practitioners to provide the most comprehensive, up-to-date reference on the topic. Valid and reliable data constitute the bedrock of accurate analysis, and since funding relies more heavily on program analysis than ever before, you cannot afford to rely on weak or outdated methods. This book gives you expert insight and leading-edge tools that help you paint a more accurate picture of your program's processes and results, including:

- Obtaining valid, reliable, and credible performance data
- Engaging and working with stakeholders to design valuable evaluations and performance monitoring systems
- Assessing program outcomes and tracing desired outcomes to program activities
- Providing robust analyses of both quantitative and qualitative data

Governmental bodies, foundations, individual donors, and other funding bodies are increasingly demanding information on the use of program funds and program results. The Handbook of Practical Program Evaluation shows you how to collect and present valid and reliable data about programs.