
Written for students, researchers, consultants, professionals, and scholars, Logic Modeling Methods in Program Evaluation provides a step-by-step explanation of logic modeling and its importance in connecting theory with implementation and outcomes in program evaluation in the social sciences. The book's comprehensive approach:
· Shows how the logic model can be used to clarify assumptions, reach group consensus, and bridge between disciplines;
· Identifies the specific components of logic modeling and discusses inputs, activities, outputs, short- and long-term outcomes, contextual factors, and optional components and variations;
· Focuses on the critical role of consensus in the development of a logic model, including factors such as teaming, timing, and collaborative learning;
· Explores the most common mistakes in the development and implementation of logic models, and the use of logic models in projects of differing complexity and duration;
· Discusses how the logic model can be used to identify the range of outcomes to be examined and how summative questions can be prioritized;
· Includes a variety of illustrative case studies of simple, multi-year, and complex projects and demonstrates the use of the logic model to establish project goals and evaluate project outcomes.
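The components listed above (inputs, activities, outputs, outcomes, contextual factors) can be pictured as a simple chain. A minimal data-structure sketch, where all field names and sample entries are illustrative assumptions rather than anything taken from the book:

```python
from dataclasses import dataclass, field

# A hypothetical sketch of the logic-model components described above:
# inputs -> activities -> outputs -> short-/long-term outcomes,
# with contextual factors alongside.
@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)               # resources invested
    activities: list = field(default_factory=list)           # what the program does
    outputs: list = field(default_factory=list)              # direct products
    short_term_outcomes: list = field(default_factory=list)  # near-term changes
    long_term_outcomes: list = field(default_factory=list)   # lasting impact
    contextual_factors: list = field(default_factory=list)   # surrounding conditions

    def summary(self) -> str:
        # Render the causal chain with a count of items at each stage.
        return " -> ".join([
            f"inputs({len(self.inputs)})",
            f"activities({len(self.activities)})",
            f"outputs({len(self.outputs)})",
            f"outcomes({len(self.short_term_outcomes) + len(self.long_term_outcomes)})",
        ])

model = LogicModel(
    inputs=["funding", "staff"],
    activities=["tutoring sessions"],
    outputs=["120 sessions delivered"],
    short_term_outcomes=["improved test scores"],
    long_term_outcomes=["higher graduation rates"],
)
print(model.summary())  # inputs(2) -> activities(1) -> outputs(1) -> outcomes(2)
```

Representing the model this way makes the book's point concrete: each stage must plausibly lead to the next, and gaps in the chain surface as empty lists.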
The Logic Model Guidebook offers clear, step-by-step support for creating logic models and the modeling process in a range of contexts. Lisa Wyatt Knowlton and Cynthia C. Phillips describe the structures, processes, and language of logic models as a robust tool to improve the design, development, and implementation of program and organization change efforts. The text is enhanced by numerous visual learning guides (sample models, checklists, exercises, worksheets) and many new case examples. The authors provide students, practitioners, and beginning researchers with practical support to develop and improve models that reflect knowledge, practice, and beliefs. The Guidebook offers a range of new applied examples. The text includes logic models for evaluation, discusses archetypes, and explores display and meaning. In an important contribution to programs and organizations, it emphasizes quality by raising issues like plausibility, feasibility, and strategic choices in model creation.
Evaluation Methodology Basics introduces evaluation by focusing on the main kinds of 'big picture' questions that evaluations usually need to answer, and how the nature of such questions is linked to evaluation methodology choices. The author shows how to identify the right criteria for your evaluation; discusses how to objectively determine which criteria are more important than others; and delves into how to combine a mix of qualitative and quantitative data with 'relevant values' (such as needs) to draw explicitly evaluative conclusions.
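The idea of ranking criteria by importance and then combining scores into an overall evaluative conclusion can be sketched numerically. The criteria, weights, and scores below are invented for illustration; the book's actual methods for deriving weights are more nuanced:

```python
# Hypothetical criteria with importance weights (summing to 1.0)
# and performance scores on a 0-10 scale.
criteria = {
    "effectiveness": {"weight": 0.5, "score": 8},
    "cost":          {"weight": 0.3, "score": 6},
    "reach":         {"weight": 0.2, "score": 9},
}

# Weighted sum: one simple way to combine per-criterion scores
# once their relative importance has been settled.
overall = sum(c["weight"] * c["score"] for c in criteria.values())
print(f"overall merit: {overall:.1f} / 10")  # 0.5*8 + 0.3*6 + 0.2*9 = 7.6
```

The arithmetic is trivial; the evaluative work the book emphasizes is in justifying the weights and in deciding what counts as a score of 6 versus 8 in the first place.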
This engaging text takes an evenhanded approach to major theoretical paradigms in evaluation and builds a bridge from them to evaluation practice. Featuring helpful checklists, procedural steps, provocative questions that invite readers to explore their own theoretical assumptions, and practical exercises, the book provides concrete guidance for conducting large- and small-scale evaluations. Numerous sample studies, many with reflective commentary from the evaluators, reveal the process through which an evaluator incorporates a paradigm into an actual research project. The book shows how theory informs methodological choices (the specifics of planning, implementing, and using evaluations). It offers balanced coverage of quantitative, qualitative, and mixed methods approaches. Useful pedagogical features include:
*Examples of large- and small-scale evaluations from multiple disciplines.
*Beginning-of-chapter reflection questions that set the stage for the material covered.
*"Extending your thinking" questions and practical activities that help readers apply particular theoretical paradigms in their own evaluation projects.
*Relevant Web links, including pathways to more details about sampling, data collection, and analysis.
*Boxes offering a closer look at key evaluation concepts and additional studies.
*Checklists for readers to determine if they have followed recommended practice.
*A companion website with resources for further learning.
Evaluation Failures: 22 Tales of Mistakes Made and Lessons Learned is a candid collection of stories from seasoned evaluators across a variety of sectors, who share professional mistakes they have made and what they learned moving forward. In this book, the only one of its kind, editor Kylie Hutchinson has collected a series of engaging, real-life examples that are both entertaining and informative. Each story offers universal lessons as takeaways, along with discussion questions for reflective practice. The book is a perfect companion for anyone working in the evaluation field and for instructors of program evaluation courses who want to bring the real world into their classroom.
Demystifying the evaluation journey, this is the first evaluation mentoring book that addresses the choices, roles, and challenges that evaluators must navigate in the real world. Experienced evaluator and trainer Donna R. Podems covers both conceptual and technical aspects of practice in a friendly, conversational style. She focuses not just on how to do evaluations but on how to think like an evaluator, fostering reflective, ethical, and culturally sensitive practice. Extensive case examples illustrate the process of conceptualizing and implementing an evaluation: clarifying interventions, identifying beneficiaries, gathering data, discussing results, valuing, and developing recommendations. The differences (and connections) between research, evaluation, and monitoring are explored. Handy icons identify instructive features including self-study exercises, group activities, clarifying questions, facilitation and negotiation techniques, insider tips, advice, and resources. Purchasers can access a companion website to download and print reproducible materials for some of the activities and games described in the book.
Between good intentions and great results lies a program theory: not just a list of tasks but a vision of what needs to happen, and how. Now widely used in government and not-for-profit organizations, program theory provides a coherent picture of how change occurs and how to improve performance. Purposeful Program Theory shows how to develop, represent, and use program theory thoughtfully and strategically to suit your particular situation, drawing on the fifty-year history of program theory and the authors' experiences over more than twenty-five years.
"From needs assessment to intervention design, from implementation to outcomes evaluation, from policy formulation to policy execution and evaluation, program theory is paramount. But until now no book has examined these multiple uses of program theory in a comprehensive, understandable, and integrated way. This promises to be a breakthrough book, valuable to practitioners, program designers, evaluators, policy analysts, funders, and scholars who care about understanding why an intervention works or doesn't work." Michael Quinn Patton, author, Utilization-Focused Evaluation
"Finally, the definitive guide to evaluation using program theory! Far from the narrow 'one true way' approaches to program theory, this book provides numerous practical options for applying program theory to fulfill different purposes and constraints, and guides the reader through the sound critical thinking required to select from among the options. The tour de force of the history and use of program theory is a truly global view, with examples from around the world and across the full range of content domains. A must-have for any serious evaluator." E. Jane Davidson, PhD, Real Evaluation Ltd.
Companion Web site: josseybass.com/go/funnellrogers
Program Evaluation and Performance Measurement: An Introduction to Practice, Second Edition offers an accessible, practical introduction to program evaluation and performance measurement for public and non-profit organizations, and has been extensively updated since the first edition. Using examples, it covers topics in a detailed fashion, making it a useful guide for students as well as practitioners who are participating in program evaluations or constructing and implementing performance measurement systems. Authors James C. McDavid, Irene Huse, and Laura R. L. Hawthorn guide readers through conducting quantitative and qualitative program evaluations, needs assessments, cost-benefit and cost-effectiveness analyses, as well as constructing, implementing and using performance measurement systems. The importance of professional judgment is highlighted throughout the book as an intrinsic feature of evaluation practice.
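Cost-effectiveness analysis of the kind covered above comes down to comparing cost per unit of outcome across alternatives. A toy calculation, with all program names and figures invented for illustration:

```python
# Two hypothetical programs compared on cost per outcome achieved.
programs = {
    "Program A": {"cost": 120_000, "outcomes": 300},  # e.g. participants served
    "Program B": {"cost": 90_000,  "outcomes": 200},
}

for name, p in programs.items():
    ratio = p["cost"] / p["outcomes"]  # dollars per unit of outcome
    print(f"{name}: ${ratio:.0f} per outcome")
# Program A: $400 per outcome
# Program B: $450 per outcome  (A is more cost-effective on this measure)
```

As the book stresses, professional judgment still matters: the ratio is only as meaningful as the outcome measure behind it, and a cheaper program is not better if its outcomes are of lower quality.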
This book begins with the context of an agency-based evaluation and describes the method within that context. Students will gain a more complete understanding of these contextual challenges and will learn techniques for operating in the face of them.