Presentations & Publications
The following is a list of selected presentations and publications by members of The Evaluation Center. If you have any questions or would like more information, please feel free to contact us.
Perspective of the Healthcare Landscape for People Experiencing Homelessness in Denver (Public Health in the Rockies 2018)
Tracey O’Brien, Christine Velez, Jennifer Esala
As part of an evaluation of the impact of permanent supportive housing on individuals experiencing homelessness, The Evaluation Center explored the health landscape as it exists for this population. We conducted seventeen in-depth, semi-structured interviews with individuals who work in health-relevant fields that serve homeless populations. The interviews addressed health conditions, systems, and outcomes for homeless populations, with particular emphasis on common healthcare needs, access to care, barriers to care, continuity of care, and incarceration. These interview data reflect the perspectives of administrators and service providers. While these perspectives are rich and essential to understanding health-relevant services for homeless individuals, they do not represent an objective, systematic view of the service landscape for homeless populations in Denver. The findings instead offer nuanced and informed perspectives on the health of homeless populations and the strengths and limitations of the current healthcare system.
The Evaluation Center shared these findings at the 2018 Public Health in the Rockies conference in Copper Mountain, Colorado, during a one-hour presentation on August 29, 2018.
Participatory Interpretation of Qualitative Data Using Data Placemats (AEA 2016)
Susan Connors, Bridget Nuechterlein, Katrina Marzetta, Liz Sweitzer
In this demonstration, we shared how we use data placemats to visualize qualitative data, often a more challenging task than visualizing quantitative results. As an example, we described how we prepared data placemats from interviews conducted with teacher candidates/new teachers as part of the evaluation of a Teacher Quality Partnership program and then engaged program leaders in participatory interpretation of the results. We demonstrated how to structure discussion sessions and offered innovative methods for involving participants in generating ways to use evaluation results for program improvement. We shared how we subsequently prepared a concise summary report using stakeholders’ interpretations. We also shared lessons learned and feedback from participants concerning this collaborative approach.
Connors, S., Nuechterlein, B., Sweitzer, L., Marzetta, K., & Walters, B. (2016, October). Participatory interpretation of qualitative data using data placemats. Demonstration presented at the annual meeting of the American Evaluation Association, Atlanta, GA.
Building a “Super” Logic Model: Developing a System of Tiered Logic Models to Identify Key Outcomes in a Large Nonprofit Organization
Susan Connors, Joyce Schlose, Amelia Challender
Goodwill Denver is a nonprofit organization providing a multitude of community services to youth and disabled/disadvantaged citizens. The organization had a history of collecting a wide array of accountability data. To increase the organization’s ability to identify those key outcomes most central to their mission, evaluators worked collaboratively with staff members to develop a series of tiered logic models to describe the inputs and outcomes of each distinct program and organizational unit. Finally, a “super” logic model was synthesized to describe essential outcomes across all services. Evaluators will share the benefits and challenges of using this process for conducting a comprehensive program evaluation. A representative from Goodwill Denver will share the value of the resulting logic models for organizational learning.
Connors, S., Schlose, J., & Challender, A. (2011, November). Building a “super” logic model: Developing a system of tiered logic models to identify key outcomes in a large nonprofit organization. Paper presented at the annual meeting of the American Evaluation Association, Anaheim, CA.
Assessing Vital Signs: Applying Two Participatory Evaluation Frameworks to the Evaluation of a College of Nursing
Evaluation research has been in progress to clarify the concept of participatory evaluation and to assess its impact. Recently, two theoretical frameworks have been offered — Daigneault and Jacob’s participatory evaluation measurement index and Champagne and Smits’ model of practical participatory evaluation. In this case report, we apply these frameworks to test alignment with practitioner experience and to examine the degree to which they contribute to the understanding of the case. The context of the case report is an ongoing program evaluation at a college of nursing believed to be an example of a successful participatory evaluation. Application of the participatory evaluation measurement index indicated that the evaluation historically qualified as participatory at a minimal level and increased to a moderate level of participation after a redesign to involve an external evaluator. Ratings aligned with the evaluators’ intended goals. The model of practical participatory evaluation was found to be a good fit and descriptive of the case, although the planning and design processes may need to be added to the model. The exercise of applying the index and the P-PE model enlightened both evaluators and stakeholders concerning factors that contributed to the successful partnership and outcomes of the evaluation.
Connors, S. C. & Magilvy, J. K. (2011). Assessing vital signs: Applying two participatory evaluation frameworks to the evaluation of a college of nursing. Evaluation and Program Planning, 34, 79-86.
Applying Social Network Analysis to Examine Integration across Core Functions and Affiliate Organizations of the Colorado Clinical and Translational Sciences Institute (CCTSI)
R. Marc Brodersen, Jeffrey Proctor, Kathryn Nearing
A social network survey was administered to CCTSI core function personnel in August 2009 and again in September 2010. The survey examined the degree of integration across core functions and affiliate institutions as evidenced by individuals reporting being members of, and interacting with, multiple core functions. The survey also explored the extent to which personnel understand the role and functions of the various cores. This poster will present results from the survey and illustrate how integration and personnel’s understanding of the core functions have changed over time. Insights that emerged during debriefing discussions with CCTSI leadership will also be presented.
Brodersen, R. M., Proctor, J., & Nearing, K. (2010, December). Applying social network analysis to examine integration across core functions and affiliate organizations of the Colorado Clinical and Translational Sciences Institute (CCTSI). Poster presented at the national evaluation face-to-face meeting of the Clinical and Translational Science Awards, Rockville, MD.
The Colorado Clinical and Translational Sciences Institute (CCTSI) Evaluation and Tracking Core’s Progress Tracking System
Jeffrey Proctor & Kathryn Nearing
CCTSI’s progress tracking system includes electronic reporting forms, summary reports, stakeholder briefs and performance dashboards. Standardized forms help ensure that progress with respect to aims and indicators is documented and reported on an ongoing, regular basis. These and other system components not only help keep aims and indicators in the forefront, but also support the evaluators’ efforts to provide responsive formative feedback. Recent refinements to the system permit users to query archived data regarding the attainment of critical benchmarks for specific grant periods or timeframes designed “on the fly” to support progress monitoring and meet internal and external reporting requirements.
Proctor, J., & Nearing, K. (2010, December). The Colorado Clinical and Translational Sciences Institute (CCTSI) Evaluation and Tracking Core’s progress tracking system. Poster presented at the national evaluation face-to-face meeting of the Clinical and Translational Science Awards, Rockville, MD.
Making Meaning: Participatory Social Network Analysis
Susan Connors, R. Marc Brodersen, Jeff Proctor
Sociograms developed through social network analyses graphically present evidence of interrelationships among individuals, programs, or disciplines. As part of a program evaluation, evaluators conducted social network analysis on archival data and prepared sociograms to depict the connectedness of biomedical investigators. The method was selected to investigate the interdisciplinary nature of the research teams before and after reorganization under the Clinical Translational Sciences Institute. To increase the relevance of such data and the likelihood that results will be used for program improvement, evaluators employed participatory evaluation techniques by involving key stakeholders in the analysis of sociograms. Evaluators interviewed program administrators concerning the resulting sociograms to gain their “insider knowledge” and to make meaning of the levels of interdisciplinary collaboration. Benefits and cautions of this mixed method approach are discussed.
Connors, S. C., Brodersen, R. M., & Proctor, J. (2010, November). Making meaning: Participatory social network analysis. Paper presented at the annual meeting of the American Evaluation Association, San Antonio, TX.
The Benefits and Challenges of Participatory Tracking Systems for Monitoring Institutional Change
R. Marc Brodersen, Kathryn Nearing, Susan Connors, Bonnie Walters
This paper discusses best practices for setting up and using program monitoring systems to track the progress of organizational change initiatives in a way that also promotes participatory evaluation practices. Effective and efficient use of these systems can help evaluators and other stakeholders systematically track progress toward a large number of specific organizational goals, while maintaining the flexibility to respond to changing situations and emerging issues. Evaluation professionals are often called upon to assist organizations as they implement complex structural and systemic changes. Assisting with the monitoring of these organizational changes can be difficult and time consuming. However, when done properly, it can promote deeper thought about program goals, theories of change, and achievable outcomes. Working collaboratively with clients to establish and continually refine organizational benchmarks and measurable outcomes (indicators) not only fosters accuracy in the monitoring system, but also promotes stakeholder buy-in and collaboration.
Brodersen, R. M., Nearing, K., Connors, S., & Walters, B. (2010, November). The benefits and challenges of participatory tracking systems for monitoring institutional change. Paper presented at the annual meeting of the American Evaluation Association, San Antonio, TX.
Assessing vital signs: Participatory evaluation of a college of nursing’s programs and improvement process
Recently, two evaluation research groups have offered frameworks to clarify the concept of participatory evaluation. In this article, these frameworks are applied to an active program evaluation at the University of Colorado Denver’s College of Nursing. Application of the metric developed by Daigneault and Jacob (2009) indicated that the project rated highly enough to be considered participatory and that the degree of stakeholder participation increased with an external evaluation team. Smits and Champagne’s (2008) model of practical participatory evaluation aligned well with the project, although the planning and dissemination components of the model may need to be strengthened, based on this case. Other contextual variables presenting challenges and contributing to the effectiveness of this evaluation are discussed.
Connors, S. C., & Magilvy, J. K. (2009, November). Assessing vital signs: Participatory evaluation of a college of nursing’s programs and improvement process. Paper presented at the annual meeting of the American Evaluation Association, Orlando, FL.
From an evaluator’s perspective: Comparison of methods of assessing fidelity in education
Determining fidelity of implementation is an essential component of an evaluation study that hopes to ‘connect the dots’ between an innovative educational practice and intended outcomes, such as improvement in student achievement, or to compare the effectiveness of an innovation to traditional practices. Panelists will provide insight into current methods of assessing the degree to which innovative instructional strategies are consistently and accurately implemented in classroom practice according to the original design. The merits and challenges of each of the methods will be examined, alternative methods drawn from the literature will be compared, and an interactive discussion concerning the experiences of participants will be facilitated. The purpose of the panel is to advance the evolution and refinement of methods of assessing fidelity of implementation in the field of education.
Connors, S. (2008, November). From an evaluator’s perspective: Comparison of methods of assessing fidelity in education. Panel presented at the annual meeting of the American Evaluation Association, Denver, CO.
Cognitive apprenticeship for novice evaluators: Application of theory to practice
This paper describes the ongoing efforts of one university-based evaluation center to apply the model of cognitive apprenticeship (Collins, Brown, & Newman, 1989) to support the development of expertise in novice evaluators. No prior applications of cognitive apprenticeship to the field of evaluation were uncovered in a search of the literature. A logic model is presented from a participant observer perspective that aligns cognitive apprenticeship with current practices in The Evaluation Center at the University of Colorado Denver. Feedback from the team of evaluators who use this approach provides preliminary support for the effectiveness of this model in promoting the understanding of the nature of expert practice in professional evaluation.
Connors, S. (2007, November). Cognitive apprenticeship for novice evaluators: Application of theory to practice. Paper presented at the annual meeting of the American Evaluation Association, Baltimore, MD.