Presentations & Publications

The following is a list of selected presentations and publications by members of The Evaluation Center. If you have any questions or would like more information, please feel free to contact us.

Participatory Interpretation of Qualitative Data Using Data Placemats (AEA 2016)

Susan Connors, Bridget Nuechterlein, Katrina Marzetta, Liz Sweitzer

In this demonstration, we shared how we use data placemats to visualize qualitative data, often a more challenging task than visualizing quantitative results. As an example, we described how we prepared data placemats from interviews conducted with teacher candidates/new teachers as part of the evaluation of a Teacher Quality Partnership program and then engaged program leaders in participatory interpretation of the results. We demonstrated how to structure discussion sessions and offered innovative methods to involve participants in generating ways to use evaluation results for program improvement. We also shared how we subsequently prepared a concise summary report using stakeholders’ interpretations, along with lessons learned and feedback from participants concerning this collaborative approach.
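For readers who want to experiment with the format, below is a minimal sketch of laying out themes and illustrative quotes on a one-page placemat with matplotlib. The themes and quotes are invented placeholders, not data from the Teacher Quality Partnership evaluation.

```python
# Minimal sketch: laying out qualitative themes and illustrative quotes
# on a one-page "data placemat" with matplotlib. The themes and quotes
# below are invented placeholders, not data from the TQP evaluation.
import matplotlib.pyplot as plt

themes = {
    "Mentoring support": ['"My mentor observed my class weekly."',
                          '"Feedback was timely and specific."'],
    "Classroom readiness": ['"I felt prepared to manage my classroom."'],
    "Program logistics": ['"Scheduling the seminars was difficult."'],
    "Career plans": ['"I intend to stay in teaching long term."'],
}

fig, axes = plt.subplots(2, 2, figsize=(11, 8.5))  # landscape letter page
fig.suptitle("Teacher Candidate Interviews: Emerging Themes", fontsize=14)

for ax, (theme, quotes) in zip(axes.flat, themes.items()):
    ax.set_title(theme, fontsize=12, fontweight="bold")
    ax.axis("off")  # placemat panels hold text, not plots
    for i, quote in enumerate(quotes):
        ax.text(0.02, 0.85 - 0.25 * i, quote, fontsize=9, wrap=True,
                transform=ax.transAxes, va="top")

fig.tight_layout()
fig.savefig("placemat.pdf")  # print and distribute at the interpretation session
```

In a real session, each panel would hold verbatim quotes for one coded theme, and stakeholders would annotate the printed placemats as they discuss interpretations.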

AEA PowerPoint Presentation

Sample Placemat

Blank Placemat

Relevant Literature:

Daigneault and Jacob article

Pankaj and Emery article

Pietiläinen (2012)

Citation:
Connors, S., Nuechterlein, B., Sweitzer, L., Marzetta, K., & Walters, B. (2016, October). Participatory interpretation of qualitative data using data placemats. Demonstration presented at the annual meeting of the American Evaluation Association, Atlanta, GA.


Building a “Super” Logic Model: Developing a System of Tiered Logic Models to Identify Key Outcomes in a Large Nonprofit Organization

Susan Connors, Joyce Schlose, Amelia Challender

Goodwill Denver is a nonprofit organization providing a multitude of community services to youth and disabled/disadvantaged citizens. The organization had a history of collecting a wide array of accountability data. To increase the organization’s ability to identify those key outcomes most central to its mission, evaluators worked collaboratively with staff members to develop a series of tiered logic models to describe the inputs and outcomes of each distinct program and organizational unit. Finally, a “super” logic model was synthesized to describe essential outcomes across all services. Evaluators will share the benefits and challenges of using this process for conducting a comprehensive program evaluation. A representative from Goodwill Denver will share the value of the resulting logic models for organizational learning.
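As a rough sketch of the underlying structure, tiered logic models can be represented as nested data so that outcomes shared across programs surface as candidates for the “super” model. The program names, inputs, and outcomes below are invented, not Goodwill Denver’s actual models.

```python
# Minimal sketch: representing tiered logic models as nested data and
# rolling program-level outcomes up into a "super" logic model. Program
# names and outcomes are illustrative, not Goodwill Denver's actual models.
from collections import Counter

program_models = {
    "Youth Services": {
        "inputs": ["staff", "school partnerships"],
        "outcomes": ["job readiness", "high school completion"],
    },
    "Career Development": {
        "inputs": ["staff", "employer network"],
        "outcomes": ["job readiness", "job placement"],
    },
    "Retail Operations": {
        "inputs": ["donations", "staff"],
        "outcomes": ["job placement", "revenue for mission"],
    },
}

# Outcomes shared across programs are candidates for the "super" model.
counts = Counter(o for m in program_models.values() for o in m["outcomes"])
super_model = [o for o, n in counts.items() if n > 1]
print("Cross-program outcomes:", super_model)  # ['job readiness', 'job placement']
```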

Citation:
Connors, S., Schlose, J., & Challender, A. (2011, November). Building a “super” logic model: Developing a system of tiered logic models to identify key outcomes in a large nonprofit organization. Paper presented at the annual meeting of the American Evaluation Association, Anaheim, CA.


Evaluation of the Colorado Clinical and Translational Sciences Institute’s (CCTSI) Leadership in Innovative Team Science (LITeS) Program: A Mixed-Methods Approach

R. Marc Brodersen
This paper details the mixed-methods approach used to evaluate the CCTSI’s Leadership in Innovative Team Science (LITeS) program. The program was designed to provide leadership, teamwork, and mentoring training to principal investigators and program directors of federally funded T32 and K12 training programs, as well as relevant deans of the university. The training included eight full-day workshops scheduled in two-day blocks that spanned the academic calendar from September to May. The evaluation consisted of surveys administered to all participants at the end of each training block, as well as a one-year follow-up. These surveys were designed to assess knowledge gained and utilized in reference to domains relevant to NIH’s Roadmap to Translational Research. Structured interviews were then conducted with participating deans to determine the effectiveness of the program in developing skills in these domains, and the value of the program in training leaders in medical research at the university.

Citation:
Brodersen, R. M., & Libby, A. (2011, November). Evaluation of the Colorado Clinical and Translational Sciences Institute’s (CCTSI) Leadership in Innovative Team Science (LITeS) program: A mixed-methods approach. Paper presented at the annual meeting of the American Evaluation Association, Anaheim, CA.


Assessing Vital Signs: Applying Two Participatory Evaluation Frameworks to the Evaluation of a College of Nursing

Susan Connors
Evaluation research has been in progress to clarify the concept of participatory evaluation and to assess its impact. Recently, two theoretical frameworks have been offered: Daigneault and Jacob’s participatory evaluation measurement index and Smits and Champagne’s model of practical participatory evaluation. In this case report, we apply these frameworks to test alignment with practitioner experience and to examine the degree to which they contribute to the understanding of the case. The context of the case report is an ongoing program evaluation at a college of nursing believed to be an example of a successful participatory evaluation. Application of the participatory evaluation measurement index indicated the evaluation qualified as participatory at a minimal level historically and increased to a moderate level of participation after a redesign to involve an external evaluator. Ratings aligned with the intentional goals of evaluators. The model of practical participatory evaluation was found to be a good fit and descriptive of the case, although the planning and design processes may need to be added to the model. The exercise of applying the index and the P-PE model enlightened both evaluators and stakeholders concerning factors that contributed to the successful partnership and outcomes of the evaluation.
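To make the idea of a participation measurement index concrete, the sketch below scores an evaluation on three dimensions drawn from Daigneault and Jacob’s conceptualization (extent of involvement, diversity of participants, and control of the evaluation process). The 0–1 ratings, level cutoffs, and minimum-score aggregation are illustrative assumptions only, not the published instrument.

```python
# Purely hypothetical sketch of scoring an evaluation on a
# multi-dimension participation index. The three dimension names follow
# Daigneault and Jacob's conceptualization; the 0-1 ratings, cutoffs,
# and minimum-score aggregation are illustrative assumptions only.
def participation_level(extent, diversity, control):
    """Map 0-1 dimension ratings to an ordinal participation level."""
    score = min(extent, diversity, control)  # assumed non-compensatory rule
    if score >= 0.75:
        return "high"
    if score >= 0.5:
        return "moderate"
    if score > 0:
        return "minimal"
    return "non-participatory"

# Hypothetical ratings echoing the case narrative: minimal before the
# redesign, moderate after the external evaluator was involved.
print(participation_level(extent=0.25, diversity=0.5, control=0.25))  # minimal
print(participation_level(extent=0.75, diversity=0.5, control=0.5))   # moderate
```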

Citation:
Connors, S. C., & Magilvy, J. K. (2011). Assessing vital signs: Applying two participatory evaluation frameworks to the evaluation of a college of nursing. Evaluation and Program Planning, 34, 79–86.


Applying Social Network Analysis to Examine Integration across Core Functions and Affiliate Organizations of the Colorado Clinical and Translational Sciences Institute (CCTSI)

R. Marc Brodersen, Jeffrey Proctor, Kathryn Nearing
A social network survey was administered to CCTSI core function personnel in August 2009, and again in September 2010. The survey examined the degree of integration across core functions and affiliate institutions, as evidenced by individuals reporting being members of, and interacting with, multiple core functions. The survey also explored the extent to which personnel understand the roles and functions of the various cores. This poster will present results from the survey and illustrate how integration and core-function personnel understanding have changed over time. Insights that emerged during processing discussions with CCTSI leadership will also be presented.
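As a rough illustration of one way such integration could be quantified, the sketch below computes the share of reported interaction ties that span two different core functions in each survey wave, using networkx. The people, core assignments, and ties are placeholders, not CCTSI survey data.

```python
# Minimal sketch: quantifying integration across core functions as the
# share of interaction ties that cross core boundaries, compared between
# two survey waves. Names, cores, and ties are placeholders, not CCTSI data.
import networkx as nx

def cross_core_share(edges, core_of):
    """Fraction of ties whose endpoints sit in different core functions."""
    g = nx.Graph(edges)
    crossing = sum(1 for u, v in g.edges if core_of[u] != core_of[v])
    return crossing / g.number_of_edges()

core_of = {"A": "ETCD", "B": "ETCD", "C": "Pilot", "D": "Informatics"}

wave_2009 = [("A", "B"), ("A", "C")]
wave_2010 = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]

print(f"2009 cross-core share: {cross_core_share(wave_2009, core_of):.2f}")  # 0.50
print(f"2010 cross-core share: {cross_core_share(wave_2010, core_of):.2f}")  # 0.75
```

A rising cross-core share over the two waves would be one signal, among others, that integration across core functions and affiliate institutions is increasing.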

Citation:
Brodersen, R. M., Proctor, J., & Nearing, K. (2010, December). Applying social network analysis to examine integration across core functions and affiliate organizations of the Colorado Clinical and Translational Sciences Institute (CCTSI). Poster presented at the national evaluation face-to-face meeting of the Clinical and Translational Sciences Awards, Rockville, MD.


Case Study: An Approach to Examining Team Science in the Age of Translation

Kathryn Nearing
The Colorado Pilot and Collaborative Translational and Clinical Studies (Co-Pilot) component of the Colorado Clinical and Translational Sciences Institute (CCTSI) funds “team science” awards through its pilot grant program. The CCTSI evaluation team reviewed literature to identify key characteristics of “team science” and interviewed awardees as they began their projects and again at the end of the funding period. This poster will summarize the current state of the literature and present results from case studies – specifically, findings that have emerged regarding the ways that cross-disciplinary team-based research may enhance opportunities for translation.

Citation:
Nearing, K. (2010, December). Case study: An approach to examining team science in the age of translation. Poster presented at the national evaluation face-to-face meeting of the Clinical and Translational Sciences Awards, Rockville, MD.


Research Informed by a Life-Course Framework: The Challenges and Opportunities

Kathryn Nearing
The Child and Maternal Health (CMH) component of the Colorado Clinical and Translational Sciences Institute (CCTSI) aims to promote life-course research through its pilot grant program. The CCTSI evaluation team reviewed literature to develop an operational definition of life-course research. Evaluators distilled five dimensions of a life-course framework; this operationalization informed the development of criteria that guided the selection of two illustrative case studies from among the first cohort of CMH pilot projects. This poster will feature the life-course framework, case study methodology, and key insights emerging from case studies.

Citation:
Nearing, K. (2010, December). Research informed by a life-course framework: The challenges and opportunities. Poster presented at the national evaluation face-to-face meeting of the Clinical and Translational Sciences Awards, Rockville, MD.


The Colorado Clinical and Translational Sciences Institute (CCTSI) Evaluation and Tracking Core’s Progress Tracking System

Jeffrey Proctor & Kathryn Nearing
CCTSI’s progress tracking system includes electronic reporting forms, summary reports, stakeholder briefs and performance dashboards. Standardized forms help ensure that progress with respect to aims and indicators is documented and reported on an ongoing, regular basis. These and other system components not only help keep aims and indicators in the forefront, but also support the evaluators’ efforts to provide responsive formative feedback. Recent refinements to the system permit users to query archived data regarding the attainment of critical benchmarks for specific grant periods or timeframes designed “on the fly” to support progress monitoring and meet internal and external reporting requirements.
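To make the “on the fly” querying concrete, here is a minimal pandas sketch of filtering archived progress records to a user-defined timeframe and summarizing benchmark attainment. The column names, records, and schema are assumptions for illustration, not the actual CCTSI system.

```python
# Minimal sketch: querying archived progress records for benchmark
# attainment within an arbitrary timeframe. Column names and records
# are assumptions, not the CCTSI system's actual schema.
import pandas as pd

records = pd.DataFrame({
    "benchmark": ["pilot awards made", "trainees enrolled", "pilot awards made"],
    "date": pd.to_datetime(["2009-11-15", "2010-02-01", "2010-06-30"]),
    "attained": [True, False, True],
})

def attainment(df, start, end):
    """Share of each benchmark attained within a user-defined timeframe."""
    window = df[df["date"].between(start, end)]
    return window.groupby("benchmark")["attained"].mean()

# Query a grant period "on the fly" for a stakeholder brief or dashboard.
print(attainment(records, "2010-01-01", "2010-12-31"))
```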

Citation:
Proctor, J., & Nearing, K. (2010, December). The Colorado Clinical and Translational Sciences Institute (CCTSI) Evaluation and Tracking Core’s progress tracking system. Poster presented at the national evaluation face-to-face meeting of the Clinical and Translational Sciences Awards, Rockville, MD.


Making meaning: Participatory social network analysis

Susan Connors, R. Marc Brodersen, Jeff Proctor
Sociograms developed through social network analyses graphically present evidence of interrelationships among individuals, programs, or disciplines. As part of a program evaluation, evaluators conducted social network analysis on archival data and prepared sociograms to depict the connectedness of biomedical investigators. The method was selected to investigate the interdisciplinary nature of the research teams before and after reorganization under the Clinical Translational Sciences Institute. To increase the relevance of such data and the likelihood that results will be used for program improvement, evaluators employed participatory evaluation techniques by involving key stakeholders in the analysis of sociograms. Evaluators interviewed program administrators concerning the resulting sociograms to gain their “insider knowledge” and to make meaning of the levels of interdisciplinary collaboration. Benefits and cautions of this mixed-methods approach are discussed.
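As a rough illustration of the mechanics, the sketch below prepares before-and-after sociograms with networkx for use in a participatory interpretation session. The investigators, ties, and layout choices are invented placeholders, not the archival CCTSI data.

```python
# Minimal sketch: drawing before/after sociograms of investigator
# connectedness for participatory interpretation. The investigators and
# ties are invented, not the archival CCTSI data.
import matplotlib.pyplot as plt
import networkx as nx

before = nx.Graph([("P1", "P2"), ("P3", "P4")])
after = nx.Graph([("P1", "P2"), ("P3", "P4"), ("P1", "P3"), ("P2", "P4")])

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
for ax, (label, g) in zip(axes, [("Before CTSI", before), ("After CTSI", after)]):
    nx.draw_networkx(g, pos=nx.spring_layout(g, seed=42), ax=ax,
                     node_color="lightblue")
    ax.set_title(label)
    ax.axis("off")
fig.savefig("sociograms.png")  # shared with administrators for interpretation
```

In the participatory step, administrators would review such sociograms and supply the context behind the ties, rather than evaluators interpreting the network structure alone.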

Citation:
Connors, S. C., Brodersen, R. M., & Proctor, J. (2010, November). Making meaning: Participatory social network analysis. Paper presented at the American Evaluation Association National Conference, San Antonio, TX.


The Benefits and Challenges of Participatory Tracking Systems for Monitoring Institutional Change

R. Marc Brodersen, Kathryn Nearing, Susan Connors, Bonnie Walters
This paper discusses best practices for setting up and using program monitoring systems to track the progress of organizational change initiatives in ways that also promote participatory evaluation practice. Effective and efficient use of these systems can help evaluators and other stakeholders systematically track progress toward a large number of specific organizational goals while maintaining the flexibility to respond to changing situations and emerging issues. Evaluation professionals are often called upon to assist organizations as they implement complex structural and systemic changes. Assisting with the monitoring of these organizational changes can be difficult and time consuming; when done properly, however, it can promote deeper thought about program goals, theories of change, and achievable outcomes. Working collaboratively with clients to establish and continually refine organizational benchmarks and measurable outcomes (indicators) not only fosters accuracy in the monitoring system but also promotes stakeholder buy-in and collaboration.

Citation:
Brodersen, R. M., Nearing, K., Connors, S., & Walters, B. (2010, November). The benefits and challenges of participatory tracking systems for monitoring institutional change. Paper presented at the annual meeting of the American Evaluation Association, San Antonio, TX.


Assessing vital signs: Participatory evaluation of a college of nursing’s programs and improvement process

Susan Connors
Recently, two evaluation research groups have offered frameworks to clarify the concept of participatory evaluation. In this article, these frameworks are applied to an active program evaluation at the University of Colorado Denver’s College of Nursing. The application of the metric developed by Daigneault and Jacob (2009) indicated the project had a sufficient rating to be considered participatory, and the degree of stakeholder participation increased with an external evaluation team. Smits and Champagne’s (2008) model of practical participatory evaluation aligned well with the project, although planning and dissemination components may need to be strengthened in the model, based on this case. Other contextual variables presenting challenges and contributing to the effectiveness of this evaluation are discussed.

Citation:
Connors, S. C., & Magilvy, J. K. (2009, November). Assessing vital signs: Participatory evaluation of a college of nursing’s programs and improvement process. Paper presented at the American Evaluation Association National Conference, Orlando, FL.


Evaluation of the Quality and Impact of the Education, Training, and Career Development (ETCD) Core Function’s Educational and Mentoring Programs

R. Marc Brodersen
The CCTSI ETCD core function offers four graduate- and junior-investigator-focused programs that each provide varying degrees of direct instruction and mentoring support. As part of the longitudinal evaluation of these programs, a series of surveys has been developed for both mentors and their associated mentees. These surveys assess mentee satisfaction with the educational programming, as well as the nature, content, and quality of mentoring relationships. Program outcomes are also addressed, including acquisition of clinical and translational core competencies, career satisfaction, and academic persistence. Additionally, mentors are asked about several factors that may influence mentorship effectiveness.

Citation:
Brodersen, R. M. (2009). Evaluation of the quality and impact of the education, training, and career development (ETCD) core function’s educational and mentoring programs. Poster presented at the national evaluation face-to-face meeting of the Clinical and Translational Sciences Awards, Rockville, MD.


From an evaluator’s perspective: Comparison of methods of assessing fidelity in education

Susan Connors
Determining fidelity of implementation is an essential component of an evaluation study that hopes to ‘connect the dots’ between an innovative educational practice and intended outcomes, such as improvement in student achievement, or to compare the effectiveness of an innovation to traditional practices. Panelists will provide insight into current methods of assessing the degree to which innovative instructional strategies are consistently and accurately implemented in classroom practice according to the original design. The merits and challenges of each of the methods will be examined, alternative methods drawn from the literature will be compared, and an interactive discussion concerning the experiences of participants will be facilitated. The purpose of the panel is to advance the evolution and refinement of methods of assessing fidelity of implementation in the field of education.

Citation:
Connors, S. (2008, November). From an evaluator’s perspective: Comparison of methods of assessing fidelity in education. Panel presented at the American Evaluation Association National Conference, Denver, CO.


Cognitive apprenticeship for novice evaluators: Application of theory to practice

Susan Connors
This paper describes the ongoing efforts of one university-based evaluation center to apply the model of cognitive apprenticeship (Collins, Brown, & Newman, 1989) to support the development of expertise in novice evaluators. No prior applications of cognitive apprenticeship to the field of evaluation were uncovered in a search of the literature. A logic model is presented from a participant-observer perspective that aligns cognitive apprenticeship with current practices in The Evaluation Center at the University of Colorado Denver. Feedback from the team of evaluators who use this approach provides preliminary support for the effectiveness of this model in promoting an understanding of the nature of expert practice in professional evaluation.

Citation:
Connors, S. (2007, November). Cognitive apprenticeship for novice evaluators: Application of theory to practice. Paper presented at the American Evaluation Association National Conference, Baltimore, MD.
