Higher Education Evaluations
Teacher Education | Business Education | Medical Education
The African Center for the Advancement of Research Excellence (AfriCARE)
Evaluation Dates: September 2017 – August 2019
The African Center for the Advancement of Research Excellence (AfriCARE) is a two-year planning grant to develop a center for high-quality, high-impact medical research at the University of Zimbabwe College of Health Sciences. The grant, from the National Institutes of Health and the National Cancer Institute, was awarded to the University of Colorado. A member of The Evaluation Center team serves on the Executive Committee and co-chairs the Assessment and Monitoring Program with local evaluators in Africa. The evaluators' work is to:
- Conduct a needs assessment to determine local priorities for medical research in the areas of cancer and cardiovascular diseases;
- Review international research needs to identify potential future research content and methods;
- Develop metrics to monitor the planning process and to measure research impact;
- Provide formative feedback; and
- Disseminate results.
Building Research Achievement in Neuroscience (BRAiN)
Evaluation Dates: September 2010 – August 2020
The Building Research Achievement in Neuroscience (BRAiN) program offers junior and senior undergraduate students the opportunity to perform cutting-edge neuroscience research in state-of-the-art facilities at the Anschutz Medical Campus in Aurora, Colorado, and at their home campuses: the University of Colorado Denver (UCD) downtown campus and New Mexico State University (NMSU) in Las Cruces, New Mexico. The BRAiN program aims to create a network of diverse undergraduate students prepared to pursue postgraduate education in neuroscience.
Evaluation of the BRAiN project provides program directors with formative and summative data to foster on-going program improvements and to measure program impact. The following questions guide all evaluation activities:
- To what extent does participation in BRAiN activities prepare students to successfully apply and be admitted to neuroscience graduate programs?
- To what extent does participation in BRAiN activities impact students’ knowledge, skills, and attitudes?
- To what extent are program components implemented with quality and fidelity?
- What lessons are being learned about effective ways to develop a pipeline of students from underrepresented populations into careers in the neuroscience research field?
- In what ways are communications and collaborations strengthened among the affiliate institutions?
The evaluation utilizes a mixed methods approach, including data collection and analysis of program artifacts, student surveys and interviews, mentor surveys and interviews, program observations, and stakeholder interviews. BRAiN is funded through the National Institutes of Health (NIH).
Center for International Business Education and Research (CIBER)
Evaluation Dates: 2007 – Ongoing
The Center for International Business Education and Research seeks to increase the competency of students and faculty in international business at UCD and throughout the western region.
The evaluation of CIBER assesses the impact on the professional practice of faculty participants, the effect on curriculum, and participants' satisfaction with programs. Evaluation methods include expert review of curriculum, surveys, grant report review, interviews, and observations.
CIBER is a grant program funded through the U.S. Department of Education.
Colorado Clinical & Translational Sciences Institute (CCTSI)
Evaluation Dates: May 2008 – May 2013, October 2013 – May 2018 (Current Round)
The primary goal of the CCTSI is to accelerate the process by which new discoveries are made and translated (i.e., applied) to enhance clinical and community-based practice. This goal is to be achieved by accomplishing the following directives that the National Institutes of Health (NIH) has set forth for all Clinical Translational Science Award (CTSA) sites:
- Eliminate “red tape” (e.g., streamline the human capital necessary to meet regulatory requirements)
- Enhance access to the resources necessary to do cutting-edge clinical and translational research (cutting-edge technologies, novel methodological developments, biostatistical and informatics support)
- Enhance access to funding, specifically seed money for innovative, high-risk projects and support for junior investigators in establishing a track record of successfully conducted and fruitful research
- Promote novel collaborations that bring together teams with expertise that spans multiple disciplines and the translational spectrum
- Provide training opportunities and career development support to prepare the next generation of researchers with enhanced skills, knowledge and expertise to conduct clinical and translational research
Achieving these directives involves sweeping transformations of the scientific and educational enterprise at the Anschutz Medical Campus and the Institute's affiliated institutions and organizations.
The primary goals of the evaluation of the CCTSI are:
- Conduct formative, process, and summative evaluation for the CCTSI
- Enhance current evaluative efforts of CCTSI core functions/components to build evaluative capacity
- Participate in and contribute to national CTSA consortia
Evaluation methods utilized in the evaluation of the CCTSI include: progress monitoring (engaging stakeholders in logic modeling, the development of proximal and distal outcomes anchored by a long-term vision), case studies and longitudinal assessment of cohorts (e.g., of pilot project and career development awardees), social network analysis and bibliometric analysis, and workflow and cycle-time analyses. The CCTSI evaluation spans multiple disciplines, involves rigorous methodologies and mixed-methods approaches, and relies heavily on the engagement of stakeholders at multiple levels (from local program managers, to the CCTSI executive leadership, to NIH representatives). Our team has developed a number of innovative tools and instruments to respond to the demands of assessing organizational change across these levels.
The CCTSI is a national grant funded through the National Institutes of Health (NIH).
Eastern Colorado Health Care System Geriatric Research Education and Clinical Center (GRECC)
Evaluation Dates: November 2015 – Ongoing
The mission of the Eastern Colorado Health Care System (ECHCS) Geriatric Research Education and Clinical Center (GRECC) is to improve the health of older Veterans through the development of innovative research, clinical and educational initiatives. The Evaluation Center supports all three areas: medical and allied health professional training, clinical demonstration projects and the translation of research in two unique focus areas: 1) Consequences and Treatments of Obesity in Older Adults; and 2) Gender Differences in Health (focusing on women’s health). The evaluation of ongoing didactic offerings emphasizes the collection and timely reporting of formative feedback and assessment of outcomes for learners who comprise diverse interprofessional care teams. Clinical demonstration project evaluations support quality and process improvement and examine impact from provider, patient and caregiver perspectives. Finally, evaluation of the research component focuses on the translation (dissemination and implementation) of research products in clinical and community-based practice, research impact/influence (assessed using bibliometrics) and career development of GRECC-supported investigators (e.g., achievement of grant milestones and leadership positions/roles).
Environmental Stewardship of Indigenous Lands (ESIL)
Evaluation Dates: January 2018 – December 2019
The University of Colorado Denver (CU-Denver) has been awarded a National Science Foundation grant to develop and implement a certificate program to prepare tribal liaisons to improve the environmental stewardship of Indigenous lands (ESIL). This pilot phase will include the development of a network of partners using the collective impact initiative principles.
The Evaluation Center is assessing how the partner network is developing and functioning by gathering feedback through key informant interviews, observations, and artifact review. We use a rubric tailored specifically to the ESIL project to assess the functioning of the network, as well as storytelling to facilitate qualitative data collection.
The evaluation examines the extent to which:
- Partners have a common agenda;
- Backbone infrastructure facilitates the planned work; and
- Mutually reinforcing activities are implemented.
Further, we are assessing the ways in which partners' use of shared measurements and continuous communication supports progress. Performance indicators will be established to better understand ESIL participants' motivation, satisfaction with certificate program components, and anticipated career trajectories, as well as what we can learn from existing data related to the ESIL certificate program.
We use participatory methods with program leaders at all phases of the evaluation to ensure all voices are heard and that the evaluation aligns with best practices in cultural responsiveness during data collection and analysis.
I-Corps@CCTSI
Evaluation Dates: Fall 2016 – Ongoing
The National Science Foundation launched I-Corps in 2011 to provide entrepreneurship training for NSF-funded scientists and engineers, pairing them with business mentors for an intensive 7-week, team-immersive program focused on discovering a customer-driven path from their lab to the marketplace. Through a collaborative effort between the National Institutes of Health (NIH) and NSF, the program has been adapted to support the commercialization of academic biomedical technologies, with the Clinical and Translational Science Award sites serving as the dissemination network.
After CCTSI was selected as one of nine CTSA hubs for the NIH pilot, the I-Corps@CCTSI program launched in Fall 2016. The Evaluation Center was engaged to create an evaluation plan based on a comprehensive logic model; develop, pilot, and refine evaluation instruments and approaches; assess the effectiveness of training components; and longitudinally follow teams' progress in moving their products through the commercialization process. The diversity of teams and innovations poses a unique challenge for the evaluation, catalyzing novel approaches such as the use of documentary film to capture the evolution of innovations, teams, and the scientists shaped by their I-Corps experience.
CU Denver AMC Interprofessional Education
Evaluation Dates: September 2014 – Ongoing
Students from all health professions programs participate in the Interprofessional Education (IPE) program at the University of Colorado Anschutz Medical Campus. This longitudinal program includes a two-semester team-based learning course, a simulated experience in the Center for Advancing Professional Excellence, and a clinical experience in an interprofessional setting.
The Evaluation Center provides consultation to the IPE evaluation committee, which involves representatives from all health professions schools and IPE leaders. Activities have included logic model development, evaluation design, and administrative and technical support.
NxtGEN Teacher Quality Partnership
Evaluation Dates: 2014 – Ongoing
The University of Colorado Denver, Denver Public Schools, and 24 high-need rural school districts have entered into a partnership to prepare the next generation of teachers. This project, funded by the Department of Education, allows the partners to recruit diverse local students, provide them with customized pathways and support during their teacher preparation, and continue support for new teachers so they can serve in the highest-need urban and rural school settings. The project has a clear focus on improving academic achievement and diversifying the teacher workforce.
These questions guide the evaluation of NxtGEN:
- What is the evidence that the NxtGEN model promotes the recruitment, preparation, and retention of diverse teacher candidates? How do the various program components contribute to the recruitment, preparation and retention of effective teachers?
- What is the evidence the NxtGEN programs effectively promote academic and career persistence? How do retention rates among program graduates compare to those of state and district teachers?
- What is the evidence that programs provide customized services and support for licensure, induction, and ongoing professional development in rural districts?
- What is the effectiveness of online learning in preparing teacher candidates in rural communities?
- Are participants more effective in raising student test scores than other Colorado public school teachers with the same level of experience but trained through other routes?
- What is the evidence that the induction model is responsive to individual teachers’ professional growth and social emotional needs? For first and second year teachers, is this model associated with increased effectiveness and persistence?
- What is the evidence that the partnership with the Learning Assistants program enhances the recruitment, preparation, and retention of STEM teacher candidates? What is the evidence that LAs thrive academically in the program?
- How does the NxtGEN model increase the capacity of faculty and partner educators to implement national and state-level standards-based reforms?
Evaluation methods include: needs assessments, interviews/focus groups, surveys, GIS mapping, rubrics to assess the quality of partnerships, and the review of records including recruitment, graduation, teacher evaluations and assessments, teacher retention, and student achievement data.
Promoting Excellence in Research and Faculty Enhanced Career Training (PERFECT)
Evaluation Dates: September 2015 – August 2020
Promoting Excellence in Research and Faculty Enhanced Career Training (PERFECT) is a grant program awarded to the University of Zimbabwe College of Health Sciences by the National Institutes of Health. Over five years, the program intends to build the medical research capacity of 30 junior faculty trainees in seven target scientific areas that represent health concerns in Zimbabwe.
For the PERFECT grant, members of The Evaluation Center serve as consultants to the local evaluators in Zimbabwe, building upon the evaluation capacity development conducted under previous grants (see NECTAR description on this website). Evaluation mentorship includes review of evaluation designs, data collection and analyses methods, and reporting.
University of Colorado College of Nursing
Evaluation Dates: August 2007 – June 2017
College of Nursing: Interprofessional Community-Academic Resource Engagement (i-CARE)
Evaluation Dates: January 2014 – June 2016
Novel Education Clinical Trainees and Researchers (NECTAR) and Cerebrovascular, Heart Failure, Rheumatic Heart Disease Intervention Strategies (CHRIS)
Evaluation Dates: September 2010 – August 2015
Evaluation of an NIH-funded grant focused on improving medical education at the University of Zimbabwe
Boettcher Teachers Program
Evaluation Dates: 2005 – 2012
Evaluation of teacher preparation program
Initiative for Maximizing Student Diversity (IMSD)
Evaluation Dates: 2008 – 2012
Evaluation of a medical education pipeline program
National Resource Center for Health and Safety in Child Care and Early Education
Evaluation Dates: 2010 – 2012
Evaluation and consultation services for Healthy Weight Initiative
Culturally Responsive Librarians
Evaluation Dates: 2008 – 2009
Course evaluation (University of Colorado Denver School of Education and Human Development)
Wyoming Leadership Academy
Evaluation Dates: 2006 – 2007
Evaluation of principal professional development (University of Wyoming)