Higher Education Evaluations

Teacher Education | Business Education | Medical Education

Active Projects


Building Research Achievement in Neuroscience (BRAiN)

Evaluation Dates: September 2010 – August 2020

The Building Research Achievement in Neuroscience (BRAiN) program provides an exciting opportunity for junior and senior undergraduate students to perform cutting-edge neuroscience research in the state-of-the-art research facilities at the Anschutz Medical Campus in Aurora, Colorado, and at their home campus, either the University of Colorado Denver (UCD) downtown campus or New Mexico State University (NMSU) in Las Cruces, New Mexico. The BRAiN program aims to create a network of diverse undergraduate students prepared to pursue postgraduate education in neuroscience.
Evaluation of the BRAiN project provides program directors with formative and summative data to foster ongoing program improvements and to measure program impact. The following questions guide all evaluation activities:

  1. To what extent does participation in BRAiN activities prepare students to successfully apply to and be admitted to neuroscience graduate programs?
  2. To what extent does participation in BRAiN activities impact students’ knowledge, skills, and attitudes?
  3. To what extent are program components implemented with quality and fidelity?
  4. What lessons are being learned about effective ways to develop a pipeline of students from underrepresented populations into careers in the neuroscience research field?
  5. In what ways are communications and collaborations strengthened among the affiliate institutions?

The evaluation utilizes a mixed-methods approach, including the collection and analysis of program artifacts, student surveys and interviews, mentor surveys and interviews, program observations, and stakeholder interviews. BRAiN is funded through the National Institutes of Health (NIH).



Center for International Business Education and Research (CIBER)

Evaluation Dates: 2007 – Ongoing

The Center for International Business Education and Research seeks to increase the competency of students and faculty in international business at UCD and throughout the western region.

The evaluation of CIBER assesses the program’s impact on the professional practice of faculty participants, its effect on curriculum, and participants’ satisfaction with its programs. Evaluation methods utilized for the CIBER evaluation include expert review of curriculum, surveys, grant report review, interviews, and observations.

CIBER is a grant program funded through the U.S. Department of Education.


University of Colorado College of Nursing

Evaluation Dates: August 2007 – Ongoing

The College of Nursing at the University of Colorado Denver (CU Denver) Anschutz Medical Campus currently educates approximately 800 students in undergraduate, master’s, and two doctoral degree programs. In alignment with the standards of its accrediting organization, the Commission on Collegiate Nursing Education (CCNE), the college’s 70 full-time faculty, part of a total faculty of more than 100, engage in teaching, research, and nursing practice.

The Evaluation Center provides external evaluation services focused each year on providing College of Nursing leaders with current data in the content areas most needed to guide program improvement and decision-making. Activities include analysis and summary of annual surveys of graduates and alumni; data collection and analysis for accreditation and external reviews; and evaluation of revised curricula, clinical and simulation experiences, and other program innovations. Evaluation methods utilized for the College of Nursing evaluation include online and real-time surveys, interviews, focus groups, and analysis of existing data.




Colorado Clinical & Translational Sciences Institute (CCTSI)

Evaluation Dates: May 2008 – May 2013, October 2013 – May 2018 (Current Round)

The primary goal of the CCTSI is to accelerate the process by which new discoveries are made and translated (i.e., applied) to enhance clinical and community-based practice. This goal is to be achieved by accomplishing the following directives that the National Institutes of Health (NIH) has set forth for all Clinical and Translational Science Award (CTSA) sites:

  1. Eliminate “red tape” (e.g., streamline the human capital necessary to meet regulatory requirements)
  2. Enhance access to the resources necessary to do cutting-edge clinical and translational research (cutting-edge technologies, novel methodological developments, biostatistical and informatics support)
  3. Enhance access to funding, specifically seed money for innovative, high-risk projects and support for junior investigators in establishing a track record of successfully conducted and fruitful research
  4. Promote novel collaborations that bring together teams with expertise that spans multiple disciplines and the translational spectrum
  5. Provide training opportunities and career development support to prepare the next generation of researchers with enhanced skills, knowledge, and expertise to conduct clinical and translational research

Achieving these directives involves sweeping transformations of the scientific and educational infrastructure/enterprise at the Anschutz Medical Campus and the Institute’s affiliated institutions/organizations.

The primary goals of the evaluation of the CCTSI are:

  1. Conduct formative, process, and summative evaluation for the CCTSI
  2. Enhance current evaluative efforts of CCTSI core functions/components to build evaluative capacity
  3. Participate in and contribute to national CTSA consortia

Evaluation methods utilized in the evaluation of the CCTSI include progress monitoring (engaging stakeholders in logic modeling, i.e., the development of proximal and distal outcomes anchored by a long-term vision), case studies and longitudinal assessment of cohorts (e.g., of pilot project and career development awardees), social network analysis and bibliometric analysis, and workflow and cycle-time analyses. The CCTSI evaluation spans multiple disciplines, involves rigorous methodologies and mixed-methods approaches, and relies heavily on the engagement of stakeholders at multiple levels (from local program managers, to the CCTSI executive leadership, to NIH representatives). Our team has developed a number of innovative tools and instruments to respond to the demands of evaluating and assessing organizational change across these levels.

The CCTSI is funded through a national grant from the National Institutes of Health (NIH).



CU Denver AMC Interprofessional Education

Evaluation Dates: September 2014 – Ongoing

Students from all health professions programs participate in the Interprofessional Education (IPE) program at the University of Colorado Anschutz Medical Campus. This longitudinal program includes a two-semester team-based learning course, a simulated experience in the Center for Advancing Professional Excellence, and a clinical experience in an interprofessional setting.

The Evaluation Center provides consultation to the IPE evaluation committee, which involves representatives from all health professions schools and IPE leaders. Activities have included logic model development, evaluation design, and administrative and technical support.




NxtGEN Teacher Quality Partnership

Evaluation Dates: 2014 – Ongoing

The University of Colorado Denver, Denver Public Schools, and 24 high-need rural school districts have entered into a partnership to prepare the next generation of teachers. This project, funded by the U.S. Department of Education, allows the partners to recruit diverse local students, provide them with customized pathways and support during their teacher preparation, and continue supporting them as new teachers so they can serve in the highest-need urban and rural school settings. The project has a clear focus on improving academic achievement and diversifying the teacher workforce.

These questions guide the evaluation of NxtGEN:

  • What is the evidence that the NxtGEN model promotes the recruitment, preparation, and retention of diverse teacher candidates? How do the various program components contribute to the recruitment, preparation, and retention of effective teachers?
  • What is the evidence that the NxtGEN programs effectively promote academic and career persistence? How do retention rates among program graduates compare to those of state and district teachers?
  • What is the evidence that programs provide customized services and support for licensure, induction, and ongoing professional development in rural districts?
  • What is the effectiveness of online learning in preparing teacher candidates in rural communities?
  • Are participants more effective in raising student test scores than other Colorado public school teachers with the same level of experience but trained through other routes?
  • What is the evidence that the induction model is responsive to individual teachers’ professional growth and social emotional needs? For first and second year teachers, is this model associated with increased effectiveness and persistence?
  • What is the evidence that the partnership with the Learning Assistants program enhances the recruitment, preparation, and retention of STEM teacher candidates? What is the evidence that LAs thrive academically in the program?
  • How does the NxtGEN model increase the capacity of faculty and partner educators to implement national and state-level standards-based reforms?

Evaluation methods include: needs assessments, interviews/focus groups, surveys, GIS mapping, rubrics to assess the quality of partnerships, and the review of records including recruitment, graduation, teacher evaluations and assessments, teacher retention, and student achievement data.





Eastern Colorado Health Care System Geriatric Research Education and Clinical Center (GRECC)

Evaluation Dates: November 2015 – Ongoing

The mission of the Eastern Colorado Health Care System (ECHCS) Geriatric Research Education and Clinical Center (GRECC) is to improve the health of older Veterans through the development of innovative research, clinical, and educational initiatives. The Evaluation Center supports all three areas: medical and allied health professional training, clinical demonstration projects, and the translation of research in two unique focus areas: 1) Consequences and Treatments of Obesity in Older Adults; and 2) Gender Differences in Health (focusing on women’s health).

The evaluation of ongoing didactic offerings emphasizes the collection and timely reporting of formative feedback and the assessment of outcomes for learners who comprise diverse interprofessional care teams. Clinical demonstration project evaluations support quality and process improvement and examine impact from provider, patient, and caregiver perspectives. Finally, evaluation of the research component focuses on the translation (dissemination and implementation) of research products in clinical and community-based practice, research impact/influence (assessed using bibliometrics), and the career development of GRECC-supported investigators (e.g., achievement of grant milestones and leadership positions/roles).


Past Projects

College of Nursing: Interprofessional Community-Academic Resource Engagement (i-CARE)

Evaluation Dates: January 2014 – June 2016

Novel Education Clinical Trainees and Researchers (NECTAR) and Cerebrovascular, Heart Failure, Rheumatic Heart Disease Intervention Strategies (CHRIS)

Evaluation Dates: September 2010 – August 2015
Evaluation of an NIH-funded grant focused on improving medical education at the University of Zimbabwe

Boettcher Teachers Program

Evaluation Dates: 2005 – 2012
Evaluation of a teacher preparation program

Initiative for Maximizing Student Diversity (IMSD)

Evaluation Dates: 2008 – 2012
Evaluation of a medical education pipeline program

National Resource Center for Health and Safety in Child Care and Early Education

Evaluation Dates: 2010 – 2012
Evaluation and consultation services for the Healthy Weight Initiative

Culturally Responsive Librarians

Evaluation Dates: 2008 – 2009
Course evaluation (University of Colorado Denver School of Education and Human Development)

Wyoming Leadership Academy

Evaluation Dates: 2006 – 2007
Evaluation of principal professional development (University of Wyoming)
