CUR has partnered with Heather Haeger of California State University-Monterey Bay, who serves as CUR’s inaugural Assessment Research Coordinator. In coordination with CUR’s Assessment Task Force and Executive Board, she has been developing a comprehensive set of assessment resources and tools. Use the links below to learn more about assessing undergraduate research on your campus and contributing to CUR’s assessment of the status of undergraduate research.
Assessment Strategies and Ideas
The first step in assessment is defining what (and who) you are counting as participants in undergraduate research, scholarship, and creative inquiry on your campus and setting a consistent practice for tracking participation.
- SPUR: Adapting to Change: Studying Undergraduate Research in the Current Education Environment
- CUR Quarterly: Challenge of the Count
- CUR Quarterly: Assessment of Undergraduate Research
In deciding how to track participation, think about what you need to know and what you might assess. In addition to participation itself, consider tracking the following (a minimal tracking sketch follows this list):
- Who does research?
- How much and/or for how long?
- With what intensity or rigor?
- Is there funding for undergraduate researchers and their mentors, and, if so, from whom?
- What is the product or outcome of the undergraduate research?
- Where do undergraduate students conduct their research (in-class, out-of-class, or a combination thereof)?
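If it helps to make a consistent tracking practice concrete, here is a minimal sketch of one way to record these items. The field names and categories are illustrative assumptions, not a CUR standard; adapt them to your campus’s definitions.

```python
# Illustrative schema for tracking undergraduate research participation.
# Field names and categories are hypothetical examples.
from dataclasses import dataclass
from typing import Optional

@dataclass
class URParticipationRecord:
    student_id: str                # campus identifier (omit if you will not link to other data)
    term: str                      # e.g., "Fall 2024"
    program: str                   # e.g., "Summer research program", "Course-based (CURE)"
    hours_per_week: float          # how much
    duration_weeks: int            # for how long
    intensity: str                 # e.g., "assisting on a faculty project", "independent project"
    funded: bool                   # is there funding?
    funding_source: Optional[str]  # e.g., "federal grant", "institutional", or None
    product: Optional[str]         # e.g., "poster", "thesis", "publication"
    setting: str                   # "in-class", "out-of-class", or "both"

# Example record
record = URParticipationRecord(
    student_id="A0012345",
    term="Fall 2024",
    program="Course-based (CURE)",
    hours_per_week=3.0,
    duration_weeks=15,
    intensity="assisting on a faculty project",
    funded=False,
    funding_source=None,
    product="poster",
    setting="in-class",
)
print(record)
```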
See more examples of how other institutions have tracked and assessed undergraduate research in Scholarship and Practice of Undergraduate Research (SPUR) and the following literature reviews:
- Assessing the Impact of Undergraduate Research Experiences on Students: An Overview of Current Literature
- Assessing Undergraduate Research Experiences: An Annotated Bibliography
- Pick a program and an outcome you can assess now by using data you have or can easily access.
- Plan ahead for data collection for the future; for example, how can you track students now so you can measure outcomes for alumni?
- What resources do you have in terms of personnel, financial investment, and existing data?
- What is your time frame for collecting data and completing the analysis?
- What relationships and collaborations can you build through this assessment project, such as with faculty who are interested in publishing on the scholarship of teaching and learning, or with members of the Institutional Review Board or the Institutional Assessment and Research offices?
- What are the purpose and audience of the assessment, and how will you communicate your findings (e.g., a report to campus leadership or a publishable research project that produces generalizable findings)?
- See a review of questions that have been examined in the literature on undergraduate research in CUR Quarterly: Assessing the Impact of Undergraduate Research Experiences on Students: An Overview of Current Literature, as well as the report from the National Academies of Sciences, Engineering, and Medicine on the need for more rigorous research on the impact of undergraduate research, particularly greater attention to questions about equity in undergraduate research participation.
- Steps to create a research question:
- Outline the learning outcomes, skills, or benefits you hope your students are getting out of their research experience.
- Which of these can you measure with data you already have? Create a question you can answer now based on those data.
- Which outcomes can you find ways to measure in the future? Create longer-term questions for those.
- Get ideas for what to assess and how to do it from the CUR Community in the CUR Member Forum, see past discussions on assessment instruments and strategies, or ask your own question.
Surveys
- Look in the literature in your discipline for measures of content-specific knowledge or concept inventories
- Review literature in psychology, education, and social sciences for measures of learning and development. Some examples include the following:
- Measures related to student and faculty perspectives on skill development and learning gains, such as EvaluateUR
- Measures related to STEM learning and identity through the STEM Learning and Research Center
- Measures of motivations and aspirations in Self-Determination Theory
- Multiple measures of self-efficacy and critical thinking
- Only ask for what you’ll actually use to keep surveys short.
- Ask about undergraduate researcher behavior or content knowledge when possible.
- Ask about self-reported learning when appropriate (see Pike, G. 2011. “Using College Students’ Self-Reported Learning Outcomes in Scholarly Research.” New Directions for Institutional Research 150: 41–58). Self-reported data are most useful for the following (a brief analysis sketch follows this list):
- Comparisons between treatments (e.g., students who did undergraduate research compared to those who did not).
- Relationships between items (e.g., the relationship between reporting feeling supported by a faculty mentor and self-reported learning).
- Comparisons against another measure to increase validity (e.g., is self-reported learning related to increases in GPA, retention, graduation rates, or faculty evaluations of learning?).
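As a rough sketch of these analyses, assuming de-identified responses have been exported to a CSV with hypothetical columns did_ur, self_reported_learning, mentor_support, and gpa_change, the comparisons above might look like this in Python:

```python
import pandas as pd
from scipy import stats

# Assumed export of de-identified survey responses; column names are hypothetical.
df = pd.read_csv("survey_responses.csv")

# Comparison between treatments: UR participants vs. non-participants.
ur = df.loc[df["did_ur"] == 1, "self_reported_learning"].dropna()
non_ur = df.loc[df["did_ur"] == 0, "self_reported_learning"].dropna()
t, p = stats.ttest_ind(ur, non_ur, equal_var=False)  # Welch's t-test
print(f"UR vs. non-UR self-reported learning: t = {t:.2f}, p = {p:.3f}")

# Relationship between items: mentor support and self-reported learning.
pair = df[["mentor_support", "self_reported_learning"]].dropna()
r, p = stats.pearsonr(pair["mentor_support"], pair["self_reported_learning"])
print(f"Mentor support vs. self-reported learning: r = {r:.2f}, p = {p:.3f}")

# Validity check against another measure: self-reported learning and GPA change.
pair = df[["self_reported_learning", "gpa_change"]].dropna()
r, p = stats.pearsonr(pair["self_reported_learning"], pair["gpa_change"])
print(f"Self-reported learning vs. GPA change: r = {r:.2f}, p = {p:.3f}")
```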
Qualitative Data
- AAC&U VALUE Rubrics
- Research Skill Development (RSD) framework: a rubric for evaluating student research skills
Quantitative and Institutional Data
- If you don’t need personal identifiers, don’t collect them. If you will not match a survey, interview, or focus group to other data sources in the future, do not collect student names or identifiers, so there is no risk of identification (a de-identification sketch follows this list).
- Utilize previously established educational practices, including undergraduate research or course-based research. If your intervention is something new that has never been tested before, you will need IRB oversight. If you are testing the implementation of something that has been previously tested, such as active learning, a hands-on project, or a project-based learning initiative, your research is more likely to be exempt.
- Utilize data from traditional educational activities or student products (e.g., research papers or course exams).
- If your research is a case study or is primarily to inform quality improvement activities for your program or university, it is more likely to be exempt.
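If a working file does contain identifiers, one cautious approach (a sketch only, and no substitute for guidance from your IRB or data steward) is to drop or irreversibly hash them before analysis. The file and column names below are hypothetical.

```python
import hashlib
import pandas as pd

# Column names are hypothetical; adapt to your data.
df = pd.read_csv("raw_responses.csv")

# Drop direct identifiers you will never need for linking.
df = df.drop(columns=["student_name", "email"], errors="ignore")

# If records must be linked across sources, replace the campus ID with a salted
# hash and store the salt separately under restricted access.
SALT = "store-this-value-separately-and-securely"
df["student_key"] = df["student_id"].astype(str).map(
    lambda s: hashlib.sha256((SALT + s).encode()).hexdigest()
)
df = df.drop(columns=["student_id"])

df.to_csv("deidentified_responses.csv", index=False)
```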
Assessment Toolkit
CUR’s Assessment Task Force is creating an Assessment Toolkit for CUR members to use. The following are available resources:
CUR Position Statement (2019)
Undergraduate Research: A Road Map for Meeting Future National Needs and Competing in a World of Change
Joanne D. Altman, Tsu-Ming Chiang, Christian S. Hamann, Huda Makhluf, Virginia Peterson, and Sara E. Orel
Open-Source Assessment Bibliography (2019)
Prior research assessing undergraduate research can be found in the following Zotero library: Assessment.
Use the tags in Zotero, located on the left side of the screen, to sort articles by topic. Tags starting with “*CUR_” (such as *CUR_Student Outcomes) were created by CUR.
Survey Instruments Used to Assess Undergraduate Research
Critical-Thinking Assessment Test (CAT): Assesses a broad range of skills that faculty across the country feel are important components of critical thinking and real-world problem solving. Questions are derived from real-world situations, most requiring short-answer essay responses. Designed to engage faculty in the assessment and improvement of students’ critical thinking.
CUREnet Assessment: A collection of resources for assessment of course-based research experiences.
EvaluateUR (Evaluating Undergraduate Research): Administered to both students and faculty mentors before, during, and after the research experience. Intended for the arts and humanities as well as STEM. All surveys are available on the website.
Higher Education Research Institute (HERI): Hosts multiple surveys for students and faculty. The 2016 CIRP College Senior Survey added new sets of questions asking students about their relationship with science, likelihood of pursuing a science-related career, and confidence in their science and research-related skills. See research using HERI surveys.
National Survey of Student Engagement (NSSE) and Faculty Survey of Student Engagement (FSSE): Surveys administered online through Indiana University’s Center for Postsecondary Research. Universities pay to have the surveys administered to their students.
Project Ownership Survey (POS): A survey measuring differences in scientific inquiry experiences (Hanauer and Dolan). It contains 18 scales drawn from 30 items, including measures of the emotions associated with the experience, and helps determine students’ feelings of ownership of a project.
Student Experience in the Research University (SERU): The mission of the SERU Project is to help improve the undergraduate experience and educational processes by generating new, longitudinal information on the undergraduate experience at research universities, via an innovative survey, to be used by administrators, policymakers, and scholars. Institutions must be part of the SERU Consortium. Find research using SERU.
Undergraduate Research Student Self-Assessment (URSSA): Developed by Hunter, Weston, Thiry, and Laursen at the University of Colorado Boulder. Described in the spring 2009 CUR Quarterly issue on assessment. The questions are listed on the website, but the actual survey is administered through the Student Assessment of Learning Gains (SALG) website. See studies using URSSA.
Webinars
Assessment Strategies for Course-Based Undergraduate Research Experiences (June 13, 2018)
Surveying the Impact: Using Survey Methods to Assess Undergraduate Research (December 13, 2017)