URSSA FAQs
An overview of the instrument and its use
URSSA is the Undergraduate Research Student Self-Assessment, an online survey instrument for programs and departments to use in assessing the student outcomes of undergraduate research (UR). URSSA focuses on what students learn from their UR experience, rather than whether they liked it.
The self-assessment includes both multiple-choice and open-ended items that focus on students’ gains from undergraduate research. These gains include:
- skills such as lab work and communication
- conceptual knowledge and linkages in their field
- deeper understanding of the intellectual and practical work of science
- growth in confidence and adoption of the identity of scientist
- preparation for a career or graduate school in science
- greater clarity about what career or educational path they might wish to pursue.
Other items probe students’ participation in important research-related activities that have been shown to lead to these gains (e.g. giving presentations, having responsibility for a project). These activities, and the gains themselves, are based in research and thus constitute a core set of items. Using these items as a group helps to align a particular program assessment with research-demonstrated outcomes.
In addition, optional items can be included to probe particular features that are included along with research in a UR program (e.g. field trips, career seminars, housing arrangements).
URSSA is a tool for measuring students’ self-reported gains from their research experience. Our research shows that students are very capable of noticing their own growth—and also where they have grown little or not at all. Student self-report is not the only measure of the success of a UR experience, but it is an important component. URSSA measures some outcomes, such as growth in confidence or the decision to become a scientist, that only students can tell us about. We encourage faculty and departments to use URSSA as one part of a more comprehensive evaluation plan that addresses all their program goals and outcomes.
The survey questions (“items”) in URSSA are based on our group’s extensive, interview-based research and evaluation work on undergraduate research. This work includes:
- an eight-year study of undergraduate research at four liberal arts colleges, with over 350 interviews;
- evaluation studies of UR programs at two research universities and one national laboratory, totaling another 350 interviews and survey responses from over 150 students; and
- an extensive literature review synthesizing well-designed, published research and evaluation studies of UR.
This grounding in research means that URSSA measures things we know to be important—it “asks the right questions.”
Once initial survey items were developed based on this body of research, they were tested with students in “think-aloud” interviews to see if students interpreted the wording as we intended; the items were then refined and tested again.
With this refined version of URSSA, we solicited help from faculty and UR program directors to gather a large student data set. This pilot study included over 500 students at 24 colleges and universities, which enabled us to conduct statistical tests of the items’ validity and reliability. Using Confirmatory Factor Analysis, we compared how student responses fit the hypothesized structure of the survey and found that the data met accepted standards for model fit. We also tested survey items to learn whether they functioned as we intended. Based on these results, items that did not meet our criteria for acceptable item functioning were revised or removed.
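For readers who want a concrete sense of what such a model-fit check looks like, the sketch below runs a small confirmatory factor analysis in Python with the semopy package. It is an illustration only, not the URSSA team’s actual analysis; the factor names, item names, and file name are hypothetical placeholders.

```python
# Minimal CFA sketch using the semopy package (hypothetical factor and item
# names, not the actual URSSA analysis). Requires: pip install semopy pandas
import pandas as pd
import semopy

# Hypothetical model: three latent "gains" factors, each measured by several
# survey items scored on a numeric scale.
model_desc = """
ThinkWork =~ item1 + item2 + item3
Skills    =~ item4 + item5 + item6
Identity  =~ item7 + item8 + item9
"""

# Responses would come from your own survey export (one column per item).
data = pd.read_csv("survey_responses.csv")

model = semopy.Model(model_desc)
model.fit(data)

# Fit statistics (CFI, TLI, RMSEA, etc.) indicate how well the responses
# match the hypothesized factor structure.
print(semopy.calc_stats(model))
```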
Optional items about UR program elements were developed in consultation with UR program developers and department UR leaders.
After URSSA had been in the field for some time, data from over 3,400 students were used to carry out a statistical validation study. See:
Weston, T. J., & Laursen, S. L. (2015). The Undergraduate Research Student Self-Assessment (URSSA): Validation for use in program evaluation. CBE-Life Sciences Education, 14(3), ar33. DOI: 10.1187/cbe.14-11-0206
The URSSA team includes people with a broad range of applicable expertise. Anne-Barrie Hunter, Sandra Laursen, and Heather Thiry are education researchers who specialize in qualitative research, especially interviews. With other colleagues, they have studied several undergraduate research programs as researchers and program evaluators since 1999. Laursen is also a chemist who has conducted laboratory research both as an undergraduate and as a research advisor to undergraduates. Timothy Weston is an expert in student assessment, survey development, and quantitative research in education.
The value of undergraduate research (UR) as part of a student’s science education has long been known to the faculty who work with UR students—but only recently have the outcomes of UR been documented in well-designed research and evaluation studies. Research also suggests that there are many different paths by which to develop effective UR programs, and that good assessment data can help programs refine what they offer to optimize the UR experience for students.
Science departments and UR programs at universities and labs need well-designed, inexpensive evaluation tools so that they can assess student outcomes, improve their programs, and inform their stakeholders. Funding agencies also need these tools to measure the impact of their efforts, for example in examining the outcomes of the innovations they fund.
URSSA is
- flexible—URSSA is constructed with a set of core items on student gains from UR. Other items probe students’ participation in important activities that help to secure these gains. Because these core items are based on research, they must be used as a set. Optional items can be selected or added to customize your survey to probe students’ experiences of specific program elements that you have added to the basic research experience, such as speakers or field trips.
- free—There is no charge to use URSSA.
- easy to use—Numeric results are available as raw data, summary statistics, cross-tabs, and graphs. Users can also download all responses in a pre-formatted Excel file. These are standard features of the SALG platform through which URSSA is delivered.
- refined by user input—URSSA has been methodically developed, with input and feedback from UR program directors and departmental leaders. We conducted extensive testing and revision through “think-aloud” interviews with students to make sure they interpreted the questions as we intended.
- piloted with students—URSSA was tested with over 500 UR students at 24 colleges and universities nationwide, in summer 2008.
- validated—URSSA has high content validity because it is based on careful research and evaluation data. In addition, student responses from the large pilot study were used to conduct statistical tests of validity and reliability. Using Confirmatory Factor Analysis, we compared how student responses fit the hypothesized structure of the survey and found that the data met accepted standards for model fit. Items that did not meet these standards were removed. Pilot studies show that URSSA sensitively detects patterns of difference in students’ experiences and in programs. See the validation study cited above (Weston & Laursen, 2015) for more details.
For 15 years, URSSA was delivered through the web platform developed for the Student Assessment of their Learning Gains (SALG), an online instrument for assessing students’ learning gains from college science classes. Using the SALG site’s online tools, UR programs or departments could set up a customized copy of URSSA and generate a link for students to complete the survey. As of July 2024, that platform is no longer available due to concerns about information security.
We suggest that people mount the URSSA questions on a survey tool of their choice, such as Qualtrics. A text version of the questions is available here, with information about how to set up the survey in Qualtrics. Contact Tim Weston, WestonT AT Colorado DOT edu for a QSF file that helps with this task (we cannot mount this file type here for download).
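If you export responses from your own survey tool, a short script can produce basic summaries of the numeric items. The sketch below uses Python and pandas; the file name and column names are hypothetical and will differ depending on how you set up your survey, so treat it as a starting point rather than a prescribed workflow.

```python
# Sketch of summarizing exported survey responses with pandas.
# File and column names ("urssa_export.csv", "gain_labwork", "gain_confidence",
# "program") are hypothetical; adjust them to match your own export.
import pandas as pd

df = pd.read_csv("urssa_export.csv")

gain_items = ["gain_labwork", "gain_confidence"]

# Per-item summary statistics (count, mean, std, quartiles) on the gain scales.
print(df[gain_items].describe())

# Simple cross-tab: distribution of one gain item by program.
print(pd.crosstab(df["program"], df["gain_confidence"]))
```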
To protect the confidentiality of students’ responses and the trust between students and their research advisor, URSSA should not be used with groups of fewer than 10 students. URSSA is designed for evaluation use by departments and programs, not by individual research advisors.
We encourage research advisors to gather feedback from their own students about their research experience. We suggest that you meet with your research students as a group for informal conversation over lunch or a snack. Ask your students what they have gained from doing research—wait to see what they discuss spontaneously, then perhaps explore the broad gains areas from URSSA that interest you most. Other good questions to ask students are what was the “best” thing about their research experience, what could be improved, and what surprised them about doing research.
If you report findings from URSSA in a presentation or publication, please cite the instrument:
URSSA, Undergraduate Research Student Self-Assessment (2009). Ethnography & Evaluation Research, University of Colorado Boulder, Boulder, CO.
If you modify URSSA items for your own purposes, please cite the instrument and explain the changes you have made, as these may alter the validity of the instrument or meaning of the items.