In 1997, Rose-Hulman began defining a set of campus-level student learning outcomes. The process began with an extensive review of the literature and of best practices in higher education. The institution then sought input from faculty, staff, students, and other stakeholders (e.g., alumni, industry professionals) to ensure the outcomes aligned with the institutional mission and values. By the end of the 1997-1998 academic year, the Rose-Hulman faculty had approved a set of 10 campus-level student learning outcomes. These outcomes, together with the approaches used to assess student learning, were instrumental in successful ABET accreditation cycles in 2000 and 2006, as well as a Higher Learning Commission visit in 2007.
After the 2006 ABET accreditation cycle, Rose-Hulman conducted a thorough review of its campus-level student learning outcomes. This review again included an extensive examination of the literature and best practices, as well as input from faculty, staff, students, and other stakeholders. The revised campus-level student learning outcomes were approved by the faculty in 2007 and have been in use ever since.
In addition to defining campus-level student learning outcomes in 1997, Rose-Hulman developed a comprehensive campus-level assessment plan that allowed the institution to focus on continuous improvement and to meet the demands of program and institutional accrediting bodies. The cornerstone of this plan was the creation of the Rose-Hulman electronic portfolio system (REPS, now referred to as RosEval). RosEval allows for the direct assessment of authentic student coursework in communication, cultural and global awareness, ethics, and teamwork. RosEval was first piloted in 1998 to evaluate a set of student submissions. Every year since, Rose-Hulman has used RosEval to collect and evaluate evidence of achievement of the campus-level student learning outcomes and to report results to students, faculty, employers, graduate schools, and various program and institutional accrediting bodies.
Faculty and staff can access a variety of assessment resources, including a glossary of assessment terms, a list of student learning outcomes and performance criteria, rubrics, reports, and links to relevant websites. The Office of Institutional Research, Planning and Assessment (IRPA) also facilitates workshops, often in collaboration with the Center for the Practice and Scholarship of Education. As a small institution with a team of assessment experts, however, Rose-Hulman places particular emphasis on the human support provided to faculty and staff in their assessment efforts. Each year, the Director of Assessment consults with faculty and staff about their assessment needs. In most cases, the Director collaborates with them in developing and administering survey instruments, designing and facilitating interviews and focus groups, and analyzing data to better understand student learning. To schedule a meeting, please contact Tony Ribera.
In addition to the RosEval process, Rose-Hulman engages in many other campus-level assessment activities. These include, but are not limited to, administering the Index of Learning Styles, participating in the National Survey of Student Engagement, and facilitating student focus groups.
Results from indirect and direct assessment measures are distributed to faculty and staff. For example, departmental results from the National Survey of Student Engagement (NSSE) were distributed to internal stakeholders via email. Using the Report Builder feature on the NSSE interface, IRPA created departmental reports and shared them with the heads of departments that had a sufficiently large sample size. These reports included an overview of the instrument, institutional comparisons, departmental mean and frequency comparisons, and resources for departments. IRPA staff adapted the NSSE 2015 Departmental Worksheet from the NSSE User’s Guide to better assist departments in their efforts.
In 2015, Rose-Hulman faculty responded to the Faculty Survey of Assessment Culture, a survey administered by Sam Houston State University. More than 75% of Rose-Hulman respondents agreed or strongly agreed that “assessment data are used to identify the extent to which student learning outcomes are met.” This is evident at the program level, where programs evaluate RosEval results and revise their curricula during annual retreats. Department heads then work with IRPA and CASO to address any gaps.
Rose-Hulman has used data to improve its campus-level assessment efforts; the assessment of teamwork is a good example. In 1998, the institution began by requiring teams to submit a report from a group project and using that artifact to assess student learning. Concluding that the report reflected the product of the team’s work rather than the students’ process of working together, faculty members next required students to submit minutes from one of their group meetings. These minutes were meant to show the specific process the team used to decide among several design alternatives. Because gaps still remained, faculty members then began requiring students to record and submit videos of group meetings. In 2013, an evaluation rubric grounded in the literature was developed and tested by CASO, allowing raters to observe students engaging in teamwork and assess their performance authentically.