April Newsletter Archives - Office of Institutional Research and Assessment

Does giving class-time improve SET response rates? (Tue, 07 Apr 2020)

Last month we surveyed all faculty who used Blue for online student evaluations of teaching (SET) during the fall 2019 semester about their use of class time for completing online SETs. Because published studies at other institutions suggest a positive relationship between providing in-class time and response rates, we sought to quantify the magnitude of that relationship at UMaine, if one exists. We received responses for 168 unique course sections (thank you to all those who responded!). We limited our analysis to in-person sections with 10 or more students enrolled, to avoid the extreme response rates that are more likely in low-enrollment courses. After applying the course and enrollment filters, we were left with 142 unique course sections. The percentage of these sections where in-class time was given ("yes") or not given ("no") to complete online SETs is indicated in Figure 1.

Figure 1: Proportion of course sections where instructors gave students in-class time to do their online SETs (yes) versus those where they did not (no).
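The filtering step described above can be sketched in a few lines of Python. The records and field names here are illustrative placeholders, not the survey's actual data or schema:

```python
# Each course section is a dict; field names here are hypothetical,
# chosen only to illustrate the filtering rule described in the text.
sections = [
    {"id": "A101", "modality": "in-person", "enrollment": 25, "class_time": "yes", "rate": 0.82},
    {"id": "B202", "modality": "online",    "enrollment": 40, "class_time": "no",  "rate": 0.45},
    {"id": "C303", "modality": "in-person", "enrollment": 8,  "class_time": "yes", "rate": 1.00},
    {"id": "D404", "modality": "in-person", "enrollment": 32, "class_time": "no",  "rate": 0.60},
]

# Keep only in-person sections with 10 or more enrolled students,
# dropping the small sections prone to extreme response rates.
filtered = [s for s in sections
            if s["modality"] == "in-person" and s["enrollment"] >= 10]
print([s["id"] for s in filtered])  # ['A101', 'D404']
```

Note how the 8-student section is excluded even though its 100% response rate looks impressive; with so few students, a single response swings the rate dramatically.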

We compared the section response rates of these two groups. The distributions of response rates within the groups are shown in Figure 2. The histograms suggest that response rates were better when students were given in-class time, albeit with considerable overlap.

Figure 2: Distribution of response rates for course sections where students were given in-class time to do their online SETs (Yes) and those where they were not (No).


The mean response rates for the two groups are provided in Table 1. Sections that gave in-class time had a mean response rate 15.1 percentage points higher than those that did not, and this difference is statistically significant (Welch's t-test, t(83.3) = 4.369, p < 0.001). It should be noted that our survey did not ask about other strategies being used instead of, or in addition to, providing class time, and these results apply only to the subset of faculty who responded to the survey. That said, they do suggest that providing class time can positively affect response rates.

Table 1: Mean response rates for course sections where students were and were not given in-class time to do their online SETs.
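For readers curious about the test itself, Welch's t-test compares two group means without assuming equal variances; the degrees of freedom come from the Welch-Satterthwaite approximation. A minimal self-contained sketch, using made-up response rates rather than the survey data:

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two independent samples with possibly unequal variances."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)   # sample variances (n - 1 denominator)
    se2 = va / na + vb / nb             # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Illustrative response rates in percent (not the actual survey data):
yes_group = [78, 85, 62, 90, 71, 80]   # sections that gave class time
no_group = [55, 60, 48, 72, 50, 65]    # sections that did not
t, df = welch_t(yes_group, no_group)
print(round(t, 2), round(df, 1))  # 3.49 9.9
```

The same computation is available as `scipy.stats.ttest_ind(a, b, equal_var=False)` for those who prefer a library call.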

Our results agree with published literature suggesting that giving in-class time increases response rates, but we acknowledge that additional study is needed to understand what other incentives or strategies are used and how they could affect response rates. Toward that end, we asked the faculty of courses that did not give in-class time but still had response rates at or above 70% to describe their approaches. A summary of the strategies and incentives used by those who responded is provided in Table 2. Strategies such as reminding students, conducting a mid-term feedback survey, and showing students how the feedback is used, in addition to giving in-class time for SETs, might yield even greater improvements in response rates.

Another, somewhat unexpected, result was the number of low response rates among sections where in-class time was given to complete the online evaluations. We do not have data to explain what happened in these cases, but it does imply that simply giving in-class time might not always be enough to produce an adequate number of responses. We recommend that faculty consider the strategies described in Table 2 regardless of how SETs are administered. We provide similar recommendations on the OIRA website.


Table 2: Strategies and incentives used by faculty who had high SET response rates but did not give in-class time for the evaluations. Note: Please check with your department before giving points as an incentive.

Personalizing Questions for Online Student Evaluations of Teaching (Tue, 07 Apr 2020)

It may be hard to believe, but the time for student evaluations of teaching (SET) is fast approaching, and with it comes the opportunity for faculty to add personalized questions to their online questionnaire in Blue. Question personalization (QP) is a straightforward task that lets you add up to 10 questions to the questionnaire. Questions added during QP appear only on the instructor's own report summaries and are for informational purposes only. Approximately four weeks before the end of classes, faculty will receive an email with instructions and a link to Blue for adding questions; two weeks are given to complete this task. We recommend having your questions prepared before you follow the QP link to add them in Blue. The UMaine standard 19 questions are available on the OIRA website; please do not duplicate them, as students will have already answered them earlier in the questionnaire.

Faculty can add up to seven scaled (Likert) questions and three open-ended comment-style questions for each course they teach. To aid the process, particularly if the same questions are being added for every course a faculty member teaches, we recommend writing them in a Word document and then copying/pasting them into the space available for each course. CITL provides recommendations and additional resources for using personalized questions to shift the focus from instructor performance to the learners' experience. These include questions such as:

  • How would the student rate their own ability/confidence?
  • What will the student remember in 5 years?
  • How did the learning activities support learning?

As always, we are here to help, so please reach out to Lisa Henderson (lisa.henderson@maine.edu) or Ryan Weatherbee (ryan.a.weatherbee@maine.edu) if you have any additional questions about QP or other aspects of online SET.

Assessment in the time of COVID-19 (Tue, 07 Apr 2020)

Assessment, in its simplest form, is all about asking questions. When we perform assessments in the classroom, we are asking students questions so they can show us what they have learned. When we do program assessment, we ask bigger questions about our programs: How are students meeting the stated program outcomes? How does our course sequencing and choice of formative assessments affect these outcomes?

During our adjustment to life with COVID-19, a tremendous amount of informal assessment work has been done. Without our business-as-usual routines and ways of doing things, we have all been forced to reconsider what is most important to our courses and programs and how we can adjust to make sure learning outcomes are met. Many of the daily assessments might look different, but the important work of assessment, asking questions about our programs and looking for the answers, is happening all over. Because of this, we can continue to offer our students strong and responsive academic programs.

To continue to assist you during this time, the "assessment side" of the Office of Institutional Research and Assessment is working hard to build out a portfolio of assessment-related workshops. Two workshops on learning outcomes took place in March; their purpose was to give attendees an overview of learning outcomes at the program and course levels and to allow time for them to practice developing their own. A copy of the slides and the handout from these workshops can be found on our website.

A recording of our most recent workshop, on curriculum mapping, can be found on our website. Curriculum mapping is a graphical way to visualize where program outcomes are introduced, reinforced, and mastered throughout a program. Creating a curriculum map is usually the second step, after developing program outcomes, in a program's assessment plan. Once created, the map provides a way to identify gaps and redundancies within a program of study and therefore allows for more purposeful planning to maximize program effectiveness. In this workshop we reviewed the steps needed to create a curriculum map and discussed how to use it to inform curricular review.

We will continue to create and offer assessment workshops. If there are specific program assessment topics that need to be addressed within your unit, we also offer tailored workshops for smaller groups. Please reach out to Mandy Barrington (amanda.barrington@maine.edu) and Ryan Weatherbee (ryan.a.weatherbee@maine.edu) with questions.

Navigate Update (Tue, 07 Apr 2020)

With the switch to remote learning, the Navigate Leadership Team is working to ensure we are doing all that we can to support our students and advisors. Below are usage statistics as of the end of February.

In response to COVID-19, EAB has fast-tracked several enhancements to the Navigate Staff platform to help advisors advise remotely. The umaine.edu/navigate website has a Quick Start Guide and video walkthroughs, which will be updated continually. Updates will also be shared with each college's Navigate Specialist Leader to pass on to advisors.

We are also modifying the student app side of Navigate. The Appointments feature has been turned on in the app, so students are now able to schedule meetings with their advisors who have set availability in the Staff platform. Additionally, we have added a new resource category in the app which links to the COVID-19 FAQ page as well as new links for several offices which have altered their delivery.

The Navigate Graduate Assistant, Lorraine Kouao, has been hard at work promoting the app to students. She has been tabling in the Union through January and February as well as managing our Navigate social media pages. She has also created a graphic to promote the Study Buddies feature in the app, which we hope students will find especially helpful for finding community with their peers in a virtual setting. We are happy to share a PDF of this flyer with any faculty who would like to encourage their students to use Study Buddies. Please email kimberly.stewart@maine.edu if you would like information on the app to share with your students.

Introducing a MAJOR Improvement to our Interactive Enrollment Dashboard… Majors! (Tue, 07 Apr 2020)


Our enrollment dashboard now includes undergraduate and graduate major-level data. Select from the past five fall terms and see enrolled majors by college, department, and individual plans. (Students in multiple majors are counted in each major they are enrolled in. Consequently, the sum of the major counts does not represent an unduplicated student headcount.)
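The double-counting caveat above is worth making concrete: summing per-major counts overstates the student population whenever anyone holds multiple majors. A tiny illustration with invented records:

```python
from collections import Counter

# Illustrative records, one row per (student, major) pair; not real enrollment data.
enrollments = [
    ("s1", "Biology"),
    ("s2", "Biology"),
    ("s2", "Chemistry"),   # s2 is a double major, so s2 appears in two majors
    ("s3", "History"),
]

# Per-major counts include double majors in every major they hold...
major_counts = Counter(major for _, major in enrollments)
total_by_major = sum(major_counts.values())

# ...while the unduplicated headcount counts each student exactly once.
headcount = len({student for student, _ in enrollments})

print(total_by_major, headcount)  # 4 3
```

This is why the sum of the dashboard's major counts (4 here) exceeds the unduplicated headcount (3 here) whenever double majors are present.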

The bar chart shows enrollment for the selected term, sorted in descending order. For ease of viewing, it is helpful to first apply the three filters at the top: term, college, and department. The filters on the left side then provide additional breakdowns by:

  • Enrollment level (full-time/part-time status)
  • Sex
  • Class level
  • Residency (in-state vs. out-of-state)
  • And, for undergraduates, a filter to see majors in the Honors Program!

Follow this link to interact with our updated dashboard: /oira/reporting/interactive-data/#enrollment

We plan to develop a future iteration that permits visualization of multi-year trends for particular majors and subgroups.

Spring Online Student Evaluations of Teaching (Tue, 07 Apr 2020)

Over the past three years, since the Blue portal was introduced for online evaluations, UMaine has seen an increase in the number of academic units choosing to go completely online for their student evaluations of teaching (SETs). By fall 2019, approximately 70% of units had voluntarily adopted exclusively online SETs. With the sudden move to remote instruction, all SETs will be conducted online using the Blue portal for the spring 2020 semester. (This applies to this semester only.) However, given the extraordinary circumstances related to the COVID-19 pandemic, the process will differ. AFUM has issued the following update:

For the Spring 2020 semester, the results of the Student Evaluation shall not be placed in the personnel file or used for any negative action. Neither will signed comments be placed in the personnel file.

For purposes of evaluation, reappointment, promotion, or tenure that includes the Spring 2020 semester, the University shall consider all work evaluated to have been completed in one fewer semester. This provision does not change any contractual deadlines.

We want to assure faculty that we will adhere to these new guidelines while administering SETs this semester. Copies of SET reports will not be shared with academic units, only with the faculty members teaching each course.

For those new to the online SET, we encourage reviewing the resources available on the OIRA website. In particular, we recommend this handout, which summarizes the process. In short, there are three different tasks involved:

  • Question personalization – faculty are given the opportunity to add up to 10 of their own questions to the questionnaire, supplementary to the standard question set.
  • Student access – for the two weeks leading up to the last day of classes, students have access to fill out their evaluations.
  • Response rate monitoring – faculty can monitor response rates in real time for each of their courses; the monitoring period matches the window in which students can complete their evaluations (the last two weeks of classes).

See Table 1 for the dates associated with each of the tasks described above. Communication with faculty and students about each part of the online SET process is done via official UMaine email accounts; messages come from the account of Lisa Henderson.

Table 1: Important dates for tasks in the online SET process. Bold row applies to most courses ending on last official day of classes.

Course End Date | QP Start | QP Reminder 1 | QP End | Student Access Start | Access Reminder 1 | Access Reminder 2 | Student Access End
4/27/2020 | 4/7/2020 | 4/11/2020 | 4/13/2020 | 4/14/2020 | 4/18/2020 | 4/26/2020 | 4/27/2020
4/28/2020 | 4/7/2020 | 4/11/2020 | 4/14/2020 | 4/15/2020 | 4/19/2020 | 4/27/2020 | 4/28/2020
4/30/2020 | 4/7/2020 | 4/11/2020 | 4/16/2020 | 4/17/2020 | 4/21/2020 | 4/29/2020 | 4/30/2020
5/1/2020 | 4/7/2020 | 4/10/2020 | 4/19/2020 | 4/20/2020 | 4/24/2020 | 4/30/2020 | 5/1/2020
5/7/2020 | 4/9/2020 | 4/16/2020 | 4/23/2020 | 4/24/2020 | 4/28/2020 | 5/6/2020 | 5/7/2020
5/8/2020 | 4/10/2020 | 4/17/2020 | 4/24/2020 | 4/25/2020 | 4/29/2020 | 5/7/2020 | 5/8/2020

If you have any questions or concerns about the move to online SETs please let us know.

Ryan Weatherbee (ryan.a.weatherbee@maine.edu)
Lisa Henderson (lisa.henderson@maine.edu)
