
Uses and Limits of Data and Student Feedback in Pedagogical Response to COVID-19: A Case Study

Published online by Cambridge University Press: 21 February 2023

Janet L. Donavan
University of Colorado Boulder, USA

Abstract

This article examines how one political science department used data and student feedback to make choices about course modalities and pedagogical approaches during the COVID-19 pandemic. The case demonstrates that gathering data from students through surveys and other means and then using those data in decision making is a valuable practice. However, there are constraints on collecting quality data in a crisis. With a need to react quickly and only limited ability to gather and analyze data in a timely fashion, data-informed and student-empathetic decision making is a more accurate characterization of the outcomes in this case and a more achievable goal for future crises than data-driven and student-centered decision making. This study concludes that data-informed and student-empathetic decision making may be preferable in circumstances in which the data are inconclusive or support multiple conclusions as well as when there are conflicting needs and preferences among both faculty members and students.

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of the American Political Science Association

Universities have embraced data-driven (Brynjolfsson and McElheran 2016; Hora, Bouwma-Gearhart, and Park 2017) and student-centered (McClellan 2015; Varga-Atkins et al. 2021) decision making to address challenges facing the academy. Political science has encouraged such practices (McClellan 2015). My institution encourages these practices as well; as a faculty member and director of undergraduate studies in my department, I have implemented both decision-making methods. Doing so has not been without difficulties, and some scholarly research challenges data-driven decision making (Dowd 2005; Taylor 2020). Our department has been aware of these critiques, and at times we have approached the use of data in decision making with skepticism. This article describes a case study of our department’s attempt to use data-driven and student-centered approaches to decision making during the COVID-19 pandemic. When there is a need to react quickly and only limited ability to gather and analyze data in a timely fashion, data-informed and student-empathetic decision making is a more accurate characterization of the outcomes in this case and an achievable goal for future crisis decision making. I conclude that data-informed and student-empathetic decision making may be preferable when the data are inconclusive or support multiple conclusions and when there are conflicting needs and preferences among stakeholders.

In using a case-study method, I acknowledge my subjectivity as a participant observer. My multiple roles include teaching-track faculty member with a high teaching load as well as associate chair and director of undergraduate studies; therefore, I am on the front line of receiving student complaints. I also was in a position with leverage to implement ideas while gathering and interpreting much of these data. I approached this process as an advocate of data-driven and student-centered learning (Dowling 2005; Yin 2017).

USING DATA AND A STUDENT-CENTERED PERSPECTIVE TO ADAPT TO CHANGING CONDITIONS

I recognized early in the pandemic that I did not know what was going to work or not work under the circumstances. I had taught online classes before, but that experience was different from transitioning large, in-person classes to new modalities. Together with colleagues in the department, I had 37 undergraduate classes and more than 3,000 students to consider in the Spring 2020 semester.

As social scientists, we turned to data. We quickly learned that our data options were limited and that interpreting and applying the data was difficult. We also learned that being student centered did not mean simply satisfying the majority of students; attending to substantial minorities with different perspectives was a more nuanced approach. We learned that we could not be student centered to the point of ignoring the needs and preferences of faculty members and staff, many of whom were reeling from the fallout of the COVID-19 crisis.


WHAT WERE OUR CONDITIONS?

Faculty members had limited ability to affect the larger decision-making processes. Our campus, like many, abruptly ended in-person learning on March 13, 2020. For many of us, this also was the day that our children ended in-person learning and other family members transitioned to working at home or to COVID-19–restricted in-person work. In Summer 2020, our university created and defined six teaching modalities that would be used in the fall:

  • In-person: courses taught fully in person whenever campus was open.

  • Hybrid in-person/remote: some students attended in person on assigned days and others attended remotely via Zoom.

  • Hybrid in-person/online: students attended class in person on assigned days, with the material repeated two to three times a week across the in-person meetings; students completed asynchronous work online on the other class days.

  • Remote: students attended all class sessions at the scheduled time via Zoom. Some faculty members recorded these classes and others did not.

  • Hybrid remote/online: students attended via Zoom at a scheduled time for part of the class time and completed asynchronous work online otherwise.

  • Online: all coursework was completed asynchronously online.
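These six modalities vary along two dimensions: where synchronous meetings happen (classroom, Zoom, or nowhere) and whether part of the coursework is asynchronous online. As a minimal illustrative sketch (the field names and wording are mine, not the university’s official schema), the taxonomy can be encoded as follows:

```python
# A schematic encoding of the six Fall 2020 teaching modalities described
# above. Field names and wording are illustrative, not the university's
# official schema.
from dataclasses import dataclass

@dataclass(frozen=True)
class Modality:
    name: str
    synchronous: str    # where synchronous meetings happen, if anywhere
    async_online: bool  # whether part of the coursework is asynchronous online

MODALITIES = [
    Modality("in-person", "classroom, whenever campus is open", False),
    Modality("hybrid in-person/remote", "classroom on assigned days; Zoom otherwise", False),
    Modality("hybrid in-person/online", "classroom on assigned days", True),
    Modality("remote", "Zoom for all class sessions", False),
    Modality("hybrid remote/online", "Zoom for part of the class time", True),
    Modality("online", "none", True),
]

for m in MODALITIES:
    print(f"{m.name}: synchronous = {m.synchronous}; asynchronous online work = {m.async_online}")
```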

Departments were asked to maximize in-person learning, but social-distancing restrictions in classrooms limited our ability to hold in-person classes. For example, the largest classroom that our department uses has a normal capacity of 425 students but a COVID-19 capacity of 44; another classroom has a normal capacity of 49 but a COVID-19 capacity of 11. Mask wearing made certain pedagogical choices difficult. In addition, faculty members were asked to prioritize having at least some proportion of the classes that did not meet in person meet synchronously as remote classes. These modalities remained the same from Fall 2020 to Fall 2021, although the social-distancing requirement for in-person classes was eliminated in Fall 2021. For Spring 2022, most classes were taught in person at normal capacities. Like many institutions, we had mask mandates, testing requirements, and vaccine and booster requirements. Across the range of responses in which some universities and colleges returned fully in person and others fully remote/online, with every combination in between, our conditions were in the middle in terms of restrictiveness (Walke, Honein, and Redfield 2020). Our conditions also were in keeping with findings suggesting that, among top-tercile institutions, those more reliant on tuition and fees were more likely to return to hybrid or in-person modes in Fall 2020 regardless of political climate; for other institutions, political factors were more important drivers (Blanco et al. 2022).

WHAT DATA DID WE HAVE?

In Spring 2020, we added a qualitative question to our Senior Survey, asking graduating seniors the following: “We have asked you to complete your evaluation of the program considering what your experience was before the move in March 2020 to remote-only learning. Now we want to know more about your experience with the shift to remote learning. Please share any feedback you think we should know about this transition.” Of 174 graduating seniors, 50 completed the survey and 24 answered this question. Before the Fall 2020 semester, our department chair hosted a town hall meeting for faculty members and graduate students to ask questions, share experiences, and offer ideas about the upcoming semester. In Fall 2020, we surveyed political science majors using our department’s Canvas site. This survey went out in October during a temporary campus closure; while the survey was in the field, a return to in-person learning was announced. We received 112 responses. I produced a report on the survey; the department then held a town hall via Zoom for our majors to present the results and take students’ questions. We included a similar qualitative COVID-19 question on the Senior Survey for Spring 2021 and Spring 2022 (Donavan 2023).

RESPONSE AND PARTICIPATION RATES

Following are the survey responses and participation rates for the period under study; a short sketch computing the corresponding response rates follows the list:

  • Spring 2020 Senior Survey: 50 of 174 responded

  • Spring 2021 Senior Survey: 51 of 185 responded

  • Spring 2022 Senior Survey: 18 of 180 responded

  • Fall 2020 Survey of Majors: 112 of 991 majors responded
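As a rough gauge of coverage, the counts above translate into the response rates computed in this short sketch (the figures are those reported in this article):

```python
# Response rates for the four student surveys listed above
# (counts as reported in this article).
surveys = {
    "Spring 2020 Senior Survey": (50, 174),
    "Spring 2021 Senior Survey": (51, 185),
    "Spring 2022 Senior Survey": (18, 180),
    "Fall 2020 Survey of Majors": (112, 991),
}

for name, (responded, invited) in surveys.items():
    print(f"{name}: {responded}/{invited} = {responded / invited:.1%}")
# Output: 28.7%, 27.6%, 10.0%, and 11.3%, respectively
```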

This study uses these four surveys of students as data, with limited references to other sources of data. Our efforts are consistent with attempts reported in the literature on gathering student feedback during COVID-19, including the use of multiple-wave surveys (Loepp 2020), the use of feedback tools such as Moodle (Ray 2020), and the acknowledgment that students may take out pandemic frustrations on faculty members in their feedback (Steele 2020).

WHAT DID WE LEARN?

First, our data and our ability to collect data were limited. The only routinely collected data available were faculty course questionnaires. In addition to the well-known limits of student evaluations, the campus had changed to a new questionnaire in Spring 2020; therefore, we had no baseline for comparing pre- and post-pandemic scores, and we did not find institutional data useful. Our departmental interest in collecting data was strong but our capacity was limited. Student participation in opportunities to contribute data is always low, even with incentives, and it was lower during the pandemic; presumably, students felt overwhelmed by more pressing concerns in their lives. Although limited, the information we collected was useful. Our samples were voluntary, and we were unable to collect demographic data to determine how unrepresentative they were because of the risk of individually identifying students. We suspect but cannot confirm that the students with the most technological problems, or those so overwhelmed that they stopped participating in classes, were the least likely to provide feedback. Even so, we had sufficiently large samples and varied data to identify that groups with contradictory responses existed, and that whatever the relative size of these groups in the population of majors, they were present and had needs and preferences that should be addressed.


Second, we learned that most people were fairly satisfied with how things were going under the circumstances, and we had similar findings through the Spring 2022 semester. We asked in the Fall 2020 Undergraduate Majors Survey: “Thinking specifically of your classes in the PSCI department, how do you think the semester is going so far?” Of the 106 students who answered, 50 responded “excellent” or “good,” 35 “average,” and 21 “poor” or “terrible.” In the Spring 2020 Senior Survey, 13 of 24 qualitative responses about the department’s COVID-19 response were positive, five were negative, three were mixed, and three offered suggestions without an evaluation. In the Spring 2021 Senior Survey, of 23 qualitative responses, five were positive evaluations of the department, eight were negative, and three were mixed; three were negative evaluations of the university and four offered comments that did not include an evaluation. In the Spring 2022 Senior Survey, of eight qualitative responses, four were positive evaluations of the department, two were negative, and one was a negative evaluation of the university. Tellingly, one student reported not having anything to which to compare the COVID-19 experience because they had completed all of their political science courses during the pandemic. We could have taken these numbers to mean that we responded well or, to paraphrase several qualitative comments, “We did the best we could under the circumstances.” However, we decided that although these evaluations were a relief, too many students were having negative experiences, and those who were doing well may have been more likely to respond.
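As a minimal sketch using the counts reported above, the headline satisfaction item from the Fall 2020 survey breaks down as follows:

```python
# Shares of the 106 answers to "Thinking specifically of your classes in the
# PSCI department, how do you think the semester is going so far?"
# (Fall 2020 Undergraduate Majors Survey; counts as reported above.)
answers = {"excellent or good": 50, "average": 35, "poor or terrible": 21}
total = sum(answers.values())  # 106

for label, n in answers.items():
    print(f"{label}: {n}/{total} = {n / total:.0%}")
# excellent or good: 47%; average: 33%; poor or terrible: 20%
```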

Third, we learned from surveys and qualitative comments that feedback about what was not working for students and faculty members was contradictory and conflicting; however, there were identifiable groups with distinct problems. This is the most important finding and the one on which we focused the most in adjusting our approach. Examples from the Fall 2020 survey that presented us with conundrums are listed in tables 1 through 3.

Table 1 “We will most likely have the same class modalities in the spring that we have this fall, with a very limited ability to have fully in-person classes. Of these options, which modality would you MOST like to see in the spring? Keep in mind, all of these modalities will likely be available.”

Table 2 “Again, thinking of the same options, which of these would you LEAST like to see in the spring?”

Table 3 “How do you feel about the use of online discussion forums in PSCI courses?” (Students were asked a filtering question of whether they have at least one PSCI class with online discussion forums this semester.)

Note: The total is rounded up from 99.9%.

Other questions had similar results. Different students preferred different approaches to pandemic pedagogy; for any given approach, some proportion of students liked it and another disliked it. It was tempting to throw up our hands and decide that adjusting was impossible. The qualitative responses also presented contradictory suggestions and evaluations. After we discussed the findings, it became clear that faculty preferences varied as well. Therefore, a strategy emerged of attempting to meet all of these preferences.
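To make the conundrum concrete: because tables 1 through 3 are available in the replication data rather than reproduced here, the following sketch uses invented, purely hypothetical numbers to illustrate the pattern we saw, in which every modality attracts both a sizable most-preferred and a sizable least-preferred constituency:

```python
# Purely hypothetical illustration -- these numbers are invented and are NOT
# the survey results. The point: each modality can simultaneously be a common
# first choice ("most preferred") and a common last choice ("least preferred"),
# so no single modality satisfies everyone.
most_preferred  = {"in-person": 40, "hybrid": 15, "remote": 25, "online": 20}
least_preferred = {"in-person": 25, "hybrid": 20, "remote": 20, "online": 35}

for modality in most_preferred:
    print(f"{modality}: most preferred by {most_preferred[modality]}%, "
          f"least preferred by {least_preferred[modality]}%")
```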

GETTING STUDENTS AND FACULTY INTO THE “RIGHT” MODALITIES

Even with limited data, the clearest finding was that although most students were doing well, the proportion who were not doing well and/or were simply miserable was unacceptably high. However, given that almost every modality and pedagogical choice had a substantial minority of students who were unhappy, what were we to do? The answer that we settled on was trying to get students and faculty members into the right modalities. There is controversy in the literature about whether meeting students’ preferred modality is what is best for them. Research indicates that students in online-only courses are less likely to be successful; however, efforts to meaningfully engage students can balance the outcomes between in-person and online courses (Glazier 2021). Moreover, students looking for easier courses are more likely to enroll in online courses (O’Neill et al. 2021). However, none of the research available to us at the time addressed the health issues around COVID-19. In an environment in which the success of COVID-19 protection measures was unclear (Walke, Honein, and Redfield 2020), some students believed that the health risks or burdens of the pandemic outweighed any advantages of in-person learning; others felt that the costs to their academic success were too high and that in-person learning was necessary. Lacking objective data about which position was accurate, and about which modalities would be safest and most effective, we decided to take student and faculty preferences at face value.

First, we identified several categories of students whose needs were not being met, as follows:

  • students with a strong desire or need for in-person learning

  • students without access to technology for synchronous remote learning

  • students who were unable to manage the responsibility of self-paced asynchronous learning

  • students experiencing outside problems (e.g., health, mental health, and economic difficulties)

It was important to address the needs of these students without upending the relative success of other groups of students. Many students were unable or unwilling to engage in in-person learning due to their own or family members’ health issues. Some students preferred synchronous remote instruction; others preferred asynchronous online options. The intensity of these preferences was judged, in part, by qualitative comments and communications from students to the department, and we recognized that other students may have had similarly intense preferences that they did not express.

This seemed like quite a conundrum but, given that faculty members also had varied preferences, we were able to respond to these data. We surveyed faculty members and scheduled Spring 2021 courses with all available modalities, in most cases meeting faculty preferences. We advertised course modalities by posting them on our website and on our student Canvas page and by sending the information to the political science advisors. We also announced these measures in a town hall meeting in November 2020. Scheduling for Fall 2021 proceeded using a similar process; in Spring 2022, most courses were taught in person at full room capacities.

Second, we recognized that each faculty member and each student was experiencing their own unique pandemic, and we empathized with all of these experiences. For example, I wanted to be on campus because I do not have a workspace at home and I was sharing limited Internet access with my family. Other faculty members were supervising small children, needed to be cautious due to their own health or that of aging family members for whom they were caregivers, or lived alone and felt isolated. Similarly, some students had comfortable spaces at their parents’ homes whereas others were sharing cramped apartments; some were caring for family members; and others were scrambling to find employment because their regular jobs had been canceled. Because the situation was tumultuous, it initially was difficult to understand the different perspectives that everyone was experiencing.

To accommodate these differences, frequent meetings, consultation with faculty members, and review of the data gathered from students were useful. Despite the limits of the data, we could comprehend the many unique pandemics that people were experiencing and create plans to accommodate them. Notably, the data allowed us to empathize with those experiences. Faculty feedback and students’ qualitative responses were helpful in making decisions that responded to individual people rather than numbers. Following are two representative qualitative responses from the Spring 2020 Senior Survey:

  • “I felt like one of my PSCI handled it really well but the other one did not seem particularly willing to accommodate the difficulties this shift placed on some students. I really hate online learning and I find it harder to maintain my focus and actually absorb information when I am not in a classroom setting. It seemed like one of my classes understood that and appreciated that some students felt that way, while the other class just expected the same level of engagement and achievement without making any kind of changes to help students do that.”

  • “Just because classes went online does not mean professors need to up the workload—that was mainly my experience with my poli sci classes going online and, with everything else going on in the world at the time, it was overwhelming and I struggled to keep track of the new and old assignments since due dates kept changing.”

We considered these comments in the context of knowing that faculty members and graduate students were suffering as well. Most were doing all they could to transition classes in a way that was fair to students while maintaining their own standards as well as those of the department and the institution.

FUTURE OF THE USE OF DATA IN MAKING UNDERGRADUATE PROGRAM DECISIONS

We were limited in our ability to collect and use data. We did not find any institutional data that could help us make decisions, and our own data collection was limited by resource constraints, by voluntary participation, and by students being too overwhelmed to provide data. After the initial enthusiasm about our ability to make data-driven decisions, I began to appreciate that it was necessary to consider the limitations of the data. The information we had allowed us to understand the perspectives of various students, to see that their experiences and those of other faculty members were not the same as our own, to consider multiple perspectives in decision making, and to open regular lines of communication for students to provide feedback on what was and was not working.


We also learned that being student centered has limitations. We could not be student centered to the point of further damaging faculty and staff morale or of failing to deliver the curriculum in ways that would successfully prepare students for future courses and for life after graduation. Students’ perspectives were the most important factor in our decision-making processes, but they could not be the only factor.

Ultimately, rather than achieving a data-driven and student-centered response, I concluded that the decision-making processes were data-informed and student-empathetic. This was appropriate due to the limited, conflicting findings from the data and varying needs and preferences among both students and faculty members.

We could have made better use of the data with more resources. Students wanted more in-person opportunities than we were able to offer. In some cases, this was because faculty members needed to teach remotely or online. In other cases, faculty members were able and willing to meet in person but, given course sizes, space was not available. Moreover, because almost all of our classes were full, it was difficult to move students from one modality to another.

Our department currently is normalizing its data-collection processes to inform future decisions. We are developing a regular process for surveying majors each year and seniors every spring. By doing so, we will be able to add questions about future crises while collecting longitudinal data outside of the pandemic context. We plan to hold an annual town hall, and we will continue to encourage faculty members to use Canvas to survey students and to check in with them in class. In these efforts, we will continue to be data informed and student empathetic, and our ability to respond to crises will be enhanced by our COVID-19 experience.

This case study of one department’s transition from data-driven and student-centered to data-informed and student-empathetic decision making contributes to the literature on using data to inform our teaching. It also suggests data-collection options that are possible under real-world constraints, and it supports the careful and empathetic consideration of student needs and perspectives as well as the requirements and interests of faculty and staff.

DATA AVAILABILITY STATEMENT

Research documentation and data that support the findings of this study are openly available at the PS: Political Science & Politics Harvard Dataverse at https://doi.org/10.7910/DVN/2RGHFP.

CONFLICTS OF INTEREST

The author declares that there are no ethical issues or conflicts of interest in this research.

REFERENCES

Blanco, Tyler D., Bryan Floyd, Bruce E. Mitchell II, and Rodney P. Hughes. 2022. “Varied Institutional Responses to COVID-19: An Investigation of Colleges’ and Universities’ Reopening Plans for Fall 2020.” AERA Open 8. DOI:10.1177/23328584221099605.
Brynjolfsson, Erik, and Kristina McElheran. 2016. “The Rapid Adoption of Data-Driven Decision Making.” American Economic Review 106 (5): 133–39. DOI:10.1257/aer.p20161016.
Donavan, Janet L. 2023. “Replication Data for ‘Uses and Limits of Data and Student Feedback in Pedagogical Response to COVID-19: A Case Study.’” PS: Political Science & Politics. DOI:10.7910/DVN/2RGHFP.
Dowd, Alicia C. 2005. Data Don’t Drive: Building a Practitioner-Driven Culture of Inquiry to Assess Community College Performance. Indianapolis, IN: Lumina Foundation for Education. https://files.eric.ed.gov/fulltext/ED499777.pdf.
Dowling, Robyn. 2005. “Power, Subjectivity and Ethics in Qualitative Research.” In Qualitative Research Methods in Human Geography, second edition, ed. Iain Hay, 19–29. Oxford: Oxford University Press.
Glazier, Rebecca. 2021. Connecting in the Online Classroom. Baltimore, MD: Johns Hopkins University Press. DOI:10.1353/book.98266.
Hora, Matthew T., Jana Bouwma-Gearhart, and Hyoung Jun Park. 2017. “Data-Driven Decision Making in the Era of Accountability: Fostering Faculty Data Cultures for Learning.” Review of Higher Education 40 (3): 391–426. DOI:10.1353/rhe.2017.0013.
Loepp, Eric D. 2020. “Introduction: COVID-19 and Emergency E-Learning in Political Science.” PS: Political Science & Politics 54 (1): 169–71. DOI:10.1017/S1049096520001511.
McClellan, E. Fletcher. 2015. “Best Practices in the American Undergraduate Political Science Curriculum.” In Handbook on Teaching and Learning in Political Science, ed. John Ishiyama, William J. Miller, and Eszter Simon, 3–15. Northampton, MA: Edward Elgar Publishing.
O’Neill, Kevin, Natalia Lopes, John Nesbitt, Suzanne Reinhardt, and Kanthi Jayasundera. 2021. “Modeling Undergraduates’ Selection of Course Modality: A Large-Sample, Multi-Discipline Study.” The Internet and Higher Education 48:1–11. DOI:10.1016/j.iheduc.2020.100776.
Ray, Ayesha. 2020. “Teaching in Times of Crisis: COVID-19 and Classroom Pedagogy.” PS: Political Science & Politics 54 (1): 172–73. DOI:10.1017/S1049096520001523.
Steele, Brent J. 2020. “When Good Enough Is Good Enough: Department Chairing during COVID-19.” PS: Political Science & Politics 54 (1): 187–88. DOI:10.1017/S104909652000572.
Taylor, Leonard D., Jr. 2020. “Neoliberal Consequence: Data-Driven Decision Making and the Subversion of Student Success Efforts.” Review of Higher Education 43 (4): 1069–97. DOI:10.1353/rhe.2020.0031.
Varga-Atkins, Tunde, Rhona Sharpe, Sue Bennett, Shirley Alexander, and Allison Littlejohn. 2021. “The Choices That Connect Uncertainty and Sustainability: Student-Centered Agile Decision-Making Approaches Used by Universities in Australia and the UK During the COVID-19 Pandemic.” Journal of Interactive Media in Education 16 (1): 1–16. DOI:10.5334/jime.649.
Walke, Henry T., Margaret A. Honein, and Robert R. Redfield. 2020. “Preventing and Responding to COVID-19 on College Campuses.” Journal of the American Medical Association 324 (17): 1727–28. DOI:10.1001/jama.2020.20027.
Yin, Robert. 2017. Case Study Research and Applications: Design and Methods, sixth edition. Newbury Park, CA: SAGE Publishing.