
Yes, But Did They Learn Anything? An Experimental Investigation of Voter Decision Making on Foreign Policy Issues

Published online by Cambridge University Press: 12 October 2016

Jacqueline M. Sievert, Bowling Green State University
Michael K. McDonald, Western Carolina University
Charles J. Fagan, Western Carolina University
Niall Michelsen, Western Carolina University

Abstract

Do non-graded, one-time, short presentations by a panel of professors on foreign policy issues affect voting behavior among students? Did the panel itself contribute to students’ understanding of the importance of foreign policy in evaluating candidates? Did presentations lead to changes in students’ candidate preferences? And, finally, did the event lead to sustained changes in students’ preferences? We find that even though foreign policy issues tend not to be front and center in American elections, when young voters are presented with information about candidates’ foreign policy positions, as they were in this study, that information does appear to affect which candidate they plan to vote for.

Copyright © American Political Science Association 2016 

INTRODUCTION AND BACKGROUND

As educators, we teach with the faith that our efforts result in student learning. As researchers, we investigate whether this is true and, if so, to what extent. As political scientists, we are interested in whether this learning actually influences voter preferences in primaries and elections. As international relations scholars, we are interested in how foreign policy issues influence voters. Fortunately, we are able to induce students to attend specified teaching/learning events to address all of these questions. Therefore, in March 2016, in the midst of the presidential primary season and just a few weeks before the North Carolina primary, the authors headlined an open panel presentation, titled “Presidential Candidates’ Foreign Policies,” as part of our ongoing Global Spotlight Series.

The audience for this event was primarily undergraduate students, along with some faculty, staff, and community members. Students were recruited through course requirements, extra credit, or general interest. We employed instantaneous polling via smartphones (www.socrative.com) to measure students’ opinions and to capture any change in those opinions (Gikas and Grant 2013). We administered pre- and post-treatment questions as well as follow-up questions two weeks later.

Using this audience and polling technology, we set out to answer a series of questions. Do non-graded, one-time, short presentations on foreign policy issues affect voting behavior among students? Did the panel itself contribute to students’ understanding of the importance of foreign policy in evaluating candidates? Did presentations lead to changes in students’ candidate preferences? And, finally, did the event lead to sustained changes in students’ preferences?

Our expectations were that we would detect some changes in student perceptions and political preferences, but that the changes would be limited. We know that foreign policy is only one (albeit sometimes important) element in a voter’s calculations. We also know that voters, old and young, are not well informed about US foreign policy and even less well informed about the differences in foreign policy positions among candidates for presidential nominations.

The impact of foreign policy on voters’ calculations is generally secondary to that of domestic issues, although this varies with the international situation at the time. For example, even while conflict in Afghanistan, Syria, and elsewhere was ongoing, surveys in 2012 indicated that only 7% of Americans listed an international issue as their single most important deciding factor (Hook 2016, 245). This may be just as well, since much has been documented about American voters’ lack of knowledge generally, and about foreign policy specifically. This dearth of knowledge, like gaps in other political knowledge, is even more pronounced among young people (Pew Research 2010).

These expectations were borne out in our pretest of the study. The majority of participants did not know which candidate’s foreign policy positions most closely aligned with their own. One-third of participants said they were “middle of the road,” indicating that they were neither knowledgeable nor unknowledgeable about foreign policy, while roughly 25% of participants reported being somewhat or very unknowledgeable about foreign policy. Footnote 1

As educators, we recognize this as an opportunity both to help students learn about the important issue of foreign policy and to test whether increasing knowledge leads to any attitudinal changes, up to and including changes in voting preferences. Others have explored the potential of deliberative democracy for connecting knowledge and opinions (Fishkin and Luskin 2005). While students in this exercise did not “deliberate” much aside from some Q&A at the end, they did gain knowledge and understanding of foreign policy. The event thus approached an educational ideal wherein educators impart knowledge that gives students greater information and understanding with which to make better political choices. Naturally, we cannot determine from this or any similar data whether students actually make better choices, but we can determine whether they make different choices. At the same time, it is important to note that greater knowledge does not always lead to changed opinions; some research indicates that more accurate information can actually increase the strength of erroneous beliefs (Nyhan and Reifler 2010).

Viewing the event as the treatment of experimental subjects (IRB approval was sought and received), we expected the following sequence to unfold. Students would initially gain information, including learning whose foreign policy positions best approximated their own. This new information would lead some students to change their views of the candidates, which would then lead some of them to change their overall voting preferences. We therefore expected only a small subset to change their voting preferences, with most of the changes occurring within parties.

METHODOLOGY

We wanted to understand whether short, interactive presentations on candidates’ positions affect students’ opinions of the candidates and how they intend to vote in the upcoming election. A variety of methods were used to recruit audience members. While some students were required to attend the event, all members of the audience were free to opt out of the polling. Prior to the presentations, students were asked a series of demographic questions, as well as pre-test questions on their own foreign policy knowledge, party identification, the importance of foreign policy, and feeling thermometers for each of the candidates, all administered through Socrative. Footnote 2


After an introductory presentation on the role of foreign policy in elections, three faculty members from the Department of Political Science and Public Affairs presented the candidates’ positions on trade, issues in the Middle East, and security. During each presentation, the candidates’ positions were discussed, but the candidates’ names were withheld. Footnote 3 Students then voted on which candidate’s position they most preferred for that topic. Once the votes were collected, the candidates’ names were revealed so students could see for whom they had voted.

Following the presentations, students were asked the same foreign policy knowledge, importance of foreign policy, and feeling thermometer questions in a post-test survey. Students were also asked to participate in a follow-up survey distributed two weeks after the event to see how their positions had changed in the days and weeks afterward. Due to drop-off, 60 students completed the pre-test questions and 55 completed the post-test questions, but only 50 completed both. Thirty students volunteered to be contacted for a follow-up study, and 13 completed that survey.
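Because only students who answered both questionnaires can be compared across waves, the pre- and post-test exports have to be matched respondent by respondent. The sketch below shows one way this matching could be done; it is an illustration only, and the file names and the respondent_id column are hypothetical assumptions rather than the actual Socrative export format used in the study.

```python
# A minimal sketch (assumption): matching pre- and post-test survey exports by a
# respondent identifier so that only students who completed both waves are kept.
# File names and the "respondent_id" column are hypothetical, not the actual
# Socrative export format.
import pandas as pd

pre = pd.read_csv("pretest_responses.csv")    # one row per pre-test respondent
post = pd.read_csv("posttest_responses.csv")  # one row per post-test respondent

# An inner join keeps only respondents who appear in both waves.
both = pre.merge(post, on="respondent_id", suffixes=("_pre", "_post"))

print(f"Pre-test: {len(pre)}  Post-test: {len(post)}  Completed both: {len(both)}")
```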

Figure 1 shows the demographics of the students who elected to participate in the study. A total of 60 students completed the pre-test questionnaire, but only 50 completed both the pre- and post-test: 25 male students and 25 female. The majority of students were between the ages of 18 and 20, with very few over the age of 22. With respect to major and college, the majority were political science or international studies majors (both housed in the Department of Political Science and Public Affairs) or other majors within the College of Arts and Sciences. Roughly five students came from the College of Business and five from the College of Health and Human Sciences, and fewer than five were undeclared. With respect to ideology, the sample clearly skews liberal: one-third of students identified as liberal or very liberal. As for party identification, a small plurality identified as independents, one-third identified as Democrats, and only around 15% identified as Republicans.

Figure 1 Demographics of Student Participants

RESULTS

Following the presentations, students were asked the same questions regarding foreign policy knowledge, the importance of foreign policy, which candidate’s positions best aligned with their own, and for whom they would vote if the election were held that day. Figure 2 displays these responses for the subset of students who completed both the pre- and post-presentation surveys. With respect to foreign policy knowledge, there is a slight increase in respondents’ self-reported knowledge. In the pretest, 13 students reported being very or somewhat unknowledgeable about foreign policy, while 21 reported being very or somewhat knowledgeable. In the post-test, only eight students still reported being very or somewhat unknowledgeable, while 24 now reported being very or somewhat knowledgeable. Modest shifts are also apparent on the foreign policy importance question. In the pretest, three students did not know whether foreign policy was important and three reported it being not at all important, while 28 reported it being somewhat important and 16 very important. In the post-test, two students still did not know whether foreign policy was important to them, but no student now reported it being not at all important.

Figure 2 Pre- and Post-Test Responses

The largest shifts between the pre- and post-test responses came in voting behavior and in which candidate best aligned with the students’ own positions. In the pretest, 17 students reported that they did not know which candidate best aligned with their own views; 13 students reported aligning best with Bernie Sanders’s positions, seven with Hillary Clinton’s, and six with Donald Trump’s. In the post-test, only two students reported not knowing which candidate most aligned with their views, and 21 students now said Hillary Clinton’s positions most closely aligned with their own, followed by Bernie Sanders with eight and Donald Trump with seven. After learning more about the candidates’ foreign policy positions, then, students most often identified Hillary Clinton as the candidate whose positions aligned most closely with their own.

Finally, there were subtle changes in how students planned to vote. Prior to the presentations, eight students did not know for whom they planned to vote. The largest vote recipient was Bernie Sanders with 28 votes, followed by Marco Rubio with six, Donald Trump with four, and Ted Cruz with three. Only one student planned to vote for Hillary Clinton. After the presentations, only three students did not know for whom they would vote. Bernie Sanders was still the largest vote recipient, but he lost eight votes to end with 20. Hillary Clinton picked up the most votes, going from one vote to 10. Donald Trump’s and Marco Rubio’s totals remained the same, while Ted Cruz lost one vote.
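Shifts like these can be summarized in a single pre/post transition table. The sketch below, which reuses the hypothetical merged data frame from the earlier example and assumes columns named vote_pre and vote_post, tabulates how many students moved from each pre-test vote intention to each post-test intention and computes each candidate’s net change (for instance, Sanders falling from 28 to 20).

```python
# A minimal sketch (assumption): a pre/post transition table of vote intention,
# built from the merged data frame "both" in the earlier example. The column
# names "vote_pre" and "vote_post" are hypothetical.
import pandas as pd

transitions = pd.crosstab(
    both["vote_pre"], both["vote_post"], margins=True, margins_name="Total"
)
print(transitions)

# Net change per candidate: post-test total minus pre-test total.
net_change = both["vote_post"].value_counts().subtract(
    both["vote_pre"].value_counts(), fill_value=0
)
print(net_change.astype(int).sort_values())
```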

After each presentation, students voted for the candidate whose policy most closely aligned with their preferences, but the policies were presented anonymously; students did not know which candidate advocated which policy. Once the votes were tallied, the names were revealed. Interestingly, in the first section, on trade, 21 of 60 respondents selected Donald Trump’s position, and 34 respondents admitted they were surprised by their choice. Yet even though a third of respondents selected Trump’s trade policy, Trump’s final vote share remained unchanged.

LESSONS LEARNED

In many ways, this is a preliminary study of how students’ voting preferences change in response to short presentations of new information. Nonetheless, we learned several important lessons from this exercise, both about how students’ preferences change and about how to use this method in future research. First, we should acknowledge that drawing conclusions from a primarily undergraduate student population presents unique challenges and limitations (Sears 1986; Mintz et al. 2006; Cooper et al. 2011). In this study, we perhaps see evidence of what Sears called “less crystallized attitudes” in the fluidity of the preferences expressed by our young subjects.

In the pretest, a plurality of the students reported that they did not know which candidate best aligned with their own foreign policy preferences. After the hour-long presentation, only two students reported not knowing, showing that short presentations of candidates’ foreign policy positions can help students better understand those positions. Similarly, the large swing toward Hillary Clinton and away from Bernie Sanders on this same question shows that students do not necessarily change their own views on foreign policy to match their preferred candidate (Sanders was the largest vote recipient in both the pre- and post-tests). This runs contrary to research showing that voters align their positions to match their preferred candidate rather than finding a candidate who matches their preferred positions (Achen and Bartels 2016).

While the swing toward Clinton on foreign policy alignment did not allow her to overcome Sanders’s large lead in reported voting preferences, it did help her close the gap. This could have important implications for the candidates in the fall who aim to capture the important bloc of young voters that supported Bernie Sanders. Footnote 4 While issues such as the economy, affordable health care, and lowering the cost of college, along with other bread-and-butter issues, are usually seen as the primary concerns of young voters, we find that young voters also care about foreign policy. Yet foreign policy is often not what candidates emphasize when addressing younger voters; for example, Bernie Sanders, who has to a large extent built his campaign around younger voters, includes very little about foreign policy on his campaign website. Footnote 5 Nonetheless, when young voters are presented with information about candidates’ foreign policy positions, as they were in this study, foreign policy does seem to affect which candidate they plan to vote for.

We also learned important lessons about using this method of interactive polling to study student voting preferences and its impact on learning. This fall, we plan to repeat this exercise with some important changes. First, in the study reported here, the vast majority of students were either political science/international studies majors or majors in other departments of the College of Arts and Sciences. To obtain a better sample of students, we plan to give short presentations to multiple sections of general education courses and thereby reach a broader distribution of majors. Second, a primary goal is to see whether this method of interactive polling actually helps students learn. To test this, we plan to administer brief pre- and post-treatment tests of foreign policy knowledge (in the current study we asked only for self-reported foreign policy knowledge). To see whether the interactive polling itself makes a difference, some groups of students will not participate in the polling and will serve as a comparison.

CONCLUSION

While issues such as the economy, affordable health care, and lowering the cost of college, along with other bread-and-butter issues, are usually seen as the primary concerns of young voters, we find that young voters also care about foreign policy. When young voters are presented with information about candidates’ foreign policy positions, that information does seem to affect which candidate they plan to vote for. We found that information about specific policy positions produced a large swing in favor of Clinton when students were asked which candidate’s positions most closely aligned with their own. There was also a shift in votes toward Clinton, although that shift was smaller than the shift in policy alignment, which suggests that some participants realized they preferred Clinton’s foreign policy positions but still did not plan to vote for her.


Additionally, while only 13 students completed the follow-up survey distributed two weeks after the event, when asked about the format of the event many commented on the interactive nature of the presentations. One respondent noted that the “interactiveness [sic] made it more personal and relatable,” while another said that “the interactive nature of the presentation kept it interesting” and that they “didn’t spend a campus event glancing at [their] phone clock every two minutes which was a welcome change to be so engaged.”

We believe we have preliminary evidence that short, specific presentations of information can influence students’ political preferences and increase their knowledge of foreign policy issues. We also know, at least anecdotally, that students left more engaged and enjoyed the interactive polling, the real-time feedback, and the increased participation in the panel event.

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit http://dx.doi.org/10.1017/S104909651600158X. Footnote *

Footnotes

1. Data and discussion of methodology can be found in subsequent sections.

2. Details of all questions can be found in the appendix. The event took place on March 2, 2016, and candidates Hillary Clinton, Bernie Sanders, Donald Trump, Ted Cruz, and Marco Rubio were included.

3. Slides from the presentations can be found in the online supplemental appendix.

4. For one of many news articles reporting on Sanders’s dominance among young voters, see Decker (2016). According to the Iowa Entrance Poll, Sanders defeated Clinton 84% to 14% among voters aged 17–29 (Iowa Entrance Poll 2016).

5. For example, as of May 27, 2016, Sanders’s webpage listed 34 issues; at most five or six dealt with foreign policy, and even those often approached the issue from a mostly domestic angle (Sanders 2016).

* The URL to access Supplementary Material for this article has been corrected since the original publication. An Erratum detailing this change was also published (DOI: 10.1017/S1049096516002481).

REFERENCES

Achen, Christopher and Bartels, Larry. 2016. “Democracy for Realists: Holding up a Mirror to the Electorate.” Juncture 22 (4): 269–275.
Cooper, Christopher A., McCord, David A., and Socha, Alan. 2011. “Evaluating the College Sophomore Problem: The Case of Personality and Politics.” The Journal of Psychology 145 (1): 23–37.
Decker, Cathleen. 2016. “Why Young Voters Are Flocking to Sanders and Older Ones to Clinton.” Los Angeles Times. April 19. http://www.latimes.com/politics/la-na-clinton-sanders-age-20160419-story.html.
Fishkin, James S. and Luskin, Robert C. 2005. “Experimenting with a Democratic Ideal: Deliberative Polling and Public Opinion.” Acta Politica 40 (3): 284–298.
Gikas, Joanne and Grant, Michael M. 2013. “Mobile Computing Devices in Higher Education: Student Perspectives on Learning with Cellphones, Smartphones & Social Media.” The Internet and Higher Education 19 (October): 18–26.
Hook, Steven W. 2016. US Foreign Policy: The Paradox of World Power. 5th edition. Washington, DC: CQ Press.
Mintz, Alex, Redd, Steven B., and Vedlitz, Arnold. 2006. “Can We Generalize from Student Experiments to the Real World in Political Science, Military Affairs, and International Relations?” The Journal of Conflict Resolution 50 (5): 757–776.
Nyhan, Brendan and Reifler, Jason. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32 (2): 303–330.
Sanders, Bernie. 2016. “Bernie 2016.” https://berniesanders.com/issues/.
Sears, David O. 1986. “College Sophomores in the Laboratory: Influences of a Narrow Data Base on Social Psychology’s View of Human Nature.” Journal of Personality and Social Psychology 51 (3): 515–530.
