
Using Alumni Views to Connect the Past, Present, and Future in Political Science

Published online by Cambridge University Press:  12 June 2017

Eric D. Raile
Affiliation:
Montana State University
Elizabeth A. Shanahan
Affiliation:
Montana State University
Michael P. Wallner
Affiliation:
Boise State University
Linda M. Young
Affiliation:
Montana State University
Marja Avonius
Affiliation:
University of Tampere, Finland
Micaela Young
Affiliation:
Montana State University
Nacer Tayeb
Affiliation:
Montana State University

Abstract

This article describes the collection of views from political science alumni via a web-based survey as a central part of efforts to review and improve the curriculum and the broader political science program at a public university. Based on the literature and on interviews with faculty members and former students, we iteratively constructed a questionnaire containing five categories of items: program structure, content/knowledge, skills, outcomes, and learning environment. These categories were intended to capture curricular elements and outcomes that include but extend beyond employment and professional-skill attainment. Graduate students contributed in meaningful ways to the effort through a research-methods course. The article discusses how results of the survey fed into the curriculum-revision process specifically and program review and assessment considerations more generally.

Copyright © American Political Science Association 2017

Despite much initial (and some ongoing) pushback against externally mandated assessment in recent decades, political science has a history of evaluating program quality and student learning and engagement (Deardorff 2016b; Young 2016). Notable overviews of these assessment efforts include the Wahlke Report (Wahlke 1991), Assessment in Political Science (Deardorff, Hamann, and Ishiyama 2009), and the January 2016 Profession Symposium in this journal. Broadly construed, assessment has become a common part of the landscape in political science departments and has assumed a variety of forms, including but extending beyond assessment of student learning (Deardorff 2016b; Young 2016). Within this context, we developed curriculum-review procedures that contribute to a more general program review, which is itself a form of assessment (Deardorff 2016a). Whereas others discuss models for organizing a political science curriculum (e.g., McClellan 2015), this article describes procedures for collecting data from program alumni to inform a curriculum review and for feeding this information into broader, ongoing assessment activities.

We designed a questionnaire (see the online appendix) and data-collection procedure that allow for systematically gathering relatively low-cost information while also serving pedagogical goals by engaging students in a service-learning project. The questionnaire balances professional skills and outcomes (recognizing pressures to consider employment outcomes) against other benefits of higher education. This article describes the process of developing and administering the instrument as well as applying the results. This survey was only one input—albeit a vital one—in the curriculum-review process. Though others may want to modify the instrument and procedure for their own particular contexts, we hope that our discussion provides an advanced launching point for such efforts.

CURRICULUM REVIEW IN THEORY AND PRACTICE

A faculty that wants to assess the curriculum can draw useful guidance from various sources. However, departments may wonder whether guidance from professional associations, academic literature, and existing data sources is comprehensively relevant to their particular circumstances. The structures and goals of political science programs vary considerably (Wahlke 1991), and a curriculum review must consider these characteristics. Consequently, we join others (e.g., Robinson 2013; Schomburg and Teichler 2005) in arguing that alumni experiences and reflections also constitute a valuable source of information for a program. Alumni are relatively proximal to the program, which is important given program diversity. Alumni also may provide more reliable answers than current students because of their distance from the day-to-day academic workload. Furthermore, only alumni know how they are using the knowledge and skills acquired during their postsecondary studies, and this information should feed back into the system. Consequently, alumni are key to connecting the past, present, and future of a program.


The goals of a political science program—crucial for this type of evaluation—typically appear in the form of general mission statements and more operationalized outcomes or objectives. These goals often include a combination of learning “how to think” via development of general intellectual abilities in the liberal arts tradition (Wahlke 1991), skill training, learning of specific content, and/or preparing to be an informed citizen and democratic participant. Increasingly, programs also must respond to pressures to increase retention and graduation rates. Tracking relatively closely with these goals, we distilled the literature and our own preliminary data gathering into the following categories for a curriculum review: (1) program structure, (2) content or knowledge, (3) skill development and competencies, (4) outcomes or student achievements, and (5) learning environment.

The literature provides ample recommendations about program structure in political science. The establishment of a relatively large set of core courses in the curriculum and the strong sequencing of courses are important starting points (Bennett 1991; Ishiyama 2005; Wahlke 1991), with warnings against proliferation of courses (Ishiyama, Breuning, and Lopez 2006). Greater structure in the curriculum is helpful for abstract reasoning and aptitude test scores (Ishiyama 2005; Ishiyama and Hartlaub 2003). Exposure to the different modes of inquiry in political science (Wahlke 1991) and requiring an early research-methods course (Ishiyama 2005) are other recommendations, as are a requirement for off-campus learning (Wahlke 1991) and an intensive capstone experience (Huerta 2015; Ishiyama 2005). Our questionnaire operationalized program structure by asking about major concentrations, core courses, course sequencing, and flexibility and frequency of course offerings. Moreover, the questionnaire asked about the benefits of curricular “supplements” (not required) such as internships and study abroad.

Student knowledge is another traditional emphasis in the discipline. Although knowledge is the sum of different processes, its most direct relationship is with curriculum content. From the perspective of alumni, knowledge is the more salient concept. Beyond consideration of the traditional subfields of political science (Wahlke 1991), investigators highlight the need for greater integration of research methods (Parker 2010); the scientific nature of the discipline (Hill 2002); citizenship (Smith and Graham 2014); gender (Cassese, Bos, and Duncan 2012); ethnic and cultural diversity (Bennett 1991); and related fields including history, economics, and geography (Wahlke 1991). The questionnaire addressed knowledge via a series of items about how well studies in political science had equipped alumni with basic knowledge in 11 different areas.

Skills have assumed greater importance in the discipline over time (Ishiyama, Breuning, and Lopez 2006). Recommendations about skill development emphasize analytic capacities, research-design skills, statistical skills, and writing and oral-presentation skills (Bennett 1991; Breuning, Parker, and Ishiyama 2001; Wahlke 1991). The questionnaire asked alumni about their preparation to use various communication, research and analysis, and professional skills, as well as the importance of each skill in the respondents’ lives.

The evaluation of outcomes for the program and alumni is typically a straightforward means of evaluating success as measured against program goals. Assessment of learning protocols may evaluate student knowledge and skills throughout the course of studies but often do not examine the knowledge of former students. Previously mentioned questionnaire items captured certain programmatic and self-reported learning outcomes. However, fitting with the recent emphasis on employment, the questionnaire addressed additional alumni outcomes through items about the nature of employment, satisfaction with employment characteristics, and actual and anticipated further studies.

Finally, we asked alumni to evaluate the learning environment because of its strong relationship with the curriculum as well as its emphasis in pre-survey interviews. The questionnaire asked about the quality of teaching and teaching methods, experiences in classes, faculty accessibility, and advising, which are mostly standard items gleaned from student evaluations of their instructors.

QUESTIONNAIRE DESIGN AND DATA COLLECTION

We collected views from alumni of the political science undergraduate program at Montana State University–Bozeman, a research-intensive, public land-grant university in the northwestern United States. The program had about 150 majors enrolled at the time of the survey and had graduated approximately 35 to 40 students annually in recent years. The major allowed students to concentrate in one of four areas: international relations, policy and analysis, political institutions, or political theory; the first became available in 2006 and the other three in 2009. The department also offers a Master of Public Administration (MPA) program with a typical enrollment of about 25 students.


Conducting this curriculum review served a number of pedagogical goals. By assisting in this review, MPA students taking a research-methods course learned about the collection, management, and analysis of both qualitative and quantitative data. They conducted a literature review under the professor’s supervision. They also developed an interview protocol and conducted interviews of purposively selected departmental faculty members and alumni. The interviews identified important concepts not found in the literature that might be important for our specific program. The students coded the interview text into nodes and subnodes and conducted analyses using qualitative analysis software (i.e., NVivo). They then worked with their professors to design the questionnaire. This iterative process involved multiple rounds of meetings as well as reviews of the text and questionnaire structure using web-based survey software (i.e., Qualtrics). The MPA students’ contributions were important in conceptually shaping the questionnaire and in refining the wording and formatting. The hands-on application of ideas and analytical techniques allowed them to experience the course material more tangibly, which resulted in high levels of interest and engagement. During the pilot phase of the project, a few dozen undergraduate students also participated by providing feedback about the construction and wording of the questionnaire. The undergraduates were given an opportunity to participate based on their enrollment in certain courses.

Given resource constraints and the relatively young age of the target population, we decided that a web-based questionnaire was most appropriate. The researchers worked with the university’s administration to obtain information about the population of interest, which included all students awarded a political science undergraduate degree by the institution from the fall of 1999 to the spring of 2014. Students with multiple majors were included only if they identified political science as their primary major. The population included 470 students; however, after supplementing university records with faculty records, valid e-mail addresses were available for only 308 of them. To avoid spam filters, the department head provided a hyperlink to the questionnaire in batches of 20 e-mail addresses. Alumni received a reminder from the department head several days later, and the survey closed after 15 days. The survey yielded a response rate of 34.4% (106 of the 308 valid e-mail addresses), although more recent graduates constituted a larger percentage of respondents. The respondents were 58.5% male and 39.8% female; 1.6% preferred not to answer (see footnote 1). The distribution of respondents in terms of gender and major concentration mirrored the population figures well.
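The batching and response-rate arithmetic described above can be sketched in a few lines. This is an illustrative reconstruction, not the actual procedure: the `batches` helper and the placeholder addresses are our own, and only the counts (308 valid addresses, 106 responses, batches of 20) come from the study.

```python
# Hypothetical sketch of the mailing procedure and response-rate arithmetic.
# Only the counts (308, 106, batch size 20) are from the article.

def batches(addresses, size=20):
    """Split a list of e-mail addresses into batches of the given size."""
    return [addresses[i:i + size] for i in range(0, len(addresses), size)]

valid_addresses = [f"alum{i}@example.edu" for i in range(308)]  # placeholder addresses
groups = batches(valid_addresses)

responses = 106
response_rate = round(100 * responses / len(valid_addresses), 1)

print(len(groups))      # 16 batches of up to 20 addresses
print(response_rate)    # 34.4 (the reported response rate)
```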

SELECTED FINDINGS

This section discusses selected findings that might be generally interesting or useful. All percentages discussed are the percentage of responses after removing the “not applicable” option.
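The convention above—reporting each share over the responses that remain after dropping “not applicable”—can be made concrete with a small helper. The counts below are hypothetical, invented for illustration; only the computation rule comes from the article.

```python
# Illustrative "valid percentage" computation: shares are taken over
# substantive responses only, after removing the "not applicable" option.
# The example counts are hypothetical, not the survey's raw data.

def valid_percentages(counts):
    """Percentage of each substantive response, excluding 'not applicable'."""
    substantive = {k: v for k, v in counts.items() if k != "not applicable"}
    total = sum(substantive.values())
    return {k: round(100 * v / total, 1) for k, v in substantive.items()}

counts = {"very useful": 60, "fairly useful": 30, "not useful": 10, "not applicable": 6}
print(valid_percentages(counts))
# {'very useful': 60.0, 'fairly useful': 30.0, 'not useful': 10.0}
```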

Program Structure

Large majorities of alumni viewed the structural elements of the program as “very useful” or “fairly useful.” Respondents viewed the requirement to take core foundation courses as particularly useful and the sequencing of courses as less useful. However, only about a third claimed to have taken courses in sequence, which likely influenced their views. They were relatively less satisfied with the frequency of course offerings (6.3% “excellent”; 50.0% “good”). Respondents also generally viewed curriculum supplements as “very beneficial.” In descending order, these supplements were study abroad (70.4%), internships (70.0%), political campaigning (51.7%), community activism (36.7%), student clubs (36.2%), and volunteering (32.8%). Open-ended comments called for the department to do more in terms of helping students obtain internships and communicating about internships. For example, one respondent wrote, “Push for more internships. Even more than more.”

Content/Knowledge

Large majorities of alumni believed themselves well equipped across the 11 content areas. The one exception was foreign language (28.8% “very well” or “fairly well”), in which case they acknowledged that taking only one or two courses fell short of mastery. Open-ended comments called for more thorough coverage and stronger requirements related to research methods and design, quantitative research and statistical analysis, and foreign languages. In a representative comment, one respondent suggested “integrating more quantitative and analysis-type courses into the curriculum.”

Skills

The questionnaire categorized skills in three areas: communication, research and analysis, and professional. Again, large majorities believed they were prepared (either “very well” or “fairly well”). The specific skills that received the most positive responses were critical thinking (59.8% “very well” prepared), reading comprehension (52.3%), and writing (48.6%). Responses about professional skills, particularly leadership (26.2%), also were positive but trailed the other skills. In terms of skill importance, standouts were writing, interpersonal communication, critical thinking, locating information, and reading comprehension—all “very important” at or higher than 80%. The skills of applying theory to research questions (27.1% “very important”) and analyzing numerical data (37.4%) were at the lower end of importance in respondents’ daily lives. Open-ended comments covered multiple directions, but several respondents advocated for more hands-on activities and real-world applications. A representative comment was: “Getting some ‘real-world’ experience during college years is fundamental to be able to jump right into the job market.”

Outcomes

For current employment, 45.3% of respondents reported being employed in the private sector, 27.4% in the public/governmental sector, and 13.2% in the nonprofit/nongovernmental sector; the question did not apply for 14.2%. Large majorities were “very satisfied” or “somewhat satisfied” with the fit, level of responsibility, location, and weekly number of hours of their position. About half of the respondents reported that they had already begun or finished further studies; 40% anticipated enrolling in additional studies. In the open-ended comments, some alumni voiced frustration with student debt and job opportunities for political science graduates.


Learning Environment

Overall, respondents positively evaluated quality of teaching, experiences in classes, and faculty performance. However, they noted that advising—particularly about careers—and providing information about activities and opportunities could be better. Majorities typically rated information provision as “fair” or “poor.”

Although interest in our specific findings is likely to vary, we encountered some unexpected results. We underestimated the perceived importance of experiences such as internships and study abroad. We also were surprised by the number of students who were working in the private sector as well as the percentage of alumni who went on to further studies.

ACTIONS TAKEN AND FURTHER STEPS

The survey results fed into a collaborative faculty process for curriculum revision, which was a part of the department’s broader, ongoing assessment efforts. The department decided to make significant changes to the curriculum based on the alumni survey results, recommendations from the literature and the discipline, and faculty input. Table 1 outlines the process from goals and objectives to information gathering to outcomes. The first column indicates the university and department goals or objectives under which the changes were categorized. The middle column lists potential areas for improvement based on survey results and review of the literature. The third column shows the consequential actions taken by the department.

Table 1 Actions Based on Alumni Survey Results and Literature Review

Note: Other goal categories in the 2012 strategic plan at the university level are engagement, integration, access, and stewardship. This table addresses the most relevant goals that resulted in departmental action.

As a result of the curriculum review, the department submitted formal paperwork for a curriculum revision. It would expand the core to be more inclusive of subfields, provide for stronger course-sequencing requirements, and better integrate content and skill development across courses and levels. Additionally, the revision would enhance the research-methods sequence, with the primary change being a new mandatory 300-level research-experience course. Furthermore, although alumni viewed the concentrations favorably overall, the department decided that maintaining the concentrations conflicted with other key observations. In particular, the new curriculum would ensure that courses are available and that students have curricular flexibility—both key observations from the survey. The expanded core would improve foundational breadth in terms of knowledge and skills, and students could decide whether to pursue depth or breadth in the upper-division courses (thereby improving flexibility). Finally, the department took seriously the feedback about advising and providing information about internships and other opportunities by adding central advising services and a department newsletter. In summary, our low-cost survey generated findings that have resulted in rapid implementation of programmatic and curricular changes.

The purpose of sharing our curriculum-review process, instrument, and decisional outcomes is to provide an example of a process that engaged students and faculty in meaningful and productive ways. The results fed into our larger assessment activities and provided data about learning and employment outcomes that are of interest to prospective students. We also learned from the data-collection process. For example, a more comprehensive study might locate students who either changed majors or did not graduate to solicit their views. Additionally, a program might consider better incorporating departmental or institutional “values” (e.g., citizenship and freedom; see Smoller 2004) into the questionnaire. Finally, departments may benefit from integrating this type of questionnaire into formal assessment of learning procedures as well as using the results to leverage additional resources. Regardless of the specific approach, we believe alumni constitute a reserve of invaluable information to help departments evaluate their present state and plan for their future.

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit https://doi.org/10.1017/S1049096517000695.

Footnotes

1. Marginal percentages may not sum to 100.0% exactly due to rounding.

References


Bennett, Douglas C. 1991. “Political Science within the Liberal Arts: Towards Renewal of Our Commitment.” PS: Political Science & Politics 24 (2): 201–4.
Breuning, Marijke, Paul Parker, and John T. Ishiyama. 2001. “The Last Laugh: Skill Building through a Liberal Arts Political Science Curriculum.” PS: Political Science & Politics 34 (3): 657–61.
Cassese, Erin C., Angela L. Bos, and Lauren E. Duncan. 2012. “Integrating Gender into the Political Science Core Curriculum.” PS: Political Science & Politics 45 (2): 238–43.
Deardorff, Michelle D. 2016a. “Assessing over Time: The Political Science Program Review.” PS: Political Science & Politics 49 (1): 103–6.
Deardorff, Michelle D. 2016b. “Assessment in Political Science Redux: Introduction.” PS: Political Science & Politics 49 (1): 83–7.
Deardorff, Michelle D., Kerstin Hamann, and John Ishiyama, eds. 2009. Assessment in Political Science. Washington, DC: American Political Science Association.
Hill, Kim Quaile. 2002. “The Lamentable State of Science Education in Political Science.” PS: Political Science & Politics 35 (1): 113–16.
Huerta, Juan Carlos. 2015. “Challenges to, and Suggestions for, Merging Research and Teaching in Undergraduate Regional Public Universities.” PS: Political Science & Politics 48 (1): 58–60.
Ishiyama, John. 2005. “The Structure of an Undergraduate Major and Student Learning: A Cross-Institutional Study of Political Science Programs at Thirty-Two Colleges and Universities.” The Social Science Journal 42 (3): 359–66.
Ishiyama, John, Marijke Breuning, and Linda Lopez. 2006. “A Century of Continuity and (Little) Change in the Undergraduate Political Science Curriculum.” American Political Science Review 100 (4): 659–65.
Ishiyama, John, and Stephen Hartlaub. 2003. “Sequential or Flexible? The Impact of Differently Structured Political Science Majors on the Development of Student Reasoning.” PS: Political Science & Politics 36 (1): 83–6.
McClellan, E. Fletcher. 2015. “Best Practices in the American Undergraduate Political Science Curriculum.” In Handbook on Teaching and Learning in Political Science and International Relations, ed. John Ishiyama, William J. Miller, and Eszter Simon, 3–15. Northampton, MA: Edward Elgar Publishing.
Parker, Jonathan. 2010. “Undergraduate Research-Methods Training in Political Science: A Comparative Perspective.” PS: Political Science & Politics 43 (1): 121–25.
Robinson, Andrew. 2013. “The Workplace Relevance of the Liberal Arts Political Science BA and How It Might Be Enhanced: Reflections on an Exploratory Survey of the NGO Sector.” PS: Political Science & Politics 46 (1): 147–53.
Schomburg, Harald, and Ulrich Teichler. 2005. “Increasing Potentials of Alumni Research for Curriculum Reforms: Some Experiences from a German Research Institute.” New Directions for Institutional Research 126: 31–48.
Smith, Michael, and Bob Graham. 2014. “Teaching Active Citizenship: A Companion to the Traditional Political Science Curriculum.” PS: Political Science & Politics 47 (3): 703–10.
Smoller, Fred. 2004. “Assessment Is Not a Four-Letter Word.” PS: Political Science & Politics 37 (4): 871–74.
Wahlke, John C. 1991. “Liberal Learning and the Political Science Major: A Report to the Profession.” PS: Political Science & Politics 24 (1): 48–60.
Young, Candace C. 2016. “Survey of Assessment Practices in Political Science.” PS: Political Science & Politics 49 (1): 93–98.