
Writing Better Writing Assignments

Published online by Cambridge University Press:  19 June 2014

Allison Rank, SUNY Oswego
Heather Pool, Denison University

Abstract

Although most instructors care deeply about student writing, they often give little attention to the part of the writing process over which they maintain complete control: the assignment itself. Yet, the written prompt that we distribute is often where student confusion (and confused writing) begins. Using Bloom’s taxonomy as inspiration, we offer instructors a typology directly linked to course objectives, which we believe can be readily understood by student writers.

Type: The Teacher

Copyright © American Political Science Association 2014

Faculty members care deeply about student writing and turn a critical eye to their syllabi, lesson plans, and teaching styles in an effort to improve it (Beyer, Taylor, and Gillmore 2013). However, they do not often directly examine the part of the assessment process over which they maintain complete control and on which they rarely receive feedback: the format of the assignments themselves. We argue that the intent, structure, and wording of a prompt all help promote or impede student learning. In response, we developed a typology of assignment objectives as well as a series of suggestions for structuring and wording prompts. We review each in turn.

In the past three decades, numerous authors have bemoaned the declining quality of writing on college campuses. The first response was the development of writing centers, with which we have personal experience: we were both directors of a social science writing center at a large public research university (see footnote 1). The observations and suggestions in this article draw on our experiences and our challenges as we began to teach our own classes and develop our own writing assignments. Serving as graduate-student directors of a political science writing center exposed us to prompts from a variety of subfields and course levels. This provided the opportunity to think more deeply about how the wording and structure of writing assignments affect student writing.

The second response to concerns about student writing was a push to develop curricula that more fully incorporate writing into course structures. Scholars offer a number of suggestions for how best to accomplish this second goal. First, evaluation measures, including exams and longer paper assignments, should be linked directly to the overall learning objectives of the course. Rather than being treated merely as a way to measure content mastery, writing assignments should be conceptualized along with learning objectives. In Writing in the Academic Disciplines, David R. Russell observes that the shift to mass education and the development of specific disciplines created “specialized text-based discourse communities, highly embedded in the differentiated practices of those communities,” within which “knowledge and its expression could be conceived of as separate activities” (Russell 2002, 5). Instructors’ efforts to couple “knowledge and its expression” in their courses are exemplified in the course-design process at McGill University. As they begin to outline courses, instructors are encouraged to conceptualize learning objectives (to be clearly communicated to students) and to create assignments that correspond to these objectives in form and function (Saroyan and Amundsen 2004).

Second, scholars increasingly highlight the need to incorporate multiple types of writing with multiple objectives within a course rather than relying on a single, high-stakes final assignment. Çavdar and Doe (2012) emphasize that if writing is being used as a means to teach critical-thinking skills, then assignments must be scaffolded so that students can respond to feedback and build skills by first completing preparatory assignments. Coffin and colleagues emphasize the importance of nongraded writing as a way to teach students the value of both iterative and free writing as ways to separate “the idea-generating phases of writing from more critical editorial stages” (Coffin et al. 2003, 36). Bain similarly suggests that instructors use free-writing assignments as a place for students “to struggle with their thoughts without facing assessment…to try, come up short, receive feedback on their efforts, and try again before facing any ‘grading’” (Bain 2004, 57). Scriven’s distinction between formative and summative assessments is particularly useful for distinguishing between types of writing assignments: formative assessments emphasize feedback over evaluation, whereas summative assessments involve “making a final judgment about the learning at a particular point in time” (Scriven 1981, quoted in Weston and McAlpine 2004, 98).

Finally, instructors have turned a critical eye toward the way that writing skills, particularly discipline-specific skills, are conveyed to students. This is especially important in introductory classes, in which students are expected to grasp the basics of discipline-specific writing that they will need in upper-division coursework. Yet, it is often in introductory-level courses that writing instruction is secondary to content acquisition; that is, introductory courses often rely on “knowledge-telling” assessments rather than those that assess “knowledge-transformation” (Çavdar and Doe 2012). To respond to this challenge, Baglione (2008) highlights the need to break down specific steps of a research project rather than assuming that students understand how to work within the research format. Souva contends that students’ inability to engage in theory building could be addressed by instructors placing “greater emphasis on learning at least a basic system of logic” to help students better understand the “construction of theoretical arguments” (Souva 2007, 557). By receiving instruction on the expected format and logic of work in our discipline, students will be better equipped as both readers and writers of discipline-specific content.

These scholars offer several useful suggestions about how course structure and assignment types work together to support efforts to improve student writing. Yet, they do little to clarify how to structure the prompts themselves (as opposed to structuring writing in the course as a whole). From our experience in the writing center and as early-career instructors, we sought literature about how to actually write writing assignments. What we found was helpful in determining the goals and types of assignments, but it offered little specific guidance about how to write a clear and achievable writing assignment, pitched at the right level to achieve specific aims. This became our goal in writing this article: to provide guidelines for writing assignments that will improve student writing and learning.


We provide herein a typology for writing prompts that melds these components and provides practical support for instructors. In 1956, the publication of the Taxonomy of Educational Objectives, The Classification of Educational Goals, Handbook I: Cognitive Domain—commonly known as Bloom’s taxonomy—provided educators with “a basis for test design and curriculum development” (Anderson and Krathwohl 2001). Bloom’s text outlined six cognitive objectives of increasing complexity that should be included in a comprehensive education program: knowledge, comprehension, application, analysis, synthesis, and evaluation.

Almost 50 years later, an original contributor proposed revisions to account for the considerable changes in education since Bloom’s work was published. Anderson and Krathwohl’s A Taxonomy for Learning, Teaching, and Assessing (2001) updated the original cognitive objectives to the following six cognitive processes: remember, understand, apply, analyze, evaluate, and create (table 1). Anderson and Krathwohl’s work is geared toward primary and secondary education (see footnote 2). In this article, we seek to adapt their revised taxonomy to the concerns of college-level political science classes. We also provide tools to create assignments that move students from content summary to synthetic and analytic writing.

Table 1 Revision of Bloom’s Taxonomy by Anderson and Krathwohl

Note: This chart is reproduced from Anderson and Krathwohl (2001, 67–68).

We summarize our additions and revisions to Anderson and Krathwohl’s cognitive objectives in table 2a. First, we changed the first two cognitive objectives from “remember” and “understand” to “summarize” and “relate.” Because our focus is higher education rather than primary and secondary schooling, these more complex objectives make sense. Our following three objectives overlap with those in the revised taxonomy; however, we again revised them slightly to focus on college-level political science courses.

Table 2a Cognitive Objectives, Associated Command Terms, Benefits, and Prerequisites

Second, taking inspiration from the International Baccalaureate Organization’s guidelines for history exams (International Baccalaureate 2008), we present a list of associated command words for each cognitive objective. Terms such as define and describe work well with summary assignments; compare and contrast work well with relational assignments; and so forth. In short, not only do instructors need to consider the goal of each assignment, but they also should ensure that the language of the assignment matches the stated goal. To help students understand how these goals differ, instructors should also consider providing definitions of the command terms. If the goal of an assignment is to test recall, asking students to “justify” their reasons may not make sense. Likewise, if instructors assign a final, major-application assignment, a term such as define might be used, but it should not be a substantial part of the assignment. It may be a guiding question under a secondary prompt; however, if instructors are asking students to apply recently learned theories to new domains, definitions should perhaps not be the primary goal. We are not suggesting that instructors limit the words they use in writing assignments to only these command terms. However, using terms precisely in a writing assignment does matter. For example, there is a significant difference between asking a student to describe (e.g., “Give a detailed account”) and to discuss (e.g., “Offer a considered and balanced review that includes a range of arguments, factors or hypotheses. Opinions or conclusions should be presented clearly and supported by appropriate evidence.”). If instructors ask students to describe and they discuss, we may be thrilled with the extra effort; however, if we ask students to discuss and they describe, their answers will be deficient. It is our responsibility as instructors to use terms that clearly state what we want, to make obvious what those terms mean, and not to penalize students who do what is asked rather than what we wanted them to intuit. Being clear and consistent regarding command terms is one way to ensure that the question being asked is, in fact, the question to be answered.
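
This pairing lends itself to a simple checklist. The following is a minimal, purely illustrative sketch in Python (not part of the typology itself; the dictionary contents come from the command-term lists in the sections that follow, and the helper name matching_terms is a hypothetical choice) that flags which verbs in a draft prompt match its stated objective.

# Minimal sketch (illustrative only): cognitive objectives mapped to the
# command terms listed later in this article, used to check whether a draft
# prompt's verbs match the objective the assignment is meant to serve.

COMMAND_TERMS = {
    "summarize": {"define", "describe", "summarize"},
    "relate": {"exemplify", "classify", "compare", "contrast", "distinguish"},
    "analyze": {"organize", "attribute", "deconstruct", "examine", "to what extent"},
    "evaluate": {"assess", "explain", "justify", "examine", "to what extent"},
}


def matching_terms(prompt, objective):
    """Return the command terms in the prompt that fit the stated objective."""
    text = prompt.lower()
    return sorted(term for term in COMMAND_TERMS[objective] if term in text)


prompt = ("Compare the security of property in Hobbes's state of nature "
          "to the security of property in Locke's state of nature.")
print(matching_terms(prompt, "relate"))   # ['compare']
print(matching_terms(prompt, "analyze"))  # []

Running such a check against each secondary prompt before distributing an assignment is one low-cost way to confirm that the language of the assignment matches its stated goal.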

Third, we identify the benefits provided to both students and instructors through particular types of prompts. By briefly describing the benefits of specific cognitive objectives, we want to help instructors think more critically about the prompts that they create and how best to explain them to students.

Finally, we identify the prerequisites necessary for students to successfully answer a particular type of prompt. As instructors, we often can clearly see the obvious route to a correct answer. However, we look at prompts with an already disciplined eye; we know the assumptions and limitations of our field, as well as which types of evidence are considered appropriate. Asking students to use their content knowledge without a pointed discussion of what good analysis looks like, what counts as strong evidence, and so forth leaves them facing prompts that they find mystifying.

CONSTRUCTING BETTER WRITING ASSIGNMENTS

Having identified cognitive objectives and preferred terms, how might we now format the overall prompt? As noted previously, instructors regularly bemoan students’ subpar writing skills. We are not arguing that these assessments are incorrect. Rather, we are arguing that as the writers of writing assignments, we may have a role in our students’ inability to write clear papers when we give assignments that do not clearly explain how students are to respond.

We offer suggestions on how to prepare students to understand the purpose of a paper. We suggest beginning with a primary question that is quite broad: for example, “What is the role of property in Locke’s political thought?” A student’s thesis statement should respond to this question. Although we might initially think that questions like this are too broad for an undergraduate course (and perhaps even for a dissertation), we suggest narrowing the field of inquiry by using secondary questions to frame the student’s response. We might follow the primary question, “What is the role of property in Locke’s political thought?,” with two or three secondary guiding prompts featuring clear command terms. For example:

Secondary prompt 1: “Using Locke’s discussion of property in the state of nature, describe the relation between private property and freedom.”

Secondary prompt 2: “Assess Locke’s assertion that concentrations of private property in the hands of some will not interfere with the political freedom of those who do not own real property beyond their bodies.”

These secondary prompts clarify how students should structure their response and also ensure that they grasp that it is the primary question to which their thesis should respond.

Why structure questions in this way? Instructors often use a host of questions that seem clear and well ordered, but students frequently respond with paralysis. They come into a writing center asking: “Which of these questions should I answer? Do I answer all of them? Which is the main question?” We suggest that this is one reason why students fail to write clear thesis statements; they are not sure to which question they should respond. When instructors throw three or four questions at students, it can be challenging for beginning writers to distinguish between the central question (i.e., the point of the assignment) and secondary questions that serve to direct their focus to particular texts or concepts. Using a primary-/secondary-question framework can first help students clearly see the main point and then structure their essay or exam in relation to that main point; this also ensures that they address the more specific topics posed by the secondary questions. Moreover, this structure helps an instructor locate and communicate an overarching theme that aligns with course objectives, while also providing additional structure that addresses the specifics of each text or the goals of the assignment.
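
As a minimal sketch of this framework (illustrative only; the labels and layout below are assumptions rather than a prescribed template, and the helper name build_prompt is hypothetical), the Locke example above can be assembled in Python so that the primary question and the numbered secondary prompts remain visually distinct:

# Illustrative sketch of the primary-/secondary-question framework described
# above; the labels and formatting are assumptions, not a required format.

def build_prompt(primary, secondary):
    """Format one primary question plus numbered secondary guiding prompts."""
    lines = ["Primary question (your thesis should answer this): " + primary, ""]
    for i, sub in enumerate(secondary, start=1):
        lines.append("Secondary prompt {}: {}".format(i, sub))
    return "\n".join(lines)


print(build_prompt(
    "What is the role of property in Locke's political thought?",
    ["Using Locke's discussion of property in the state of nature, describe "
     "the relation between private property and freedom.",
     "Assess Locke's assertion that concentrations of private property in the "
     "hands of some will not interfere with the political freedom of those "
     "who do not own real property beyond their bodies."],
))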


TIPS, DEFINITIONS, AND EXAMPLE PROMPTS

In this section, we discuss in more detail how we perceive each cognitive objective fitting into a course, define associated command terms, and provide sample prompts from two subfields.

Summarize

Prompts focused on the cognitive objective summarize ask students to demonstrate their grasp of previously presented material. These prompts require them, in their own words, to communicate information covered in lectures, readings, or prior classes. Although lower-level courses may ask students to summarize in high-stakes assignments (e.g., midterm or final exams), summary prompts are best used in free-writing assignments designed to (1) allow the instructor to determine whether students are ready to move on to more complicated material, or (2) provide a springboard for class discussions.

We suggest avoiding terms such as specify and identify unless a word or phrase is an acceptable answer to the question or the term is paired with another summary term. For example, if a question asks students to identify the type of electoral system used in the United States of America, England, and Australia, it would be correct to write the following: “The United States of America and England use plurality (first-past-the-post) systems, whereas Australia uses a preferential (majoritarian) system.” Conversely, asking students to identify and define or to describe the electoral systems of the United States of America, England, and Australia communicates the expectation of not only naming the systems used but also outlining their particular characteristics. The following command terms are associated with summarize (see footnote 3):

Define: “Give the precise meaning of a word, phrase, concept, or physical quantity.”

Describe: “Give a detailed account.”

Summarize: “Abstract a general theme or major point(s).”

Comparative Politics (CP): “Identify and describe major components of a ‘first-past-the-post’ electoral system.”

International Relations (IR): “Describe how the Treaty of Westphalia created the modern state.”

Relate

Relate prompts focus on the ability of students to develop connections among concepts, events, and actors. These prompts, for example, might attend to how different theories use the same word to describe different concepts, how they use different words to describe the same concept, how one event or person fits in a larger narrative, or how two theories approach one event.

These types of questions work best when the central themes or questions that guide the course have been identified and discussed in advance and can be used to shape the relational question. That is, asking students to relate two distinct analyses works best when they have been prepared by thinking about theories or events in conceptually related ways that are held together by major ideas or questions. One way to help students prepare for these types of assignments is to use language and terms consistently in lectures and to ensure that key terms and concepts are clearly defined during in-class discussion. The following command terms are associated with prompts that ask students to relate:

Exemplify: “Find a specific example or illustration of a concept or principle.”

Classify: “Determine that something belongs to a category.”

Compare: “Give an account of the similarities between two (or more) items or situations, referring to both (all) of them throughout.”

Contrast: “Give an account of the differences between two (or more) items or situations, referring to both (all) of them throughout.”

Distinguish: “Make clear the differences between two or more concepts or items.”

American Politics (AP): “Contrast President Hoover’s ideas about the role of government with President Roosevelt’s, using evidence from the start of the Depression to the start of World War II.”

Political Theory (PT): “Compare the security of property in Hobbes’s state of nature to the security of property in Locke’s state of nature.”

Analyze

Prompts that ask students to analyze a particular piece of content assess students’ ability to deconstruct arguments using logic and/or disciplinary standards. Our courses, readings, and assignments are bound by the disciplinary standards of the field as well as each subfield. These standards often include unstated assumptions that are readily apparent to scholars and instructors but that range from invisible to mystifying for students. Most texts, by necessity, feature a set of unstated assumptions ranging from what constitutes power to what is meant by the term institution. We often accept these assumptions without comment despite the influence they may have on the overall direction of a text. When we ask students to analyze (or deconstruct) a particular text, we ask them to critically engage with these often-unstated disciplinary standards. This helps students to understand argumentative structure and disciplinary expectations.


Although a deep understanding of the contours of a subfield may not be necessary for students in introductory courses, elucidating the (often hidden) ideological assumptions that structure our field can develop students’ critical-analysis skills as well as improve their ability to break down arguments into component parts. These prompts work best in classes in which there are overt discussions about the ideological underpinnings of the subject matter, as well as those that address structures of logic and argument. These prompts presume that students have time to read and analyze new material, which makes them best suited to take-home exams or papers, in which students can review new material and apply the skill set. The following command words are associated with prompts asking students to analyze a piece of text:

Organize: “Determine how elements fit or function within a structure.”

Attribute/Deconstruct: “Determine a point of view, bias, values, or intent underlying presented material.”

Examine: “Consider an argument or concept in a way that uncovers the assumptions and interrelationships of the issue.”

To What Extent: “Consider the merits (or otherwise) of an argument or concept. Opinions and conclusions should be presented clearly and supported with appropriate evidence and sound argument.”

PT: “To what extent does Locke assume a European standard of property and citizenship?”

CP: “To what extent can the assumptions of democratic peace theory be applied to emerging democracies?”

Evaluate

Prompts that ask students to evaluate test their ability to assess claims according to disciplinary standards by synthesizing the skills discussed previously. The move to evaluation assumes that students are prepared to enter the intellectual back-and-forth that characterizes the development of any subfield. This type of prompt can benefit students in three ways. First, it exposes them to the standards of evidence-based inquiry. Acceptable evidence varies among subfields; explicitly noting the differences gives students a better and broader understanding of the field of political science.

Second, asking students to evaluate helps them to consider the implications of the theories being discussed in a given course. The expectations for answering these questions may differ across subfields; that is, how to consider the implications of a theoretical inquiry (e.g., “How does liberalism conceive of citizenship?”) may differ from how to consider those of a policy question (e.g., “How will shifting revenue streams affect services to group X?”). These types of questions are crucial in all subfields and at all course levels because they help students to understand what our discipline as a whole has to offer and to differentiate among the various contributions of our subfields.

Third, this type of assignment may ask students to evaluate claims through the lens of gender, race, or postcolonial studies, among others. Integrating prompts of this type into a course demonstrates recognition of the historical and contemporary limitations of political science as a field. It also asks students to evaluate rather than simply accept the conclusions offered by various authors.

With the proper preparation, prompts of this type can help students to think specifically about what this kind of inquiry helps them to see or understand that other approaches do not. Writing assignments that ask students to evaluate can go beyond the classroom to help them understand why political science matters not just for their grade but also for their political community more generally. Two of the command terms associated with this type of prompt are related to examine and to what extent; the other three are specific to evaluation:

Assess: “Measure and judge the merits and quality of an argument or concept and clearly identify and explain the evidence for your assessment.”

Explain: “Give a detailed account including reasons or causes.”

Justify: “Give valid reasons or evidence to support an answer or conclusion.”

AP: “Given the denial of social security benefits for agricultural and domestic workers and the internment of Japanese Americans during President Roosevelt’s time in office (among other racial exclusions), explain the role of the New Deal in setting the stage for the 1960s Civil Rights Movement.”

IR: “Assess the validity of two of the following three decision-making theories using the Cuban Missile Crisis as an example: rational actor, bureaucratic process, and political process.”

Create

Prompts that ask students to create new content using the knowledge and skills learned in the course are critically important, not only because they require mastery of a field of knowledge but also because developing this ability is one of the main reasons to seek higher education. Asking students to generate their own research questions, research designs, policy proposals, and other types of content appropriate for a given course helps educators to evaluate not only what students have learned but also whether they can transfer this knowledge to a new domain. Moreover, asking students to apply already-learned knowledge to new hypothetical concepts or subject areas engages their creativity far more than recall or comparison questions (such responses often are more interesting to grade as well). There is more scope for students to think broadly and for instructors to evaluate not only knowledge but also the ability to transfer it across preestablished boundaries.

Interestingly, prompts that ask students to create new content tend to be either the most open (e.g., “Write a research paper on a topic related to the course material”) or the most detailed (e.g., three to four pages of directions on topic restrictions, directions for writing style, lists of issues that must be considered, and so on). Yet, it is often unclear how much class time is spent preparing students with the skills necessary to undertake this level of creation. Within the context of the course, students should see the earlier cognitive objectives (i.e., relate, analyze, and evaluate) put into practice. Furthermore, an assignment with multiple opportunities for guidance and feedback will support students in learning how to turn their critical eye toward their own ideas. We do not offer specific command terms for this type of prompt because several sources provide excellent suggestions and because we believe this should be an iterative and scaffolded process between an instructor and a student.

Reflect

There is one final type of assignment, but we see it as standing outside the general typology, as illustrated in table 2b.

Table 2b Cognitive Objectives, Associated Command Terms, Benefits, and Prerequisites

Asking students to reflect on and assess their own views and opinions in light of the knowledge gained through their coursework provides an opportunity to incorporate an important aspect of the knowledge side of the revised Bloom’s taxonomy: meta-knowledge (or meta-cognition). Although reflect questions increasingly appear in courses with service-learning opportunities, we encourage instructors who ask students to write multiple papers to also ask them to reflect on how the feedback on their first paper influenced their approach to their second. Because this prompt asks students to evaluate their own experiences, it should use terms that encourage reflection on the process of writing, as follows:

Example: “Read over the feedback you received on your first draft. In no more than one page (single-spaced), please:

(1) Describe your writing process.

(2) Reflect on what you can do to improve the process of writing to address concerns about the product of your writing.

(3) Explain what you did differently as you worked on your second draft to address the identified concerns.”

CONCLUSION

The literature on integrating writing into curriculum provides invaluable suggestions about why and when we ask students to write, which is a crucial step. However, as early-career instructors seeking to integrate writing into our own courses in a meaningful way, we found little guidance about how to ask students to write. Our aim in this article is to provide a resource for our peers as they help students to become better writers.

ACKNOWLEDGMENTS

The authors thank panelists, discussants, and audience members at the Western Political Science Association Conference and the 2013 APSA Teaching and Learning Conference for their helpful feedback. Thanks are also due to the American Political Science Association and the University of Washington for financial support to attend these conferences. We also extend our gratitude to Erin Richards at Cascadia Community College (CCC) in Bothell, Washington, for several helpful suggestions about how to improve this article and for an opportunity to present this research to the CCC faculty.

Allison Rank will be an assistant professor at SUNY Oswego in Fall 2014. She can be reached at .

Heather Pool is an assistant professor at Denison University. She can be reached at .

Footnotes

1. Heather Pool was the director of the Political Science/Law, Societies & Justice/Jackson School of International Studies Writing Center at the University of Washington, Seattle, from 2009 to 2011; Allison Rank succeeded her from 2011 to 2013. The Center offered discipline-specific support for all types of writing at any stage of the process. The Center typically offered more than a thousand student tutoring sessions each year.

2. Although this is not stated explicitly, the text provides numerous examples set in primary- and secondary-school settings but none set in a college-level classroom.

3. These command terms and those in the following sections are drawn from International Baccalaureate (2008, 90) and Anderson and Krathwohl (2001, 67).

REFERENCES

Anderson, Lorin W., and Krathwohl, David R., eds. 2001. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman.
Baglione, Lisa. 2008. “Doing Good and Doing Well: Teaching Research-Paper Writing by Unpacking the Paper.” PS: Political Science and Politics 41 (3): 595–602.
Bain, Ken. 2004. What the Best College Teachers Do. Cambridge, MA: Harvard University Press.
Beyer, Catharine Hoffman, Taylor, Edward, and Gillmore, Gerald M. 2013. Inside the Undergraduate Teaching Experience: The University of Washington’s Growth in Faculty Teaching Study. Albany: State University of New York Press.
Çavdar, Gamze, and Doe, Sue. 2012. “Learning through Writing: Teaching Critical-Thinking Skills in Writing Assignments.” PS: Political Science and Politics 45 (2): 298–306.
Coffin, Caroline, Curry, Mary Jane, Goodman, Sharon, Hewings, Ann, Lillis, Theresa M., and Swann, Joan. 2003. Teaching Academic Writing: A Toolkit for Higher Education. London: Routledge.
International Baccalaureate. 2008. Diploma Programme: History Guide. Cardiff, UK: International Baccalaureate Organization.
Russell, David R. 2002. Writing in the Academic Disciplines: A Curricular History. 2nd ed. Carbondale: Southern Illinois University Press.
Saroyan, Alenoush, and Amundsen, Cheryl, eds. 2004. Rethinking Teaching in Higher Education: From a Course-Design Workshop to a Faculty-Development Framework. Sterling, VA: Stylus Publishing.
Souva, Mark. 2007. “Fostering Theoretical Thinking in Undergraduate Classes.” PS: Political Science and Politics 40 (3): 557–61.
Weston, Cynthia, and McAlpine, Lynn. 2004. “Evaluating Student Learning.” In Rethinking Teaching in Higher Education: From a Course-Design Workshop to a Faculty-Development Framework, ed. Saroyan, Alenoush and Amundsen, Cheryl, 95–114. Sterling, VA: Stylus Publishing.