Faculty Guidance on Generative AI Writing Tools

Updated September 2, 2023

For questions or comments on this page, contact Traynor Hansen, Director of Campus Writing

Introduction

The recent emergence of AI writing software—especially ChatGPT at the end of 2022—has introduced several questions about the role of writing in educational environments. If students can input an assignment prompt into ChatGPT and get something that looks like a passable response (in readable prose!), what does that mean for the way we typically assign writing and for the place of writing in our pedagogy? Is the college essay dead? Or should we give up on take-home essays and ask our students to do all of their writing by hand in class, just like the “olden days”?

These questions feel existential for a lot of faculty, and for good reason. The Writing Program at SPU is concerned not only with the final papers that students write but, even more importantly, with the process through which students generate, draft, revise, edit, and proofread those eventual final papers. We are also committed to helping faculty work through some of those issues so that we can respond thoughtfully and carefully to the challenges presented by AI writing tools.

The suggestions that follow are not meant to be once-and-for-all solutions. Nor are they meant to send ChatGPT back to where it came from, never to darken the doorsteps of university writing classes again. Instead, they are meant as an invitation to think about some of the unique and familiar challenges presented by AI writing tools—and also about better and worse ways to respond to those challenges. This is just one part of the effort at SPU to develop a thoughtful response to the potential impacts of AI on higher education.


Be Mindful of Your Feelings

For many folks, the entire conversation surrounding ChatGPT and its alternatives evokes feelings of fear, dread, and despair. These tools seem to threaten core elements of our pedagogical and assessment practices. Those feelings are intensified by the media frenzy, both in the popular press and in higher-ed publications, and by the hype surrounding ChatGPT’s release, with breathless announcements that ChatGPT “passes the bar exam” or can “pass freshman year at Harvard.” In addition to the media coverage, when many non-specialists (including me) encounter language like “artificial intelligence,” “machine learning,” and “neural networks,” we are often faced with rhetorically powerful language that speaks more to an apocalyptic imagination than to the sometimes more mundane challenges of teaching college writing.

As academics, we are well-practiced in receiving rhetorically charged media stories with an appropriate degree of skepticism. At the same time, judging by the tenor of the conversation surrounding ChatGPT among educators, AI writing tools seem to speak to our sense of professional security in unique ways.

Setting aside—just for a moment—the degree to which these fears are justified or not, it is worth considering where we are getting our knowledge of AI writing tools, including how they work; what threat they pose to college writing specifically and to higher education more generally; and what they “can do.”

When you think about your own feelings about ChatGPT, for instance, where have you encountered the most worrisome discussion of its dangers? If through news sources, what kinds of expertise have been included in the reporting, and what kinds are missing or underrepresented? How has social media (and its algorithms) influenced your thinking, whether through technologists declaring that “we can’t prepare for what’s coming,” colleagues and other educators announcing their plans to shift to fully in-class writing, or others enthusing about fully incorporating ChatGPT into their classes to “prepare our students for careers in a new AI-driven world”? Have you heard horror stories from colleagues who have received papers written by ChatGPT, or do you even have some stories of your own?

These are all the kinds of questions that we encourage our students to think through carefully as we teach them information literacy. But our ability to teach these questions does not make us immune to getting caught up in sensationalized accounts that can act powerfully on us. When we do get caught up, our fears and anxieties are stirred, and, for many of us, our fight-or-flight impulses can drive us to reactionary measures rather than measured responses that play to the pedagogical strengths we have developed over the years.

So, while there may be real causes for concern over how AI writing tools impact how we assign, teach, and assess writing in higher ed, the first step in addressing those concerns should be taking stock of our own feelings and the reasons for those feelings. This might also mean being willing to cultivate an openness to responding in a variety of ways.

Is ChatGPT Cheating?

Resist the urge to roll your eyes at this one. I have had students cheat through ChatGPT, at least by the standards by which we usually identify plagiarism: submitting work produced by someone (or something) other than the student as if it were their own. And yet, there are many ways that AI writing tools might be implemented in learning environments. (This is not necessarily to say that they should be implemented in these ways.) How are we to distinguish between full-scale text generation (e.g., using ChatGPT to produce a draft), smaller-scale text generation (like the autocomplete suggestions already built into many word processors and email programs), and automated, AI-enhanced grammar/usage checkers (such as Grammarly or Microsoft Word)? When it comes to these software offerings—many of which we have adapted to with little objection—where on that spectrum does academic integrity become violated for you, and why there?

You should expect that some professors will allow students to use these tools in their classes. Some will even encourage students to use them. This means that students may encounter wildly different expectations and postures from class to class and from professor to professor.

If you consider using these tools to constitute cheating or a violation of academic integrity, it is worth thinking carefully about why and in what ways—and then communicating those expectations to students as clearly as possible. What are the learning goals for your class, as laid out on your syllabus? At what points will turning to AI writing tools undermine students’ ability to achieve those goals and/or your ability to assess their understanding of key elements or performance of important skills?

Offer Clear Guidelines for Appropriate and Inappropriate Usage

Do you plan to prohibit AI writing tools entirely in your class (and commit to practicing that prohibition yourself)? Or are there certain instances where students may use these tools without violating academic integrity?

For example, can a student turn to ChatGPT for an explanation of a difficult course concept or a summary of a challenging text? How would this differ from looking up difficult concepts on Wikipedia, through YouTube “explainer” videos, or on SparkNotes or other popular study guides? These are resources that students are already familiar with.

While many faculty do not necessarily encourage the use of these resources, neither do we tend to consider them a violation of academic integrity. Rather, we teach students about their shortcomings. We encourage students to approach them with a higher degree of skepticism and critical thinking. We also articulate the value of students puzzling through challenging problems for themselves and with the support of others (including classmates, study tutors, writing center coaches, and professors).

Consider offering guidelines such as these from Temple University, which describe acceptable and unacceptable ways that students might use generative AI tools. If you decide to permit students to use these tools in some situations and not in others, make sure to let students know the reasons behind those distinctions.

Example guidelines:

  • “We’ve discussed how ChatGPT can usually offer general overviews of certain concepts but also fails to consider the more complex and nuanced understanding that we have as scholars in this discipline. For this assignment, ask ChatGPT to explain [x-topic] and then write 150 words of your own about what ChatGPT seems to miss in its explanation. Be prepared to discuss the explanations generated by ChatGPT and your response in our next class.”
  • “Write 250 words summarizing and responding to [x-scholar]’s argument. This is a dense article geared towards specialists in [x-scholar]’s field, so expect it to feel challenging. While it might be tempting to turn to outside resources like Google, Wikipedia, or ChatGPT for help with this, those resources are off-limits for this assignment. I don’t expect you to understand everything [x-scholar] is talking about, but I do want to see you puzzling through the implications of their claims. We’ll use your responses to fill in some of the gaps together in our next class.”

Teach Openness and Transparency

Regardless of what policies you adopt for AI writing tools, the Writing Program encourages openness and transparency in the way we discuss the tools with students—and, if you use them, about the way you do so. When we assume students are looking for shortcuts or for ways to cheat, we create a culture of suspicion and surveillance. (In writing classrooms—where the model of successful performance is often some variation of standard edited academic English—a professor’s unconscious biases may also lead them to direct that suspicion disproportionately toward multilingual writers, first-generation students, neurodivergent students, and underprepared students.)

Rather than suspecting that students are looking for ways to cheat and overemphasizing the dire consequences of getting caught, consider implementing strategies for creating a learning environment that promotes integrity and discourages dishonesty, as outlined by James Lang in Cheating Lessons. These strategies (as summarized on the “AI Literacy for Faculty” Subject Guide from the SPU Library) include:

  • fostering intrinsic motivation by connecting the questions of the course to questions that students already have;
  • focusing on learning for mastery (instead of grades) by offering several lower-stakes practices before high-stakes assessment;
  • instilling a sense of self-efficacy by offering direct and formative feedback on low-stakes activities;
  • asking students to reflect on their learning progress at several points in the term.

No matter what feelings you have about AI writing tools or how you plan to permit or restrict them, it is essential to give students clear guidelines for how to acknowledge their use of these tools when they do use them. If students expect punishment for using these tools, many will feel inclined to hide their usage. On the other hand, providing a framework for legitimate acknowledgment can help foster the environment of transparency and attribution that we hope students will apply to any outside source they depend on.

Example policies/assignment guidelines:

  • “I prefer that you do not use any outside sources (including sources found through web search or material generated by ChatGPT) for this assignment. However, if you do end up using outside sources, it is your responsibility to cite them following MLA/APA/Chicago Style guidelines.”
  • “At the end of your paper, please include an Acknowledgment paragraph that mentions two contributors and what they added to your paper. Although ChatGPT is not a person, please also use this paragraph to describe any role ChatGPT played in producing this draft.”
    • Sample Acknowledgments paragraph: “An early draft of this paper depended on material generated through ChatGPT. At the beginning of the drafting process, I had a hard time knowing how to get started, and I asked ChatGPT for a couple of suggestions that helped me get the brainstorming started. My topic ended up taking a different direction (from x to y) thanks to a discussion with my in-class small group, especially as Jane Doe helped me realize that I had overlooked something crucial about y. Finally, I’d like to thank John Doe in the Writing Studio, who helped me restructure the middle of my paper, where my argument got lost in the details of some of my examples.”


Emphasize Process over Product

One of the great “promises” of AI writing tools is that they seem to offer writers a way to get to the end result of writing (a product in the form of a finished piece of writing) while skipping over some of the more difficult, frustrating, anxiety-inducing, and time-consuming work that many writers would just as soon avoid. This is especially true for many of our students, who often enter academic writing contexts with their own fears about their capability as writers, their performance (and the grades that come with it), and their place within the specialized discourse of academic writing.

In writing classes, we are not focused on simply assigning writing but are invested in helping students develop skills and strategies for becoming effective writers. There are several steps that we can take to reduce the need for students to turn to AI writing tools or other shortcuts that may undermine our learning goals for the course. Many of the suggestions below speak to elements faculty already incorporate into their course design and to their strengths as educators. This is a reminder that one of the best ways to respond to AI writing tools is by leaning into those strengths.

  • Scaffold Assignments Through Multiple Steps/Drafts: Rather than asking students to turn in a major paper (with a heavy grade weight) at the end of a unit or end of the term, have them work through the assignment in several smaller steps. Some faculty follow a sequence of prospectus/proposal, annotated bibliography/review of literature, draft, and final paper. Each of these assignments can be broken up into even smaller steps with lower stakes (see below). They can include revision components that allow students to develop flexible thinking around a topic, encounter more complexity and a wider array of perspectives, and try out different arguments along the way to produce a stronger final paper. While this may sound like an invitation to increase faculty workload by creating more assignments to grade, not every assignment needs to receive the same detailed feedback that we give to more formal writing.
  • Offer Low-Stakes Assignments—and Keep the Stakes Low: As students are given the opportunity to work through small-scale assignments without worrying about a grade that will jeopardize their overall performance in the class, the risk of producing messy, incomplete writing is diminished. Of course, if low-stakes writing is met with critical feedback that points out a writer’s inadequacies rather than pointing to next steps in future drafts and assignments, we might increase the risk that students will turn to outside tools to avoid receiving similar criticism in the future.
  • Highlight the Social Aspect of Writing: We can help students avoid turning to AI writing tools when the process starts to feel overwhelming by reminding them of the human resources that are available to them. More than that, we can ask students to use those resources at several steps in the writing process through in-class group work, peer review workshops (when managed well), one-on-one conferences with the professor, and trips to the Writing Studio. When tied into a scaffolded sequence and a variety of low-stakes assignments, this social element can create opportunities for students to share work-in-progress with others and help them feel supported as part of a writing community that is also in the messy stages of the writing process.
  • Name the Steps of the Process for Students and Show Them How Those Steps Work Together: As helpful as it is to provide scaffolded assignments for students, we run the risk of making several smaller assignments seem like busywork when students can’t see how everything is meant to fit together. Students may also believe that low-stakes assignments are “optional” or unimportant when they do not make a significant impact on a final grade. Both attitudes may lead some students to opt out of crucial steps in the writing process, only to run up against a final assignment that they feel unprepared for and then turn to other resources, including ChatGPT. The more that we, as faculty, can articulate for students how the different steps of the writing process fit together, how they contribute to the learning goals of the class, and how they fit in with students’ educational and career goals, the better. As experienced educators know, it is helpful to remind our students of these connections several times over the course of a term.
  • Ask Students to Reflect on Their Own Writing Processes: As we seek to foster students’ self-awareness of their own strengths as writers, we can create multiple opportunities for them to take stock of their learning through formal and informal reflection activities. Examples of reflection assignments include:
    1. asking students, at the beginning of the term, to talk about their past experiences with writing or what they feel their strengths and weaknesses are as writers;
    2. asking students, after submitting a draft, to include a Writer’s Memo that explains a challenge they ran into while writing, what they think works best, and what they anticipate they might need to focus on in the next draft;
    3. asking students, after revision, to describe the most important changes they made to a draft, why those changes matter, and what they still feel uncertain about;
    4. asking students, near the end of the term, to collect their work and reflect on its strengths and weaknesses, using the language of the course learning outcomes. (This final example is what students produce for the Final ePortfolio in all sections of WRI 1000 at SPU.)
  • When it comes to AI writing tools like ChatGPT, these opportunities allow students to take stock of their own development as writers, including (when appropriate) ways that AI writing tools have contributed to their writing process and ways that those tools ended up being less useful than they expected.

Conclusion

Once again, these suggestions are not meant to be comprehensive. As AI writing tools continue to develop in unexpected ways, our responses will have to develop as well. Nevertheless, I hope this has provided some useful ideas for faculty who are trying to discern how best to approach AI writing tools and the impact they may have on what we teach, how we teach it, and how we assess student work.

Sample Syllabus Statements

General Plagiarism and Academic Integrity Statement

“The use of another’s work in writing—or of work you have completed for another class—without citing their contribution, whether intentional or accidental, is a serious offense. This applies equally to work produced by other writers or by AI writing software (such as ChatGPT). If you are ever in a situation in which you are concerned about whether your presentation of information is plagiarism or not, it is your responsibility to vet your writing with an instructor or writing tutor before you present it as your own work. Plagiarism or academic dishonesty of any sort is not tolerated.”

Plagiarism and Academic Integrity Statement for Writing Process-Focused Classes (e.g., WRI 1000 and WRI 1100)

“The Writing Program distinguishes between unintentional and intentional plagiarism. Unintentional plagiarism is a conventional issue, one that can be addressed through instruction on citation. Remember, you must cite your sources, even when paraphrasing. This applies equally to work produced by other writers or by AI writing software (such as ChatGPT). We will address citation practices in class, and if you need assistance beyond classroom instruction, please consult a handbook, set up an appointment with the Writing Studio, and/or speak with me. Intentional plagiarism, however, is a breach of trust and integrity, a violation of the atmosphere of scholarship we work hard to establish and maintain at the University. If the instructor verifies an act of academic dishonesty has occurred, the Director of Campus Writing, Dean, Provost, and Dean of Students will each be notified. Depending on the severity, plagiarism can result in failing an assignment or failing the course.”

Other Resources

“AI Literacy for Faculty” Subject Guide from SPU Library

MLA-CCCC Joint Task Force on Writing and AI

AI, Faith, and the Future: An Interdisciplinary Approach, co-edited by Michael J. Paulus Jr. and Michael D. Langford. (eBook; requires SPU log-in)

Anna Mills and Lauren M. E. Goodlad, “Adapting College Writing for the Age of Large Language Models Such As ChatGPT: Some Next Steps for Educators”

Emily Bender, “ChatGP-why: When, if ever, is synthetic text safe, appropriate, and desirable?”