Longwood Faculty Discuss AI Policy

Photo: Grainger Hall

On August 10, 2023, Dr. David Magill, chair of the Longwood English department, and Dr. Sean Barry, an associate professor in the department, led a faculty development session on teaching with artificial intelligence. Magill said the purpose of the session was to go over what the technology is, how it functions, and how it affects the teaching of writing.

Magill and Barry attended a program sponsored by the MLA over the summer, which included dedicated sessions on AI led by experts in the field, including Annette Vee, associate professor of English at the University of Pittsburgh and author of Coding Literacy: How Computer Programming Is Changing Writing.

Magill said, “The conversations we were in this summer helped us understand that we needed to have a conversation with our colleagues about if you are teaching writing, these are the kind of strategies that will help and were being posed by the MLA.” 

One of the considerations about ChatGPT in classrooms is students using it as a study tool. Magill said, “What we know about ChatGPT and other generative AI is that they don’t always get it right, so you could be studying something that has many factual errors.” 

But the problems don’t stop with inaccurate information. ChatGPT responses can also be biased because of where the program takes its content. Barry said generative AI responses reflect the bias of the content ChatGPT steals, which could be racist, sexist, or classist, or carry a linguistic bias.

Barry said, “If you are a student or scholar relying on ChatGPT to answer questions, to summarize text, to explain historical events or concepts, is almost to court directly reproducing the bias that is already present in the unreviewed corpus of material that GPT is drawn from.”

The material ChatGPT presents can range from information on niche subjects to an entire essay, which makes plagiarism and the Longwood Honor Code a topic in the discussion about AI use.

Dr. R. Adam Franssen, associate professor of biology and associate director for curriculum design for the Center for Faculty Enrichment (CAFE), helped lead a Teaching with AI presentation at a Faculty Senate meeting on August 31. He said, “AI is already kind of covered in the Honor Code as a violation.”

Franssen said using AI on assignments and exams can be considered plagiarism since AI tools like ChatGPT take information from other sites. “If a student used something [ChatGPT or other AI apps] and submitted it as their own work, well it’s not their own work, which is considered a violation.” 

While using generative AI without permission from educators violates Longwood's Honor Code, there is some debate as to whether artificial intelligence is practical in specific fields, classes, and departments. 

Barry said, “It's a continuing conversation, and it's a conversation that involves faculty talking to other faculty to ensure that we understand what the technology does and does not do … to ensure that we're cultivating digital literacy as part of our mission to help students be informed readers, that includes readers of the outputs of generative AI.” 

Educating students to be informed readers of generative AI raises the question of whether Longwood’s Honor Code should be altered to accommodate AI use.

Magill said, “We have to have a policy that is clear but is also able to be changed, or able to adapt to the fact that this technology will continue to change over time and continue to show up.”

Barry added, “I regard the Honor Code first and foremost as an educational instrument, so if it is unclear to a student whether or not the Honor Code applies to the use of generative AI, then it seems to me we owe it to one another to revise the Honor Code, but that is my opinion as an individual instructor not as an authority over the Honor Code.”

Despite ongoing faculty discussion over whether the Longwood Honor Code should explicitly address artificial intelligence, Longwood faculty are beginning to prepare for AI to show up in their classrooms.

Ashley Leslie, Director and Instructional Designer for Longwood’s Digital Education Collaborative (DEC), said in the Teaching with AI presentation, “CAFE and DEC will be continuing to hold workshops focusing on AI.”

Leslie said these workshops will guide faculty in making assignments AI-proof and keep faculty updated on AI technologies. Workshops will be held in the fall, including one in November that will help professors write final exams that deter AI use. Leslie also informed Faculty Senate members of apps that detect the use of ChatGPT and other AI tools, including Turnitin and GPTZero.

Although Longwood is educating faculty about artificial intelligence, problems may arise if AI is implemented too heavily in courses and used as a teaching method.

Magill said, “In terms of writing, from the MLA, our recommendation is clear that using AI has detrimental effects to learning writing, so we’re advocating that it’s not a good practice to allow for writing specifically.” 

In the summary of its working paper, the MLA Joint Task Force on Writing and AI said that students who rely on AI may miss writing, reading, and thinking practice and may not view these skills as valuable.

But whether or not AI can teach students, these technologies are already being implemented in certain schools and areas across the U.S. 

Barry said, “I would ask what it tells you that these tools that are so ‘promising’ are being implemented in their usage for a population of students who are low income, students of color rather than affluent students who are the children of the people marketing these ‘wonder tools’.” 

These tools have already posed new questions for educators, students, and administration all over the U.S., but the recent boom in student use of generative AI is no shock. Magill said, “It's a new technology, but the things that are causing students to use it are not new.” 

Whether or not the education world finds generative AI useful and convenient, there will always be concern about AI assisting students with essential skills.

Barry said, “They [AI tools] don’t write, they generate bodies of text from preexisting corpuses, and the difference between those two things is something that requires immersing yourself in the frequently frustrating human experience of writing.”