The Middlebury Campus
Thursday, Nov 21, 2024

AI chatbots are changing Middlebury’s academic landscape. How should we respond?

“I have neither given nor received unauthorized aid on this assignment.” 

Every Middlebury student knows this phrase by heart, as our honor code is a staple of the Middlebury community. When students take accountability for the integrity of their work, professors are able to trust their students. When professors trust their students, students get to learn without feeling policed. This sense of trust is a core tenet of Middlebury’s academic culture, and one that Middlebury loves to tout on its website and tours. 

However, the landscape of information and tools that are available to students is rapidly changing. This has led many to ask the question: how will ChatGPT impact the honor code? 

If you don’t know, ChatGPT is a free AI chatbot, and it has quickly become the fastest-growing internet app. It can respond to complex prompts in a matter of seconds, mimicking a variety of stylistic tones. It can quickly write copy, edit text or summarize dense information. While the technology has its limits and drawbacks, it’s also incredibly powerful, as is evident from its ability to generate text that sounds remarkably human. 

For the pessimistic, AI chatbots like ChatGPT may seem like the end of academic honesty. Why would anyone spend weeks writing a research paper when they could just plug the prompt into ChatGPT and have it ready to go in under a minute?

Especially for those in academia, it’s easy to view ChatGPT as a threat, but the reality is a little more complicated. In fact, we can’t quite be sure what role AI chatbots are going to play in the future, or what exactly they’re capable of. As such, it’s important for everyone to exercise caution in how we engage with them.

What’s going on?

As ChatGPT has begun to permeate Middlebury’s academic world, professors have had varying reactions. Some have banned all use of ChatGPT, and some have even gone as far as to employ software to detect text that may have been generated by AI. Others have actively encouraged the use of AI chatbots in research, as long as that use is fully disclosed. Still others have not acknowledged the technology at all. Policies tend to vary widely between departments as well. 

While everyone seems to be in agreement that having a chatbot write your entire paper for you violates the spirit of the honor code, there's no unified stance beyond that.

That inconsistency leaves students unsure of how to use the chatbot, if at all. While the honor code has always, by design, been somewhat open to interpretation, this inconsistency seems particularly difficult to navigate. What if a professor hasn’t mentioned ChatGPT at all? Should students assume that its use is prohibited? Should they be punished if they guess incorrectly? 

In some ways, it’s a question of equity. Not all students come to Middlebury with the same level of experience in writing or research. An AI chatbot could be an equalizer in that sense: students could use it to help them home in on the perfect “academic tone” or to make research less intimidating. 

However, AI can also be quite inequitable. One key concern is that the bots themselves may be biased. Because they’re trained on existing human information, they can inadvertently adopt biases from their source material. There’s also a question of access. While ChatGPT is free right now, later generations of the tool might not be. It already has a paid subscription tier.

So here’s the situation in a nutshell: nobody is sure what to do with this new technology, and students and professors alike are floundering as they try to feel their way through uncharted territory.

What should we do?

First and foremost, we should preserve Middlebury’s culture of academic trust. That goes for professors and students.

Students have a responsibility to maintain the trust they’ve built with their professors. Just because ChatGPT is easy to use doesn’t mean we should use it at every opportunity. We’re all at Middlebury to learn, and we all know how to recognize when we’re meeting that goal.

ChatGPT can easily be a tool to enhance learning. It can just as easily be a tool to avoid it. Students are going to need to recognize when they’re skimping on their education and when they’re not. It’s that simple. There are many ways to use ChatGPT, and it will often be up to us to judge which of those uses are academically honest.

Professors, for their part, have a responsibility to work with students and maintain the spirit of the honor code. Suddenly ramping up academic regulations or greeting papers with suspicion shows a lack of trust. If we believe in the honor code, then professors should assume that students will use ChatGPT, or not, in accordance with the guidelines they’re given and their own best judgment.


It’s also important to remember that AI chatbots are more than a big bad wolf. They are exciting new tools, and they will almost certainly affect the professional world that Middlebury students find upon graduating. While prioritizing academic integrity, the Middlebury community should also make sure that students graduate knowing how to use AI effectively. We’re going to need to.

How should we regulate?

Currently, we do not believe that a sweeping policy against AI chatbots at Middlebury would be effective or useful. We don’t know enough to make absolute statements, so for now, professors should have space to make their own policies about AI. On the flip side, students should be given the benefit of the doubt when the rules get murky. 

As we’re all learning the academic realities of life with ChatGPT, it’s important that we meet each other with grace and patience. Professors are going to need to troubleshoot AI policies, and students are going to need to figure out appropriate boundaries for using the technology. 

Everyone at Middlebury values education. That’s why we’re here. We just need to keep trusting each other.
