Virginia Tech faculty weigh in on ChatGPT and other generative AI
ChatGPT created an article on the ethics of AI usage in college classrooms, Oct. 1, 2023.
Generative artificial intelligence has gained prominence over the past year, particularly ChatGPT, which Reuters reported this past February had broken the record for the fastest-growing consumer application in history. ChatGPT can perform a range of tasks, such as generating writing and answering questions.
This fall, for the first time, professors across the university have outlined policies on their syllabi regarding the use of generative AI in their classes. Among them is David McPherson, a professor of computer science, who said the university asked faculty to include information about generative artificial intelligence on their syllabi.
“Yeah, they’re on the syllabus,” McPherson said. “I think we use more or less the boilerplate from Tech. They’ve got some language for generative AI, either allowing (or) disallowing, completely banning, so we use sort of the generic ‘Don’t use it unless we say it’s okay to use it.’”
Technology-enhanced Learning and Online Strategies (TLOS) specifically recommends “a measured approach” and lays out suggestions for faculty to follow, including becoming familiar with AI to be better informed, establishing expectations with students about AI use and exploring changes to course design and assessments.
TLOS also recommends that professors consider the Honor Code and how it applies to AI.
“While most students largely engage in honest behavior in the classroom, some may choose to use tools such as ChatGPT to engage in academic dishonesty,” The Office of Undergraduate Academic Integrity said, according to TLOS. “Please continue to be clear in your expectations with your student related to the Undergraduate Honor Code and the use of AI software just as you would other websites that may provide students with means to engage in academic dishonesty. The unauthorized use of ChatGPT and other AI software may fall under several definitions of academic dishonesty in the Undergraduate Honor Code.”
McPherson said he could use his judgment to flag code that a student may have generated with AI, since it would look odd, but that suspicion alone would not be grounds for an Honor Code violation.
“I personally wouldn’t be able to say with 100% accuracy like this was written by ChatGPT or something else,” McPherson said. “I might just think this is really weird looking code, and then maybe just keep an eye on that person and see what they do or talk to them. We certainly have done that where if a student maybe is appearing to struggle with understanding concepts, and then all of a sudden he comes to us with this really, really cool code. And we’re like, ‘How did you go from not knowing the basics to writing this really cool stuff? Like, where did it come from and how? Explain the code.’ And sometimes they can. Maybe they really buckled down and figured out how to do this. Or maybe they hired someone, (but) we try to talk to them in that case before we do anything.”
Alice Jang, an assistant professor of business information technology, uses AI in her course. Jang uses Packback, a platform that grades discussion board posts with an AI algorithm. She also encourages her students to use ChatGPT, especially when they need help understanding difficult concepts; however, they are not allowed to use it when completing assignments in class.
“If you were to write up front in ChatGPT, ‘What is linear programming?’ And in addition to that like, ‘Okay, explain this using (an) example that a college student without stats knowledge or who hasn’t taken any optimization class (would) understand this,’ then it does a really good job trying to explain at their level. So I strongly encourage them to use it,” Jang said.
Jang said there is a low possibility of students misusing ChatGPT in her class because it is a language model; that is, it determines word probabilities by analyzing text data. Her class covers statistical and optimization models, which, she said, are not closely related to language models.
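To illustrate the distinction, here is a minimal sketch, not drawn from the article and using an invented corpus, objective and constraints, that contrasts a toy next-word probability model with the kind of small linear program taught in an optimization course (solved here with scipy.optimize.linprog):

    from collections import Counter, defaultdict
    from scipy.optimize import linprog

    # Toy "language model": estimate P(next word | previous word) by counting bigrams
    # in a tiny, made-up corpus (purely illustrative).
    corpus = "the model picks the next word by counting how often the word follows".split()
    bigrams = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        bigrams[prev][nxt] += 1

    def next_word_probs(prev):
        counts = bigrams[prev]
        total = sum(counts.values())
        return {word: n / total for word, n in counts.items()}

    print(next_word_probs("the"))  # {'model': 0.33..., 'next': 0.33..., 'word': 0.33...}

    # Toy linear program of the kind covered in an optimization course:
    # maximize 3x + 2y subject to x + y <= 4, x <= 3, x >= 0, y >= 0.
    # linprog minimizes, so the objective is negated.
    result = linprog(c=[-3, -2], A_ub=[[1, 1], [1, 0]], b_ub=[4, 3], bounds=[(0, None), (0, None)])
    print(result.x)  # optimal (x, y), approximately (3.0, 1.0)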
Faculty Senate President and chemistry professor Joseph Merola doesn’t have a policy on ChatGPT or other AI because he doesn’t see how students could use it to their advantage in his classes. However, students in one of his classes will be assigned an essay at the end of the semester.
“So even though I haven’t developed a policy right now, I will probably give them guidance at the end, as they’re writing it to tell them what they should and should not do with ChatGPT,” Merola said. “And to be honest with you, since it’s still something that I am developing in my mind, I’m not really sure what I’m going to say.”
Creative writing professor Matthew Vollmer’s syllabi tell students that AI generators are permitted for brainstorming, research and feedback.
“And if they use it, just tell me because I’m curious about how they might be using it. My sense is that they’re not using it at all, and it seems like a lot of them are afraid of using it,” Vollmer said.
Vollmer said that using AI to generate ideas doesn’t undermine a writer’s ability because writers have always used resources for inspiration.
“Let’s say you have a story, and there’s a setting in a jazz club in New York City,” Vollmer said. “So you look at Google Images, jazz rock, and then that’s the way it’s set up. Describe what I see in that picture and you use it. I mean, writers use search engines and the internet to look up all kinds of things, so I don’t see it as being much different than that.”
Vollmer also said that AI does not always generate the best ideas, which could lead a writer to use its output as inspiration for coming up with better ones.
Students in Vollmer’s classes are barred only from using AI to impersonate them in their writing. Similar to McPherson’s take on AI-generated code, Vollmer said it can be obvious when AI generates writing, but he wouldn’t know with full certainty.
Ultimately, Vollmer doesn’t think AI will replace writers.
“It doesn’t function on its own. It requires prompting,” Vollmer said. “So there’s always going to be a human element (to writers). Will it change the way we write? Absolutely. But in thinking about ways to prompt (it) to do things that require an output that is any way good, you’re gonna need imagination and intuition — things that are human in nature.”
Vollmer is also heading a task force within the English department. According to him, Kelly Pender, the department chair, decided to create the task force to assess the challenges and opportunities of working with AI as it develops faster than many people realize. Because the technology changes so quickly, faculty constantly receive updates about what AI is capable of and how students may use or misuse it. The task force is a way to discuss how to approach and think about these issues.
Aside from Vollmer, the task force includes Jennifer Lawrence, director of the Writing Center; Julie Mengert, director of the University Writing Program; Jimmy Ivory, English professor; and Avery Wiscomb, English professor. Ivory and Wiscomb’s research involves technology and computing, according to Vollmer.
Dan Dunlap, a computer science professor who focuses on ethics in computing and technology, compared the rise of AI to industrialization. Despite fears that machines would replace people, he said, the opposite has largely been true. However, there are still unknowns surrounding AI.
“It’s transformed different kinds of work, different kinds of labor, and certainly (AI) is going to be the case here,” Dunlap said. “We can’t anticipate what that’s going to be and there’s going to be a lot of unintended consequences, and how AI and large language models and all this stuff ends up replacing, it’s already doing. I guess one fear is that it’s doing a lot more than we think it is. We’re reading things that we think (are) purely human written and in many cases, there’s generative AI involved somewhere in much of what we consume, and not just reading, but recommendations and all those other things that these models are becoming involved in.”
Merola believes that for people to use AI effectively, the software itself needs to be attuned to humanity.
“Because in order to effectively use AI, we need to make sure that the programs are in tune with our humanity, and therefore the humanities,” Merola said. “And the arts have to be a big part of this. If we’re going to use it to create art, maybe that seems obvious that we need to have artists and humanists, but as well if something is going to try to understand a human reaction to something — well it better be programmed with understanding that human reaction. And so it’s not just for scientists, it’s for the whole gamut of the human experience.”