Faculty 'cautiously optimistic' about the potential of generative AI – Virginia Tech

19 Sep 2023
Editor’s note: The image accompanying this story was generated with artificial intelligence to illustrate how the technology can be used.

Before they change the world, revolutionary new technologies often cause a bit of panic.
The telegraph. The calculator. The internet. All engendered some societal upheaval before they were fully embraced.
Now generative artificial intelligence (AI) has joined that list, sending shockwaves through higher education since the November 2022 release of ChatGPT. 
If anything drove initial concern about generative AI chatbots among faculty, it was the idea that they were nearly perfect cheating machines, able to write an original essay, ace a quiz, even pass a Ph.D. qualifying exam without leaving any clear signs that “AI was here.” Faculty fretted that AI would turn traditional learning at Virginia Tech upside down.
As the university enters its first full post-AI school year, however, alarm may be softening to cautious optimism about the technology’s promise as a tool for teaching and research.
“The national conversation has been, ‘What is this? Is this going to replace my job?’” said Cayce Myers, professor of public relations in the School of Communication. “If we can get past that, then we can get into some more nuanced conversations about, ‘How can this enhance my work and my students’ educational experience?’”
In his spring advertising ethics course, Myers asked ChatGPT to create 10 taglines for a made-up potato chip company. “It did a really good job,” he said.

That prompted an important discussion among his students: Where do these tools fit into their discipline? What use of AI tools is OK? Students agreed that disclosure was necessary, but would it always be? “That was kind of the point of the exercise,” said Myers. “In early adoption of a technology, how do you navigate those contours?”
Initially, navigating the contours of AI has meant figuring out how to ward off its misuse. Faculty have rallied around a few practical ideas, such as developing a syllabus statement with clear AI guidelines, teaching students to document when and how they use AI, adding extra security to Canvas quizzes, or pointing to the Undergraduate Honor Code’s statements on plagiarism.

Yet a laser focus on catching cheaters likely misses the point. Dale Pike, associate vice provost of technology-enhanced learning, acknowledged that AI detection software has proved unreliable so far, but added that universally barring use of AI will leave students unprepared for the workforce. “Ultimately, I think what has to happen is we have to rethink student learning assessment, wholesale,” he said.
That rethinking, for some faculty, involves replacing gameable assignments based on memorizing and summarizing with assignments involving problem-solving, in-class creation, critical thinking, and collaboration.

Beyond that, faculty are considering how AI models such as ChatGPT can customize learning by producing dynamic case studies or offering instant feedback or follow-up questions. “It could be emergent and responsive in a way that one human never could,” said Jacob Grohs, associate professor of engineering education in the College of Engineering. “It really ups the ante in terms of what we need to be doing as teachers.” 

In a first-year engineering course Andrew Katz taught last semester, the assistant professor of engineering education had ChatGPT explain foundational engineering concepts with different audiences in mind — a first-grader, a high schooler, an undergraduate. Then, his students identified baseline pieces of information amid the varying layers of complexity. “I’ll continue to encourage students to use these tools this fall,” he said. “So then the biggest question is, How do you help students use them thoughtfully?”
One use he’s particularly hopeful about is AI’s potential as an intelligent tutoring system that can individualize education by drawing on students’ interests to teach new information — for instance, offering soccer metaphors to teach a new concept to a soccer-playing student. “If you can take even a step in that direction, that’s a big improvement,” said Katz.
For now, many faculty are making AI the subject of assignments. They’re asking students to analyze and identify weaknesses in arguments produced by ChatGPT, for instance, or to edit an AI-produced essay with “track changes” on.
That kind of critical thinking about generative AI is vital, said Ismini Lourentzou, assistant professor of computer science in the College of Engineering. “It’s our responsibility as educators to teach students how to use these tools responsibly, and then understand the limitations of these tools.”
AI’s limitations are, admittedly, worrisome.
Lourentzou, who has long worked at the intersection of machine learning, artificial intelligence, and data science, recently collaborated on a commentary published in the biomedical journal eBioMedicine pointing out how AI models amplify pre-existing health care inequities for the already marginalized. 
Junghwan Kim, assistant professor of geography in the College of Natural Resources and Environment, published a research paper in the journal Findings about potential geographic biases in a generative AI chatbot’s presentation of problems and solutions related to transportation in the United States and Canada.
For students to develop digital literacy around AI, they must understand its flaws, including bias, hallucinations, privacy concerns, and issues of intellectual property. Such problems aren’t necessarily a dealbreaker, as long as students learn about them. “I’m a little concerned,” Kim said. “But my argument is, let’s be aware of the capabilities and limitations and then use it wisely.” 
As faculty navigate the challenges of generative AI, working groups have cropped up around the university, including those sponsored by Technology-enhanced Learning and Online Strategies (TLOS) and the College of Liberal Arts and Human Sciences, to discuss challenges, opportunities, and discipline-specific norms, which may vary widely for engineers and artists, writers and scientists. 

Other university resources and responses are emerging as well.
The conversation, nationally and at Virginia Tech, will continue for the foreseeable future. In the meantime, Pike urges faculty members to spend time experimenting with the technology. “I think the most important thing right now is that everyone — faculty, students, staff, administration — should be dabbling in this and getting their head wrapped around what’s different about this and how it might be helpful.”

At the very least, everyone needs to be paying attention, said Pike. “I don’t think this is hype. There are a lot of very smart people who are saying, ‘This is revolutionary.’”

