A Career Tool? A Dishonor? College Students And Educators … – LAist

LAist is part of Southern California Public Radio, a member-supported public media network. For the latest national news from NPR and our live radio broadcast, visit LAist.com/radio
Nearly one year after ChatGPT’s arrival and with the availability of other generative artificial intelligence, faculty and students are adjusting to a new reality.
LAist asked community college students, faculty, and staff about their experiences with generative AI. Several stopped to share their opinions on the technology.
Here is what five of them had to say.

Tony is starting his second year at East Los Angeles College. (LAist agreed not to use his last name.)
Faculty often speculate about why students use AI for their assignments. Tony said he has not used ChatGPT much, but he did for an English class.
“I actually had to write (about) an old book that I, one, didn’t really care about, and two, it was kind of boring. So I used ChatGPT for that mostly as like, I don’t get it, I don’t understand the meaning,” Tony said.
ChatGPT not only helped him understand the content of the book, it also showed Tony how to structure an essay.
“I use ChatGPT for sort of like an idea of how the essay is supposed to look like ’cause I struggle writing my own. I can write my own words and whatnot (but) it’s like, how it’s supposed to look?” Tony said.
ChatGPT’s output wasn’t perfect, though. It didn’t explain the characters’ emotions well, so Tony came up with that part on his own.
ChatGPT would not have worked for another class he was taking on music. At a minimum, it would not have used the technical terms his professor covered in class, Tony said. But also: Tony wanted to write the music paper.
“I was able to excel at it because I was able to write whatever I wanted to. I could write (about) like a concert, music, whatever,” Tony said. While he describes his writing as average or even below average, he said that for the music paper, “you will see that I did put effort into it because I finally, you know, actually enjoyed the topic that I’m writing (about).”
LAist asked Tony if he was concerned about relying too heavily on AI for his education. (This question actually came from… ChatGPT).
“Oh yeah,” Tony said. “If a person used too much ChatGPT or AI as a whole, they wouldn’t really learn the skills for what they’re doing.”
Tony also used ChatGPT for some math, which he said isn’t his strong suit. He said he felt OK about using it since he didn’t consider it necessary for the profession he is interested in, which is becoming a therapist.
Tony said that generative AI could be helpful for community colleges and that it’s not as harmful as some might think.
“I’m more like thinking it would be a better idea to like add [artificial intelligence] into community college as like a little learning tool,” Tony said. “So instead of like, you know, community colleges or [a] university, trying to be like, ‘No ChatGPT is bad,’ fully, which is kind of a lie — it’s kind of helpful, like there’s some good. Yes, there’s some negatives and positives. That’s like almost anything.”

East Los Angeles College student Eider Martinez is also beginning his second year. He has completely avoided using ChatGPT. For him, it’s a danger to independence, creativity, and the gains earned through personal struggle.
“We’re entering this age where people are like relying on other things to like create work and art. And I just don’t like that,” Martinez said.
Martinez was upset that the Marvel television series Secret Invasion used AI for its opening credits. He thought it took away an opportunity from an aspiring artist who could have earned income and done an even better job.
“What’s the point of writing? What’s the point of trying? What’s the point of doing rough drafts or doing anything when you could just throw in a couple words and some program does it for you? There’ll be no more creativity. It’s gonna be a very bleak world.”
For Martinez, avoiding AI means he can be assured that his work is his own and that he learns from his own mistakes.
“I’m so glad I’m outta high school when the ChatGPT barely started ’cause now I know that all my work is at least me. This grade, I deserve it. I did it. It’s all on me,” Martinez said. “And that allows me to get better. And I like that. I like self-improvement.”
“But when you have a generation of people who are relying on some program, then they’re never gonna self-improve,” he added. “They’re gonna stay the same way.”

Rohan Desai is the department chair for counseling at Pasadena City College, a counselor himself, and a coordinator for the Men of Color program on campus. He sees that AI has the potential to help students navigate the process of applying to college or selecting a career.
“A lot of our students are the first in their family to attend college. Some of them are first generation professionals,” Desai said, noting that professional work often requires applying through a cover letter, for instance.
He said students can use AI to list the steps of the career decision-making process or to show them how to write a personal statement by submitting their résumé, classes, work experience, and GPA.
“I mean obviously we tell them don’t submit, like write it in your own words,” Desai said. “But this can at least help you identify what it should or could look like because oftentimes students have never written a paper like this. They have no idea what to expect.”
Desai also considers AI tools as a resource for general information that students can access off hours. For example, a student may want to ask AI about information on careers in hospitality. He said that while the counseling department offers workshops and many online resources to its students, “you know, we’re not here 24/7.”
“Especially our historically marginalized lower income students, they have a job, two jobs, outside of campus, they’ve got families to attend to, so it just kind of helps as maybe outside of our normal business hours resource, but it’s not an end all be all,” Desai said, noting that a lot of students also need to be met with human emotion and empathy, which ChatGPT can’t replicate. “I think it’s just an additional support.”

At East Los Angeles College, Michelle Priest teaches anatomy, physiology, and microbiology, and is wrestling with the implications of generative AI.
Priest found that students were generating AI responses for their lab assignments because these assignments referenced lab tests that were not covered in class. Over the phone, Priest walked LAist through how Google's generative AI, Bard, would write a detailed report on the types of microbiology tests used to detect E. coli. Bard listed certain tests, Priest said, including one that would have been too expensive to cover in class.
Whereas other disciplines, such as English or social science, might value students' subjective opinions on a topic, Priest says that because answers in her classes are so "absolute," detecting AI use is an even greater challenge.

To prevent copying from generative AI, Priest came up with an idea: requiring students to submit their lab assignments in their own handwriting. Even digital handwriting was acceptable. For Priest, this technique gets students to do their own work.
Priest is also relieved that labs are in person again, where she's seeing less reliance on generative AI. Since students returned from pandemic-era remote learning to in-person classes, Priest has seen exam scores decline. In an anonymous survey, she asked students how often they cheated; they reported completing about 70% of their exams on their own, with the rest based on searching the internet or using AI.
The classes Priest teaches are challenging prerequisites for students entering the healthcare industry to become radiologists, nurses, and dental hygienists. She's hearing from friends who teach in those programs that students lack basic skills they should already have.
“The consequence is they’re getting into the nursing programs and they’re failing out of the nursing programs, the dental hygienist programs… My friends who are in those programs literally are just lamenting about how these students don’t know the basics and they’re having to remediate all this information that they should have known,” Priest said.

Elizabeth Ortega, a professor of sociology at East Los Angeles College, noticed students putting ChatGPT output in their responses to certain assignments. Her syllabus already had a policy on plagiarism, but Ortega considered ChatGPT output technically different, since it is not necessarily copyrighted. It was an issue she had never encountered before and needed to address.
Before the start of the fall semester, Ortega led a staff development training for other faculty, attended by several dozen colleagues. She was motivated to discuss strategies for reducing ChatGPT use and holding students accountable for their learning.
“I want to make sure that no one slips through the cracks because, yes, it is easier to just have an AI technology do the work for you if you’re a student that doesn’t have time, and has the job and has family responsibilities,” she said. “But if we allow that, if we don’t try our best to create assignments that allow them to practice these skills or hone these skills that they don’t have yet, or they need to develop more, we’re not serving anybody. We’re not serving the community.”
Students need to learn the skills for critical thinking, writing, and analysis, Ortega says.
She is concerned about how harmful it can be for students at East Los Angeles College who are underserved and under-resourced. Ortega recommends various strategies for faculty to encourage students to go beyond using AI for their assignments, such as asking them to draw from their personal experiences.
“Like how do you think the role of institutions has influenced your perspective on life? And how do you think that influence like that has brought you to where you are right now and influenced your decisions in some way?” Ortega said. “And so that cannot be answered by the AI model because it doesn’t have feelings and experiences, so it can’t, you know, make anything personal.”
Although reluctant to draw further attention to ChatGPT, Ortega cited the saying "if you can't beat them, join them." One strategy she offered East Los Angeles College faculty in her training is to generate a ChatGPT response to a prompt and have students identify where it falls short.
“How would you improve it? What is it missing? You know, what specific information can you add to it to make sure that this topic is being talked about in a well-rounded way?” Ortega said. “And so, you know, using it and saying, ‘hey, we all know it’s here, but you’re smarter than that and you guys can actually critique certain aspects of this AI model.’”

