You’re a student who spent an entire semester researching and writing a 20-page paper. You’ve poured time and effort into the assignment, and you’re looking forward to hearing your professor’s feedback.

Instead, you get a mediocre grade and three short paragraphs of vague comments, and you wonder: Did ChatGPT grade my essay?

Turns out, it did.

That’s a scenario a student recounted to Alex Green, an author and professor at Harvard’s Kennedy School. Green told Business Insider that the “AI evangelism” push — efforts to use AI across classrooms to make both teaching and learning easier — is doing more harm than good, undermining critical relationships between teachers and students.

Whether it's teachers or students using AI, Green said the result is a loss of "so many fundamental communication skills," along with knowledge and reasoning.

Green, who teaches policy communications and op-ed writing, said AI could also harm his students’ career prospects if they’re pursuing fields like communications and rely on AI to build those skills.

“My job, in part, is to help prepare them to go get jobs,” Green said. He added that he heard from some of his students that their prospective employers required them to share their screens while they take writing tests to ensure they’re not using AI.

“And so what would I be doing for them if I said to them, ‘No, no, you can just use these indiscriminately, and how you write and how you think and how you synthesize ideas doesn’t really matter?'” Green said.

Over the past decade, tech leaders and educators have been pushing initiatives to incorporate AI in classrooms. While some surveys have shown that AI use has helped teachers save time and provide higher-quality lessons, there’s minimal evidence that using AI to learn is effective. Additionally, AI is already starting to impact young people’s job prospects, with some tech leaders saying that it will decrease white-collar job openings.

Green said he’s not against AI — he has used it himself for his work, and he allows it in his classroom, to an extent. But it’s not a replacement for teachers, and heavily relying on it is a waste of a school’s resources, he said.

“You’re here now, and you’re in a class, and you have someone who is a total nerd for this and has devoted their life to every aspect of this. And you have me fully at every moment of every hour for the next eight weeks and beyond,” Green said. “Why on earth would you take all of that sacrifice and all of that dedication and give that over to a machine?”

‘The bible salesman version of AI’

There’s no shortage of efforts to incorporate AI in education. Take Khan Academy — the online tutoring organization established in 2008, which gradually started using AI to create lessons that personalized students’ experiences.

Khan Academy continues to enroll students, but other efforts have failed. AltSchool, which was backed by tech billionaires including Mark Zuckerberg, opened in 2013 and began to shutter four years later, as parents saw that their kids weren't excelling with technology-based education.

Green said the problem is that many of these initiatives are focused on making learning as easy as possible, and that shouldn’t be the goal.

“These people have reframed the idea of learning as something where any struggle to wrestle with a concept or think really hard about something is a sign that the education is bad, and that what we need is for things to be as seamless and easy as possible,” Green said.

That's not to say there isn't a place for AI. Green said that he used a large language model, or LLM, to comb through materials for his research and found it helpful. In his classroom, he said that after five weeks of "intensive non-technological use," he starts incorporating AI to help his students prepare for the political communications landscape, which includes dealing with chatbots and identifying AI-generated fake images.

Some colleges are putting AI at the forefront. In February, California State University announced its initiative to become the “nation’s first and largest AI-empowered university system” through public-private partnerships to train students and teachers on AI technology, including offering all students and faculty access to a version of ChatGPT.

On the federal level, the Trump administration is establishing a task force to promote AI in K-12 classrooms and look into redirecting funds toward AI efforts.

Some critics have warned that the US should tread carefully. South Korea recently rolled back its initiative to place AI textbooks in classrooms due to backlash from parents and teachers over a lack of preparation on how to best use the tech.

Green said that if colleges want to adopt AI in classrooms, they should mandate intensive training for faculty to understand "the very serious risks" the technology poses to learning. He also suggested restricting how teachers can use the technology, including for grading and communicating with students.

“We need actual committed educators who are not the bible salesman version of AI at the front of the room, opening up space for ideas about its judicious use in the classroom,” Green said. “We could really end up with some incredible amounts of junk here, and at the expense of our young people actually learning skills that you do need in the real world.”


