
(The Hill) – Panic has turned to preparation for professors and teachers as they look ahead to a new school year dealing with artificial intelligence.

In the months since the initial shock of ChatGPT and other AI bots rapidly infiltrating schools, educators have had time to reimagine classrooms and homework to fit the new age of technology.

The preparation has taught educators two things: AI isn’t yet as smart as they thought, and they need to be more creative in their assignments. 

“There’s been a shift from being overwhelmed with what’s happening and trying to make sense of how to respond to all of the questions that they were getting, about ChatGPT, whether it’s cheating or jobs or what have you, and that has evolved into I would say an acceptance,” said Alex Kotran, co-founder and CEO of the AI Education Project.

The initial panic among educators largely stemmed from cheating concerns. Teachers did not know what to do with tools such as ChatGPT that could seemingly write entire essays for students or complete their math homework. 

The backlash led some districts to ban ChatGPT completely from their school networks. At one college, a professor falsely accused an entire class of using AI bots to cheat on papers.

Over the past seven months, educators have learned that these bots, which have spread beyond ChatGPT and been adopted by companies across a variety of industries, are not always right in their answers and leave some telltale signs that their output is autogenerated.

“I think you can often tell if there’s too many flowery adjectives or if it makes up historical facts, right? I think now that we’ve all been using a little bit more, we’ve collectively gotten better at predicting the mistakes and it makes it like easier to detect when someone’s using it,” said Zarek Drozda, director of Data Science 4 Everyone. 

While catching the signs of AI-generated work might have gotten easier, teachers are also having to get more creative with their assignments.

“I have seen more comfort with online tools generally and more comfort with designing assignments, so it’s not about right or wrong or it’s not about something that can be replicated by ChatGPT, but it’s more about skills that intersect with it, so analysis and interpretation,” Drozda said.

“There’s a difference between — you’re writing an essay generally about ‘The Scarlet Letter’ versus writing an essay about a particular prompt on a particular page with quotes that are real from the text, right? That can be pulled out by the teacher and, like, easily recognized, and so I think that there’s more comfort and also easier detection of scenarios in which you can spot cheating,” he added.

Educators have largely been on their own in this new realm of technology, and many have seized the opportunity.  

“There are very few things that will get most of academia to do some really focused thinking on one topic,” said Youngmoo Kim, director of the Expressive and Creative Interactive Technologies Center at Drexel University, but he added that AI has done it.

Kim, who also sits on Drexel’s AI task force, says the school is preparing guidance on ChatGPT and AI to send to professors in the new school year. The university will also offer workshops and individual sessions for faculty members who want more assistance on the topic.

The guidance is “not one extreme or the other,” Kim said.

“We’re not saying, ‘Oh, you can’t use it’ or ‘You must use it.’ That would be ultimately kind of silly. What we’re really strongly encouraging is for our faculty to become more knowledgeable about it, to do the specific things,” he added. “That is, if you have a standard set of homeworks that you always do or exam questions that you use often or recycle, put them into ChatGPT and see what you get. Do that legwork so that you as an instructor can have a much better familiarity with, you know, what it’s capable of and what to expect there.”

This open-ended approach is going to be a common theme among both colleges and K-12 schools, as a one-size-fits-all solution is unlikely.

“There’s an absence of clear directions from experts in the field,” Kotran said. “We’re in conversation with many of the organizations ourselves; we’re thinking about how do we put together guidelines and guidance for schools to reference, but right now there’s no high-level report from the Department of Education,” or a similar organization.

While educational institutions are often criticized for how slowly they move, especially on new topics, teachers have been quick to act on AI and to learn even without standard guidance.

Drozda attended a conference in North Carolina where teachers gathered to talk about data literacy and AI in the classroom. 

“It was a summer boot camp where we will have like longer conversations about these things and talk about them in more depth,” he said.

Recently, the AI Education Project announced a partnership with Prince George’s County Public Schools in Maryland to bring AI education to teachers, students and school leaders, including professional learning opportunities and outreach programs.

At the summer launch, more than 200 people showed up.

“We have been very pleased, but I’m surprised with the level of engagement and productivity on the part of educators and education leaders. We have not had a single meeting with a school district where it’s not clear that they urgently want to solve this problem, or this question. I think the challenge is we just don’t have enough time in the day to meet with all the schools that want to meet with us. And so we can’t do this alone,” Kotran said.