Navigating AI in Education

Sandiway Fong

Q&A with Sandiway Fong, director of the University of Arizona Human Language Technology Program.

Many professors now ask students to sign an agreement not to use tools like ChatGPT for schoolwork. Is that appropriate?

Naturally, students want to be able to look up concepts and class material. They have been doing that already with Google; AI is just much quicker. It's a question-answering system, which has been a holy grail of the search engine industry for many, many years.

Should schools pursue ways to recognize when students use AI?

Recently, seven U.S. AI companies made an agreement with the government to explore a watermarking system, so people can tell when content is generated by AI. But students understand that; they're not afraid to cut and paste and change a word or two to get around it. We have to be realistic. This genie is out of the bottle, and it would be very naive of us to expect students not to use it, even if we make them promise not to.

Given that, how should educators respond to this technology?

In my courses, I tell students, 'Use anything you like. I don't care whether that's an extraterrestrial, an AI, your friend…' Because my assignments are structured so that students have to think. That should be the ultimate criterion, which puts the onus on professors to design work that can't be accomplished with copy and paste. And this is a potential upside: professors could do better, and students could actually end up learning more.

Data Connects Us Magazine