AUSTIN (KXAN) – Since the launch of OpenAI’s ChatGPT in November 2022, use of artificial intelligence, or AI, has skyrocketed. As a new academic year rapidly approaches for The University of Texas at Austin, faculty are working to balance fostering a learning environment with AI while preventing cheating among college students.

“I don’t think we can just say ‘don’t use it’ and assume students won’t use it,” said Ken Fleishmann, the founding chair of Good Systems. Good Systems is a campus-wide research effort at UT that aims to design and promote AI technologies that benefit society rather than harm it.

“We need to think about how we can incorporate AI into our teaching,” Fleishmann said.

The movement towards promoting AI learning has been a huge topic of discussion at UT; this past January, the institution announced an online master’s program in AI set to begin in spring 2024.

Fleishmann, who will teach a class called Ethics in AI for the master’s AI program, said it is critical for students to familiarize themselves with AI.

AI as an ‘idea generator’

Among the fields heavily impacted by AI is architecture. Kory Bieg, program director of architecture at UT’s School of Architecture, illustrated how easy it is to produce images for architectural design based on a text prompt.

Such imagery is created through the use of AI platforms such as Stable Diffusion, Midjourney or DALL-E.

  • Futuristic, box-like building on a gloomy day
  • White building with lots of windows and an open interior patio
  • Tall building in front of stairs and a tree
  • Building full of windows with walking paths and ramps in front

“If you were to do it using a standard process of sketch, then to 3D model, and then to rendering, that would take months,” Bieg said. “Now you can do it in, like, 30 seconds.”

The way architecture students approach the design process now is much different from how it would have been a year ago.

“Our students are really using (AI) as an idea generator – like (a) sketch tool, essentially – where people can come up with ideas very quickly and render them through this software,” Bieg said.

The concept of AI as an idea generator is also appearing in other departments. Within the McCombs School of Business, Ben Bentzin, an assistant professor of instruction for Marketing, modified assignments to include AI heading into the fall 2023 semester.

“AI is going to take the place of some basic everyday thinking,” Bentzin said. “For students to be competitive, they’re going to need to take advantage of the basic thinking we can get from an AI and apply that in the real world.”

In the marketing class he teaches, Bentzin designed an assignment to gather information about customer likes and dislikes. In the previous fall 2022 semester, a student would merely brainstorm questions off the top of their head with the help of previous class material.

Now the assignment is changed to have students “share the objectives of your project with the AI and generate possible interview questions.”

Marketing Department Associate Professor of Instruction Stephan Walls said he’s also been thinking about utilizing AI to improve the curriculum.

“Most educators come back to the basics of Bloom’s Taxonomy to understand, remember or memorize more complicated learning objectives about synthesis and analysis,” Walls said. “AI tools can help us brainstorm a lot more and a lot bigger variety of learning objectives than what we can typically do just on our own.”

The concern of AI diminishing human creativity

The integration of AI in the classroom also has some professors worried it could affect the learning process for a student.

“If we all as human beings were only using AI tools to come up with answers, we would get less creative over time,” Walls said.

Dr. Samantha Shorey, an assistant professor within the Moody College of Communication, echoed this sentiment.

“My big fear is that the skill that I’m hoping students develop — thinking creatively — (diminishes),” Shorey said. “It has the real potential for us as a species to stop developing these skills that have made us so beautifully human.”

While still very excited about what AI can offer society, Shorey emphasized the importance of where AI efforts should be focused.

“I would love AI to be doing tasks that put people in danger or diminish their humanity,” Shorey said. “What is the problem, what can it solve? If there’s no problem there, then I don’t know if that’s something we should be leaning on.”

UT’s policies and adaptation to AI

Vice Provost for Academic Affairs Art Markman said there have been no changes to academic policies, even with AI now being a significant factor in today’s society.

“University policy has always said that as a student at the university, you are to turn in work that is your own work,” Markman said. “That (provision) is equally true when there is a generative AI system that’s doing the work.”

That doesn’t mean a student is forbidden from using AI in the process of creating or enhancing one’s work. The key, according to Markman, is being transparent about its use.

“At no point should anyone turn in an assignment that uses (AI) without acknowledging how it was used,” Markman said.

But still, the possibility for students to utilize AI in a way that violates academic policies exists.

To catch cheating, the university has been looking into a variety of tools that aim to determine whether an assignment was generated by AI rather than written by the student. One of these tools is Turnitin’s AI detection tool, which is currently undergoing vetting.

These tools have been subject to much controversy regarding their accuracy. Around the end of the spring 2023 semester at Texas A&M University-Commerce, uproar broke out regarding a professor who falsely accused students of using artificial intelligence.

Bentzin also feels AI detection tools aren’t accurate enough to draw such conclusions.

“I’ve used some of these detection tools, and it took my own original text and said it was from an AI, (and) it took some AI-generated language and said it was original. They just don’t work,” Bentzin said.

Rather than focusing on catching students cheating via AI, Fleishmann urges staff and faculty to modify assignments to make them more AI-proof.

“I think one simple thing would be just for a professor to test their assignment themselves, just enter their assignments into (AI), see what it produces,” Fleishmann said. “If it’s able to generate something fairly good, try making the assignment a little bit more nuanced.”

Markman emphasizes the prospect of looking more into the greater positive impact AI can have within education when used properly.

“A combination of a person working with (AI) tools is almost always more effective than either the person alone or the tool alone,” Markman said.