How educators are integrating AI into classrooms
From writing detailed research papers to conducting large-scale data analysis, AI has taken on an unprecedented role in the past decade.
AI has completely altered the field of digital ethics, sparking new conversations and fierce debate. What will remain constant, however, is the presence of AI. Educators at Utah State University are well aware of this fact and are taking steps to adopt AI as a teaching tool. Students and professors can learn exactly how through USU’s AI in Teaching workshops.
Neal Legler is the director of the Center for Instructional Design and Innovation and a presenter for the workshops.
“We’re trying to help faculty become comfortable and familiar with the technology themselves and also become better at helping students use it appropriately and effectively,” Legler said.
These hour-long workshops take place throughout the school year and are accessible via Zoom.
“Most faculty are coming to grips with the fact AI isn’t going away,” Legler said. “They’re wanting to understand how they can work effectively with students — to help students be prepared for the future while still having a good learning experience.”
The workshops currently cover three topics, starting with Basics and Opportunities on Aug. 19. This first workshop introduces generative AI and explains how it works, along with its benefits and drawbacks.
“We introduce them to some of the many different AI-oriented tools available to them and to students,” Legler said. “Then we spend some time talking about the importance of defining their expectations regarding AI for their classes.”
Legler urges educators to weigh their course objectives against the use of AI and consider the benefits and risks of students utilizing it.
“We want them to think about their objectives and understand what it is they’re trying to accomplish,” Legler said. “Then take a look at their assignments, and say, ‘What is it that I can do now? What do I need to do with my assignments, acknowledging the existence of AI, to maximize student contribution?’”
According to a survey conducted by Intelligent.com, nearly a third of college students were using ChatGPT as of February 2024. Of those surveyed, 96% reported using the program for schoolwork, with some using it to write entire essays. As more programs like ChatGPT become widely available, it has become apparent to educators that generative AI is something they’ll have to seriously consider when creating assignments.
“Maybe, instead of just turning in a single essay, they really focus their assignments on the writing process and then grade students on the writing process,” Legler said. “We also encourage them to try doing their assignment with AI, see if they can do it and what happens.”
Faculty then craft their AI policies based on what they’ve learned during the workshop.
“Some faculty want to say AI use is strictly prohibited,” Legler said. “There are also others who say AI use is permitted as directed within specific assignments. Then there are some who say to go ahead and use it — just be sure to use it responsibly.”
Faculty are also taught how to spot unauthorized AI use and what steps to take once it has been detected.
“It’s tricky — this is a hard thing for faculty to work through,” Legler said. “It’s not a fun conversation if you think maybe your student is being dishonest, so we talk through defining your syllabus, the university processes you would need to go through and some things you can do before going through these processes, like talking to the student.”
The workshop then looks toward opportunities for enhancing a class through different AI tools.
“We start to get into how you can enhance and build up your course using AI,” Legler said. “What are some ways that you can use it to give students a more personalized experience?”
The next workshop, held Aug. 20, focused on application. Attendees delved into prompt engineering and ways educators can use it to refine lesson plans and activities.
“We pull up a couple different examples of things you could do with AI,” Legler said. “With ChatGPT, for example, you could create better content or create interactive activities.”
The workshop provides attendees with the chance to ask questions about their own class and come up with ways in which AI could suit their specific objectives.
“We’ll go back and forth asking about their class and just try some things,” Legler said. “We’ll pull up ChatGPT, set up some scenarios and work through them.”
Attendees work through several specific examples, step by step, to better understand AI and how to apply it in a realistic setting.
“We’ll take five specific tasks or things you would do as a teacher, plug in some prompts and use AI to do these tasks,” Legler said. “Maybe we’re using it to create a lesson, create variations of test questions or create an instructional image.”
As AI becomes further entrenched in education and the professional world, new questions about ethical use are inevitable.
Sharad Jones is an assistant professor of data analytics and information systems who has researched ethical AI.
“AI, or machine learning, is trying to find patterns in data and use that to make predictions about the future,” Jones said. “There’s now a fundamental statistics question of, ‘Where did you get that data from? How is that data sampled?’”
Sampling bias is one of the biggest issues in AI, made widely known when online retailer Amazon’s discriminatory AI recruiting tool came to light. The tool prioritized male candidates and penalized the resumes of female applicants; it was quickly discontinued after the extreme bias within the system was uncovered.
“Say, for example, I’m working for Utah State University, and I want to find the best professors I can,” Jones said. “Maybe I base my search on how many publications they’ve had and the amount of grant funding they’ve brought in. I run into a risk — maybe these professors I find have had prior advantages or their parents were professors so they knew the channels to go through.”
Jones frames bias as a statistics problem more than a fault within AI itself. Because AI is typically used to organize large data sets, an age-old statistics problem becomes aggravated.
“This is the core of what ethical AI is discussing,” Jones said. “How can we ethically source, use and leverage data to still do all the things we need to do but in a more ethical and fair way?”
When it comes to mitigating bias, Jones said the resolution is dependent on the situation.
“In the context of building a model to find the best professors, the answer might be concretely defining what it means to be a good professor,” Jones said. “Maybe I need to collect a richer data set or not cast my net as widely.”
To utilize AI in a way that is both productive and ethical, Jones emphasizes keeping humans heavily involved in the creation and output of AI.
“The best thing you can do is keep a human in the loop,” Jones said. “It’s not to say humans aren’t biased, but at least we know who is taking ownership for these decisions and can seek to make a fairer decision.”
As AI continues to evolve from rudimentary programs into machine learning capable of a million different things, more and more educators are recognizing its potential in the classroom.
“It is a tool in the learning process,” Jones said. “It’s a question of understanding what we’re actually trying to teach in these classes. Is my goal to teach them how to write code, or is my goal to teach them how the algorithm is used to construct a statistical model? It should serve to assist in the parts of the process that we’re not directly trying to teach.”
According to Jones, students may need to develop new skills as a consequence of AI’s rise, its proper use and its integration into academics.
“You’re going to see a lot of information now, and you’re going to need to become an exceptional discerner of truth,” Jones said.
Jones said AI is the next step in a long line of technological evolution. From the invention of the printing press to calculators, the internet and now machine learning, educators inevitably must adapt.
“Where it differs from the past is this happened faster than any previous technologies,” Jones said. “Basically overnight, all of a sudden everybody saw the dramatic number of possibilities that increased from this one tool.”
The topic of AI stirs up a lot of debate and fear, especially in relation to education. Legler sees a brighter future for AI and aims to help other educators realize its possibilities.
“The holy grail of education is apprenticeship at scale,” Legler said. “Some of the capabilities of AI facilitate that and make that easier. We’re going to see more solutions and more things coming around to help us do that.”