Generative artificial intelligence is rapidly transforming higher education, yet there is limited real-world data on how students use these tools in their learning.
The ITS Office of Teaching, Learning, and Technology (OTLT) is partnering with University of Iowa instructors to fill that gap. Currently, their research spans 41 courses across 13 departments. From spring 2024 to fall 2025, 12,225 UI students participated in the research.
The goal isn’t advocacy, but evidence-based understanding that can support the development of effective instructional strategies.
“AI represents a major shift with both potential benefits and risks for learning, affecting assessments, writing, tutoring and office hours, and student help-seeking behavior,” Jane Russell, director of OTLT Research and Analytics, explains.
Understanding how students interact with AI
Erin Irish, associate professor of biology, teaches Diversity of Form and Function, an introductory biology course, to about 500 students. The course uses an eTextbook with an accompanying online homework platform that, like many publisher platforms, includes an AI chatbot.
“The chatbot gives students another option for seeking help and a starting point for problem solving,” Irish says. “They could come to me, their TA, or the machine in front of them for assistance.”
To better understand students’ interactions with AI, OTLT analyzed hundreds of conversations between students and chatbots, digging into how these exchanges unfold. They examined the types of questions students ask, how students navigate the interactions, and what engagement patterns may be associated with stronger learning outcomes.
Beyond conversation data, they used voluntary surveys to gather input from thousands of UI students on AI’s role in their learning, including expectations, concerns, and ideas for how AI could be integrated into the classroom.
“Through our work with Jane’s team, we’re discovering that while students find asking AI to be more convenient, they’re approaching the responses through a critical lens,” Irish says. “They’re seeing how and when they can use this tool but recognize it isn’t where they stop.”
Heather Dunn, clinical associate professor of nursing, participated in an OTLT-facilitated pilot of Cogniti, a platform designed to let instructors build custom chatbot agents that can be given specific instructions and resources to assist student learning.
“Intentional deployment of AI was crucial for me, and I wanted to weave it into an assignment that promoted active learning,” Dunn says.
She developed a simulated patient encounter. Students interviewed the chatbot, indicating which physical exams and tests they'd perform on a patient presenting with a given set of symptoms, then arrived at a primary diagnosis. The chatbot provided feedback on their interaction, and the students compared its suggestions to the scientific literature, discussing their findings in class.
At the end of the semester, OTLT reviewed the data with Dunn, including the number of conversations, the length of interactions, and usage patterns such as time of day and day of the week.
“The students found a lot of value in the assignment, asking to repeat it in spring 2026,” Dunn says. “I was surprised by the lack of spontaneous engagement, though. The data showed obvious access trends that correlated with when assignments were due, and it’s allowed me to adjust its implementation.”
Early findings
“There are a lot of concerns about academic integrity and student learning and also concerns from students on over-reliance and being wrongly accused of misconduct,” Russell says. “At the same time, companies often make hype-driven claims.”
Of the students surveyed, 35% say that AI is changing their experience as a student. Most students want AI to be restricted in some way in their courses to protect academic integrity while still allowing for assistance when appropriate.
Preliminary findings indicate that the nature of students’ dialogues with AI is critical. The most common uses include concept explanation and additional practice.
Students also tend to turn to AI when human help isn’t available. Peak usage occurred between 8 p.m. and midnight on Mondays and Tuesdays, with a noticeable spike on Sundays. While students rated the quality of AI assistance lower than that of human help, they valued the convenience and reported feeling more comfortable seeking support from AI tutors.
The team presented more in-depth findings, along with guidance on how instructors can apply the data to their teaching strategies, at several campus events. They also shared the work more broadly at the Learning Analytics and Knowledge (LAK) conference and in a paper based on the research.
“We have amazing data available, and with the right questions, it can tell us so much about our students and their learning,” Irish says. “I’m happy to participate in this research and build data sets that can potentially help other instructors.”