2023 may be remembered as the year artificial intelligence went mainstream. New tools like ChatGPT made generative AI in particular visible and accessible to consumers.

At first glance, generative AI tools seem capable of uncanny feats. But they also raise concerns about accuracy, intellectual property, plagiarism, and security—issues with especially salient implications for colleges and universities.

And generative AI represents just one facet of artificial intelligence. Iowa investigators want to expand the use of machine learning, data mining, predictive models, and other AI-related research methods and build an AI talent pool.

Surveying all these technologies, University of Iowa leaders and IT pros are imagining a future that balances the potential of AI with a clear-eyed view of its challenges.  

Starting from strategy 

Steve Fleagle, associate vice president and chief information officer, is partnering with other campus leaders and faculty in developing an AI strategy for the university. 

“Technologies and tools are evolving quickly, and it can be hard to keep up,” he says. “We want to offer an overall vision for how we use AI and a framework for supporting AI-related projects across campus.”  

Iowa IT leaders have developed guidelines focused largely on generative AI, or systems that create text, images, or other media in response to user prompts. These systems draw on vast datasets to generate new content based on common patterns and structures. 

Iowa’s guidelines note AI’s capacity for bias, emphasize safeguards for non-public data, and highlight ethical practices (for example, identifying any content or analysis developed using AI). 

IT leaders also have classified potential use cases by risk level. Low-risk applications like using AI to brainstorm ideas are generally encouraged. High-risk applications—e.g., using AI to evaluate bid responses or resumes—are strongly discouraged. 

Campus committees are developing more specific guidance and priorities for AI in research, academics, administrative functions, and health care.  

Introducing new tools 

A generative AI hub on the ITS website offers a central source for usage guidelines; links to info about data classification, research resources, and AI in teaching; and basic facts about some of the most common generative AI tools. 

Most of these tools are publicly available, not university supported, and thus only suitable for use with data that’s readily available to the public. An exception is the enterprise version of Microsoft Copilot, which the university has licensed for use by faculty, staff, and students. 

Microsoft Copilot is an AI chat tool useful for generating ideas, researching topics, creating text content, and more. Campus users should sign into the tool using their HawkIDs. Signing in with your Iowa credentials ensures that Copilot won’t access your data, save your conversation history, or train itself on info you submit. 

“Copilot helps us leverage generative AI to increase productivity without compromising data privacy,” says Manda Marshall, an AI-focused senior IT support consultant in ITS Enterprise Services. “Faculty and staff might use it as a time-saving partner for writing emails, creating presentations, or getting feedback on any written content.” 

Marshall and others are researching additional tools including Microsoft 365 Copilot, a separate product that integrates with other Microsoft 365 applications to offer a virtual assistant. She encourages anyone who uses any generative AI tools to engage them critically.

“Treat their output as drafts, not final products,” she says. “Just as you wouldn’t accept everything a person says at face value, approach generative AI with discernment. Use it as a tool to enhance your own creativity and productivity.”  

Exploring uses in teaching 

The Office of the Provost, Center for Teaching, and ITS Office of Teaching, Learning, and Technology (OTLT) have developed general guidance and evolving resources for addressing generative AI in the classroom. They encourage instructors to learn about AI capabilities, talk with students about permissible uses, and—when appropriate—adapt assignments and assessments for generative AI tools. 

“It’s exciting to explore the possibilities of AI, but it’s essential to recognize its limitations and approach it mindfully,” says Vicky Maloy, OTLT associate director and lead for the Academic Technologies team. 

For faculty looking to explore, she’s developed a series of workshops with Eva Latterner, assistant director of the Center for Teaching. They also facilitate a Generative AI Faculty Interest Group and hold monthly brainstorming sessions at the Main Library. 

“Our work is largely focused on uses for generative AI in teaching and learning settings, but many of our instructional technologies are starting to incorporate AI,” Maloy says. As an example, she points to Gradescope, a feedback and assessment tool that uses AI to group student submissions and streamline grading. 

Understanding AI and its classroom implications can be daunting, but individual instructors shouldn’t feel alone in addressing the challenges and identifying opportunities. 

“We want to provide opportunities to learn and grow in a community,” Maloy says, “exploring new ways to approach this technology and apply it in our work today.” 

Building AI talent 

Though generative AI has recently captured popular headlines and campus imaginations, powerful artificial intelligence tools used in research have been making a meaningful impact for years.

As artificial intelligence continues to mature, Iowa research and IT leaders are creating new opportunities for faculty and staff to conduct research with AI, machine learning (ML), and related methods. They're also in the early stages of developing a talent pipeline to prepare more students trained in AI/ML.

Milan Sonka has spent decades using AI methods to process biomedical images. In 2019, he and colleagues established the Iowa Initiative for Artificial Intelligence (IIAI), a hub for interdisciplinary AI/ML research.  

Faculty, staff, and students affiliated with the IIAI represent engineering, medicine, business, basic sciences, humanities, and more. The initiative has funded 78 pilot projects, examining everything from flood forecasting to corporate-reputation management to patent-law rulings. 

“Shoulder-to-shoulder collaboration with IIAI consultants and domain-expert principal investigators is instrumental to these projects’ successes,” says Sonka, professor of electrical engineering and director of the IIAI. “Several pilots have led to external grant submissions, and a subset have been funded.”

Sonka also co-chairs the campus AI research committee with Joe Hetrick, executive director of ITS Research Services. Their priorities include developing more support for interdisciplinary pilot projects that lead to larger follow-up studies. They also want to identify gaps in guidelines for AI/ML research and related research data. 

Perhaps most importantly, they want to expand student opportunities and train new generations of researchers steeped in AI/ML methods. 

Establishing ways for local companies to fund graduate research assistants—especially Ph.D. students—is one idea. Expanding education for interested students, faculty, and staff alike is another priority. It could mean offering more training sessions or more opportunities for hands-on experience with campus AI/ML experts. 

“We aim to find new sources of support for an effort that’s painfully needed,” Sonka says. “Iowa researchers are looking for staff with AI/ML expertise, and we see similar demand across local and regional industry.”