Photo: A doctor viewing an X-ray.
New GPU computing nodes and consulting help researchers harness deep learning

Milan Sonka has spent decades using artificial intelligence to process and analyze biomedical images—MRIs, CTs, ultrasounds, and other tools used to diagnose and treat disease. Now he and colleagues want to help other investigators apply artificial intelligence (AI) to a host of questions.  

They’ve teamed up with ITS Research Services to expand graphics processing unit (GPU) computing capabilities, providing relatively low-cost solutions for AI and other applications that require significant computing power. They’re also starting a project to advise investigators interested in using AI methods. 

Faster, cheaper hardware, vast collections of data, and advances in techniques like deep learning make AI accessible to researchers across disciplines, says Sonka, professor of electrical and computer engineering, co-director of the Iowa Institute for Biomedical Imaging, and College of Engineering associate dean for graduate programs and research. 

“The entire research community is seeing a revolution in how data are shared, used, and analyzed,” Sonka notes. “This includes researchers who did not traditionally use quantitative techniques, data mining, machine learning, or predictive models.”

Reading biomedical images 

Current work by Sonka’s team illustrates the potential of AI-assisted research. Among other projects, they’re developing ways to segment images of tumors and other anomalies more clearly, predict heart-transplant complications from images and biomarkers, and predict intensive care unit stays from images of the psoas muscle, which runs from the spine to the hip. 

Each of these projects demands huge volumes of data, software to sift through it and spot patterns, and a massive amount of computing power. GPU computing, which uses consumer graphics cards built for everyday image rendering to do work usually handled by enterprise-grade hardware costing 10 times as much, can address the latter need. 

“GPU hardware combined with increasingly available data enables machine learning at a level not possible before,” Sonka says. GPU makes AI-assisted research feasible, fast, and effective—at least for teams that know where to begin. 
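To make that concrete, the sketch below is a minimal, hypothetical PyTorch training step (an illustration only, not code from Sonka’s team): the same few lines run on a CPU but move to a GPU automatically when one is available, which is where the speedup comes from.

# Hypothetical illustration, not code from the Sonka lab: one training step
# of a tiny classifier that uses a GPU when the machine provides one.
import torch
import torch.nn as nn

# Use the GPU if present; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy model standing in for a real deep learning network.
model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 2)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Random tensors standing in for image-derived features and labels.
x = torch.randn(64, 256, device=device)
y = torch.randint(0, 2, (64,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"ran on {device}, loss = {loss.item():.4f}")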

Meeting research community demand 

Sonka and other College of Engineering faculty members were quick to realize the potential of GPU computing. For help with hardware, they turned to Danny Tang, the college’s chief technology officer. 

“Researchers started coming to us wanting to build computers that used consumer-grade graphics cards to run deep learning applications,” he says. Rather than building systems on the fly, Tang worked with Ben Rogers, senior director of Research Services, to explore larger options. 

The result is a scalable, sustainable GPU extension of the Argon high-performance computing cluster designed for AI-assisted research and other projects.

“GPU computing commonly provides results two to 20 times faster than other methods,” says Rogers. “The two areas where we’re seeing the biggest impact at present are in deep learning and molecular modeling, but this is likely to grow as other codes are optimized.”
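A toy benchmark like the hypothetical sketch below (not an official Argon measurement) shows where that gap comes from: the same large matrix multiplication, the core operation in deep learning, is timed on the CPU and then on a GPU when one is present.

# Hypothetical toy benchmark, not an Argon measurement: time the same
# matrix multiplication on the CPU and, if available, on a GPU.
import time
import torch

def time_matmul(device, n=4096, reps=10):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for setup to finish on the GPU
    start = time.perf_counter()
    for _ in range(reps):
        a @ b
    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU work to complete
    return (time.perf_counter() - start) / reps

cpu_time = time_matmul(torch.device("cpu"))
print(f"CPU: {cpu_time:.3f} s per multiply")
if torch.cuda.is_available():
    gpu_time = time_matmul(torch.device("cuda"))
    print(f"GPU: {gpu_time:.3f} s per multiply ({cpu_time / gpu_time:.1f}x faster)")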

Bridging ideas and hardware 

“Research groups realize the value of these approaches, but some may lack the training to use them,” Sonka says. To take AI-assisted research at Iowa further, he and Tang are working with colleagues to establish the Engineering Initiative for Artificial Intelligence. 

The project will provide consulting and hardware for engineering faculty and, eventually, other investigators interested in artificial intelligence. Still in its formative stages, the initiative aims to help these researchers get started with AI.  

“Work with these tools is booming, but not everyone knows how to begin. We want to bridge the gap between ideas and hardware,” Tang says. 

“We think this is the right thing to do and the right time to do it,” he says, noting MIT’s recent announcement that it will spend $1 billion to create a new college dedicated to artificial intelligence. “AI and deep learning are taking over the world.” 

Year in Review 2018
This story is part of a Year in Review highlighting some of OneIT's accomplishments from 2018.