Study on enhancing learning methods for AI and machine learning systems wins IEEE award


By Peter Murphy

Published May 21, 2024

A paper authored by Seyyedali Hosseinalipour (Ali Alipour) received the Institute of Electrical and Electronics Engineers (IEEE) Communications Society William R. Bennett Prize. The research could enhance learning methods used by artificial intelligence (AI) and machine learning (ML) systems.


According to IEEE, the award “recognizes outstanding original papers published in IEEE/ACM Transactions on Networking or the IEEE Transactions on Network and Service Management” within the last three years.

Alipour’s research enhances federated learning, a method that allows researchers to train AI and ML systems on data from many devices while keeping private information secure.

AI and ML systems collect data from various sources to enhance their capabilities. This data collection, however, comes with privacy concerns. Data gathered from personal devices like smartphones and other electronics could end up stored in a single location, such as a cloud server. Federated learning allows the data to remain on the device: the only information sent to a central server during training is the AI or ML model parameters.
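In practice, this means each device trains the shared model on its own data and transmits only the resulting parameters, which the server then averages into an updated global model. The sketch below is a minimal illustration of that loop, assuming a FedAvg-style setup with a toy linear model; the function names, the NumPy implementation, and the synthetic data are assumptions for demonstration, not code from the award-winning paper.

```python
# Minimal sketch of plain federated learning (FedAvg-style averaging).
# The toy linear model and all names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def local_training(weights, local_data, lr=0.1, steps=5):
    """Each device refines the model on its own data; the raw data never leaves the device."""
    X, y = local_data
    w = weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)    # gradient of mean squared error
        w -= lr * grad
    return w                                 # only the updated parameters are shared

def server_aggregate(client_weights):
    """The central server averages the parameter updates it receives."""
    return np.mean(client_weights, axis=0)

# Toy setup: three devices, each holding private data generated from the same model.
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    devices.append((X, y))

global_w = np.zeros(2)
for _ in range(10):                                  # communication rounds
    updates = [local_training(global_w, d) for d in devices]
    global_w = server_aggregate(updates)             # only parameters cross the network

print("learned parameters:", global_w)               # approaches [2.0, -1.0]
```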

“Federated learning was first developed by researchers at Google, initially aimed at enhancing next-word prediction for smartphone keyboards,” Alipour says. “Today, technology giants like NVIDIA apply federated learning to sectors such as healthcare, where protecting patient data privacy is crucial.”

Alipour’s new method, multi-stage hybrid federated learning (MH-FL), allows devices to interact with each other before sending any information to a server. The devices can work together to refine the learning process for the AI and ML systems before sharing information with a central server.

“MH-FL introduces an additional layer of flexibility by enabling client-to-client interactions, also known as device-to-device interactions. Our findings indicate that incorporating this degree of freedom can significantly enhance model prediction performance in federated learning,” Alipour says. “Additionally, it contributes to reduced energy consumption and latency, optimizing both the efficiency and effectiveness of the learning process.”
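As a rough illustration of that added degree of freedom, the sketch below groups devices into clusters that first average their updates with one another through device-to-device exchanges, so the server receives only one aggregate per cluster. The clustering, the simple consensus rule, and the reuse of the local_training helper from the previous sketch are assumptions for demonstration only; they are not the paper’s actual MH-FL algorithm.

```python
# Illustrative sketch of the multi-stage idea: devices first mix their updates with
# nearby devices (device-to-device), then each cluster forwards a single aggregate.
# The consensus rule and cluster layout are assumptions, not the MH-FL algorithm itself.
import numpy as np

def d2d_consensus(cluster_updates, rounds=3):
    """Devices in a cluster repeatedly mix parameters with their neighbors
    until they roughly agree on a common cluster-level model."""
    updates = [u.copy() for u in cluster_updates]
    for _ in range(rounds):
        mixed = np.mean(updates, axis=0)             # stand-in for pairwise gossip
        updates = [0.5 * u + 0.5 * mixed for u in updates]
    return np.mean(updates, axis=0)                  # the value one device uploads

def hierarchical_round(global_w, clusters, local_training):
    """One multi-stage round: local training, device-to-device aggregation per cluster,
    then the server averages only the cluster-level results."""
    cluster_models = []
    for cluster in clusters:
        local = [local_training(global_w, device_data) for device_data in cluster]
        cluster_models.append(d2d_consensus(local))
    return np.mean(cluster_models, axis=0)           # far fewer uplink transmissions

# Usage, with the devices and local_training defined in the previous sketch:
# clusters = [devices[:2], devices[2:]]              # two device clusters
# global_w = np.zeros(2)
# for _ in range(10):
#     global_w = hierarchical_round(global_w, clusters, local_training)
```

One intuition for the energy and latency savings Alipour mentions is that short-range device-to-device exchanges can replace many long-range transmissions to the central server.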

Seyyedali Hosseinalipour.

The work in this award-winning paper laid the foundation for Alipour’s subsequent research, and he has continued to develop and explore different federated learning techniques.

“Ali is a leading researcher in the analysis and modeling of modern wireless networks, specifically in the application of machine-learning techniques to the design and implementation of next-generation wireless networks,” says Jon Bird, professor and chair in the Department of Electrical Engineering. “This achievement is especially remarkable for Ali, given his current position as a junior assistant professor.”

The William R. Bennett Prize is competitive: of the potentially thousands of papers considered, just one is selected for the award.

“As researchers, we are always thrilled when our ideas are well-received. I hope to use this excitement as motivation to continue contributing to the rapidly evolving fields of AI and ML,” Alipour says. “I also hope that this enthusiasm not only drives our current research but inspires further innovations and discoveries.”