Seminar:
Federated Learning for Healthcare

When:
11:00 am
Wednesday, November 20, 2024
Where:
Room 3107
Patrick F. Taylor Hall

ABSTRACT

As healthcare facilities collect ever-growing volumes of health data, these data often remain siloed by policies and regulations designed to protect patient privacy. Federated learning has emerged as a solution, enabling institutions to train models collaboratively without sharing data. However, existing approaches often overlook the heterogeneity unique to healthcare, which can exacerbate health disparities. In this talk, I will present my work on addressing healthcare heterogeneity through a novel asymmetrical reciprocity learning approach that helps reduce these disparities. I will also discuss how medical foundation models can significantly enhance federated learning performance, especially when training data are limited. Current models, however, struggle to process multimodal, multi-sourced, and private health data effectively. To tackle this, I introduce a new task: federated medical knowledge injection. I will share a benchmark for this task, along with an advanced mixture-of-experts-based parameter-efficient fine-tuning strategy that enables more robust performance on diverse medical datasets.
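As background for attendees new to the topic, the sketch below illustrates the core federated learning idea the abstract refers to: each institution trains on its own data and only model parameters are aggregated centrally, weighted by local dataset size (federated averaging). The function and variable names are illustrative assumptions for this announcement, not the speaker's implementation.

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Aggregate client model parameters, weighted by local dataset size.

    client_weights: list of dicts mapping parameter name -> np.ndarray
    client_sizes:   list of local training-set sizes, in the same order
    """
    total = sum(client_sizes)
    global_weights = {}
    for name in client_weights[0]:
        # Weighted average of each parameter across clients.
        global_weights[name] = sum(
            (n / total) * w[name] for w, n in zip(client_weights, client_sizes)
        )
    return global_weights

# Each hospital trains locally and shares only parameters, never patient records.
hospital_a = {"layer1": np.array([0.2, 0.4])}
hospital_b = {"layer1": np.array([0.6, 0.8])}
print(federated_average([hospital_a, hospital_b], client_sizes=[100, 300]))
```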

Jiaqi Wang

Pennsylvania State University

Jiaqi Wang is a Ph.D. candidate in the College of Information Sciences and Technology at The Pennsylvania State University. His research focuses on federated learning, healthcare informatics, and multimodal foundation models. His work has been published at leading conferences such as NeurIPS, ICML, KDD, IJCAI, AAAI, EMNLP, and ACL. Jiaqi holds a B.E. from Zhejiang University and an M.S. from the University of Georgia. For more information, please visit his website: https://jackqqwang.github.io.