(August, 2023) Gave a talk about Self-supervised Learning at Stanford University.
(June, 2023) Our LibAUC Library for Deep Learning has received a major update.
(May, 2023) Congratulations to my students Zhuoning Yuan and Dixian Zhu for successfully defending their dissertations. Zhuoning will join Netflix as a Research Scientist and Dixian will join Stanford University as a postdoc.
(March, 2023) Our LibAUC Library for Deep Learning has been used more than 25,000 times by students, researchers, and practitioners in their projects as of March 30, 2023!
(Feb, 2023) I am now an IEEE senior member.
(Feb, 2023) Gave a talk at MD Anderson Cancer Center about our LibAUC Library.
(Feb, 2023) Gave a talk at Rice University about Distributionally Robust Optimization (DRO) for ML/AI.
(January, 2023) Gave a talk at UT Southwestern Medical Center about our LibAUC library.
(January, 2023) Special Issue about Federated Learning. Please consider submitting your great work!
(October, 2022) Our paper "Contrastive Learning and Subtyping of Post-Covid-19 Lung Computed Tomography Images", a collaboration with physicians, was published in Frontiers in Physiology!
(September, 2022) Gave talks about LibAUC and X-risk optimization at UTHealth and Amazon.
(August, 2022) Our solution to the Stanford CheXpert competition based on Deep AUC Maximization (implemented in the LibAUC library) has held 1st place on the leaderboard for two years!
(June 20, 2022) Glad to have given a tutorial about Deep AUC Maximization at CVPR with Yiming Ying, Harikrishna Narasimhan, and Mingrui Liu. Please check the website.
(June, 2022) Gave talks at NEC Labs America and Snap Inc. about our LibAUC library. Please check the Slides and website.
(May 2022) In collaboration with Engineering faculty Venanzio Cichella, our proposal on using ML/optimization to improve robotic motion planning received an Amazon Research Award.
(April 2022) I will serve as Area Chair for NeurIPS 2022.
(April 2022) Glad to be featured in The Daily Iowan for our NSF-Amazon Project on Fair AI.
(Mar. 2022) I am excited to join Google Brain as a visiting researcher for six months working on self-supervised learning. I am hosted by Denny Zhou.
(Feb 2022) We will give a tutorial about "Deep AUC Maximization: From Algorithms to Practice" at CVPR 2022. Stay Tuned.
(Feb 2022) Invited to serve as Area Chair of ICML 2022.
(Jan 2022) Awarded: an NSF-Amazon Joint Grant ($800K) for Fair AI, with Co-PIs Qihang Lin and Mingxuan Sun.
(Jan 2022) A demo on Interpretable X-ray Image Classification was accepted to the SPIE Medical Imaging conference. The demo was made by Alan Sorrill and Gang Li.
(Jan 2022) One paper about Improved Convergence for AUPRC maximization was accepted to AISTATS 2022.
(Jan 2022) One paper about Compositional Training for Deep AUC maximization was accepted to ICLR 2022.
(Jan 2022) I gave my first invited talk of 2022 at UberAI, about LibAUC and AUPRC maximization.
I am on the list of the top 2% most-cited scientists across disciplines compiled by Stanford University. Check here.
(Oct 2021) Gave Invited Talks at INFORMS about Deep AUC Maximization.
(Sep 2021) Five awesome papers were accepted to NeurIPS 2021, including the MIT AI Cures winning method based on deep AUPRC maximization, our novel stochastic approach for DRO, and three other interesting works on online learning.
(Sep 2021) Gave a TechTalk at ACM@UIOWA about Deep AUC Maximization. Check the slides.
(Sep 2021) Our work on deep learning with imbalanced data received an NSF award from the NSF RI program.
(June 2021) I will give an invited talk at Google about our recent work on deep AUC Maximization! Check the Slides.
(June 2021) My former PhD student Mingrui Liu will join CS@George Mason University in Fall 2021! While in my group, Mingrui did great work on min-max optimization and its applications in ML, e.g., GANs and deep AUC Maximization. He will hire several research assistants at GMU. Contact him if you are interested!
(May 2021) Recently, we developed a simple and intuitive proof of the convergence of Adam (the widely used practical Adam, not the variant analyzed in the original paper). The key is an increasing or large momentum parameter for tracking the first-order moment. Our new analysis of Adam enables one to develop variants of Adam for more difficult problems, including primal-dual Adam for min-max and bilevel problems, and compositional Adam for compositional problems. Please check the preprint for more details.
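To make the key idea concrete, here is a minimal sketch of an Adam-style update in which the first-moment momentum parameter grows toward 1 over iterations. The specific schedule beta1_t, the omitted bias correction, and the function names are illustrative assumptions for exposition, not the exact algorithm or notation from the preprint.

```python
import numpy as np

def adam_increasing_momentum(grad_fn, x0, num_steps=1000, lr=1e-3,
                             beta2=0.999, eps=1e-8):
    """Adam-style iteration whose first-moment parameter beta1_t increases
    toward 1 over time (illustrative schedule; bias correction omitted)."""
    x = np.array(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (momentum) estimate
    v = np.zeros_like(x)  # second-moment estimate
    for t in range(1, num_steps + 1):
        g = grad_fn(x)
        beta1_t = 1.0 - 1.0 / (t + 1)          # increasing momentum parameter (assumed schedule)
        m = beta1_t * m + (1.0 - beta1_t) * g  # track the first-order moment
        v = beta2 * v + (1.0 - beta2) * g * g
        x = x - lr * m / (np.sqrt(v) + eps)
    return x

# Usage: minimize f(x) = 0.5 * ||x||^2, whose gradient is x.
x_min = adam_increasing_momentum(lambda x: x, x0=np.ones(5))
```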
(May 2021) In collaboration with the DIVE lab at TAMU led by Prof. Shuiwang Ji, our LibAUC library (covering AUROC and AUPRC maximization) helped the team achieve 1st place in the MIT AI Cures Challenge. Our AUC maximization algorithms improved AUROC by 2%+ and AUPRC by 5%+ over the baseline models. The MIT AI Cures challenge aims to improve machine learning models for predicting antibacterial properties, which can help fight secondary effects of COVID. Great work by the team members, especially (in alphabetical order) Youzhi Luo@TAMU, Qi Qi@UIowa, Zhao Xu@TAMU, and Zhuoning Yuan@UIowa. Code for stochastic AUPRC maximization will soon be released on our LibAUC website.
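For readers new to the topic, the sketch below illustrates the general idea of optimizing an AUROC surrogate directly: a pairwise squared-hinge loss over positive-negative score pairs, trained with plain PyTorch on synthetic data. This is only a conceptual illustration; it is not the stochastic min-max formulation implemented in LibAUC, and the function name, data, and hyperparameters are made up for the example.

```python
import torch

def pairwise_auc_surrogate(scores, labels, margin=1.0):
    """Squared-hinge pairwise surrogate for AUROC: penalize positive-negative
    score pairs whose gap falls below the margin (illustration only)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    diff = pos.unsqueeze(1) - neg.unsqueeze(0)  # all positive-negative score gaps
    return torch.clamp(margin - diff, min=0).pow(2).mean()

# Usage on a tiny synthetic, imbalanced dataset with a linear scorer.
torch.manual_seed(0)
X = torch.randn(64, 10)
y = (torch.rand(64) > 0.8).long()  # roughly 20% positives
model = torch.nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(100):
    loss = pairwise_auc_surrogate(model(X).squeeze(-1), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```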
(May 2021) Our paper about non-convex non-concave min-max optimization was accepted to JMLR.
(May 2021) We have released a preprint about AUPRC Optimization with Provable Convergence. Please check it here.
(May 2021) Two papers were accepted to ICML 2021.
(April 2021) We have launched a project for developing a library for deep AUC maximization. Please check our project website here.
(April 2021) I will serve as Area Chair for NeurIPS 2021!
(March 2021) Our paper about non-convex concave min-max optimization was accepted to Optimization Methods and Software!
(October 2020) Invited to give a talk "Deep AUC Maximization and Applications in Medical Image Classification" at RPI in November!
(October 2020) Invited to give a talk "Deep AUC Maximization and Applications in Medical Image Classification" at ICONIP 2020 in November!
(October 2020) PhD positions available. If you are interested in deep learning, are familiar with deep learning tools such as TensorFlow and PyTorch, and would like to explore more opportunities for deep learning in medical imaging, please email me!
(September 2020) Three papers were accepted to NeurIPS 2020!
(09/06/2020) Our deep AUC maximization method achieved 1st place in the CheXpert competition (our team is named DeepAUC-v1 ensemble) organized by the ML group at Stanford University. We joined the competition at the end of May. Great effort by my student Zhuoning Yuan; other members contributing to the project include Yan Yan, Zhishuai Guo, and Mingrui Liu. Details will be available soon. Stay tuned!
(September 2020) I will serve as Senior PC for AAAI 2021!
(August 2020) My student Mingrui Liu successfully defended his thesis. Next stop is a postdoc at Boston University. Congratulations!
(June 2020) Three papers were accepted to ICML 2020!
(March 2020) My postdoc Yan Yan will join Washington State University as an assistant professor in Fall 2020. Yan has done great work on min-max optimization. Congratulations to him!
(March 2019) I was named a Dean's Excellence in Research Scholar!
(Feb. 2019) Received the NSF CAREER Award. Thanks to NSF!
(Feb. 2019) 1 paper was accepted to ICLR 2019, and 1 paper was accepted to AISTATS 2019.
(Sep. 2018) 4 papers were accepted to NIPS 2018.
(July 2018) Gave an invited talk on "First-order Stochastic Algorithms for Escaping From Saddle Points in Almost Linear Time" at ISMP, Bordeaux, France.
(June 2018) Gave an invited talk on "First-order Stochastic Algorithms for Escaping From Saddle Points in Almost Linear Time" at Peking University.
(June 2018) Xiaoxuan Zhang and Zhe Li successfully defended their theses.
(May 2018) 1 paper was accepted by KDD 2018, 2 papers were accepted by IJCAI 2018, and 4 papers were accepted by ICML 2018.
(March 2018) Our paper "RSG: Beating Subgradient Method without Smoothness and/or Strong Convexity" was accepted to JMLR with minor revision.
(Dec 2017) Our paper "A Simple Analysis for Exp-concave Empirical Minimization with Arbitrary Convex Regularizer" was accepted to AISTATS 2018 with oral presentation.
(September 2017) 4 papers were accepted to NIPS 2017. Congratulations to my students and co-authors!
(May 2017) Gave a talk "What You Should Know About Machine Learning" at West High School in Iowa City. [slides]
Two papers about "Homotopy Smoothing for Non-smooth optimization" and "Improved Dropout for shallow and deep learning" were accepted by NIPS 2016. Congratulations to my students and co-authors.
Paper "Sparse Learning for Large-scale and High-dimensional Data: A Randomized Convex-concave Optimization Approach" was Accepted by ALT 2016
Paper "Online Asymmetric Active Learning with Imbalanced Data" was Accepted by KDD 2016
Paper "Optimal Stochastic Strongly Convex Optimization with a Logarithmic Number of Projections" was Accepted by UAI 2016
Two Papers Accepted by ICML 2016
Paper "Learning Attributes Equals Multi-Source Domain Generalization" Accepted by CVPR 2016
Paper "Fast and Accurate Refined Nystrom based Kernel SVM" Accepted by AAAI 2016
Paper "Stochastic Optimization for Kernel PCA" Accepted by AAAI 2016
Tutorial "Big Data Analytics: Optimization and Randomization" presented at ACML 2015, Hong Kong. [slides]
Paper "On Data Preconditioning for Regularized Loss Minimization." accepted by Machine Learning Journal
Tutorial "Big Data Analytics: Optimization and Randomization" presented at KDD 2015, August, Sydney. [slides]
Tutorial "Stochastic Optimization for Big Data Analytics" presented at SDM 2014, April, Pennsylvania. [slides]
Invited talk "Randomized Algorithms in Machine Learning" at Applied Mathematical and Computational Sciences Seminar, UIowa. [slides]
Invited talk "Distributed Optimization for Big Data Learning" at Statistic and Actuarial Science Department, UIowa, October 02, 2014
Looking for motivated graduate students. If you are interested in machine learning (e.g., learning and optimization for big data, deep learning), please don't hesitate to contact me.