SIAM News Blog

MDS22 Prize Spotlight: Weijie Su

Weijie Su, University of Pennsylvania, is the 2022 recipient of the SIAM Activity Group on Data Science Early Career Prize, which was awarded at the 2022 SIAM Conference on Mathematics of Data Science (MDS22) on Thursday, September 29 in San Diego, California. He gave a lecture associated with the prize, titled “When Will You Become the Best Reviewer of Your Own Papers? A Mechanism-Design-Based Approach to Estimation,” on Thursday afternoon.

The SIAM Activity Group on Data Science (SIAG/DATA) awards the prize every two years to an outstanding early career researcher in the mathematics of data science for distinguished contributions to the field in the six calendar years prior to the year of the award. This is the first time the prize has been awarded.

Su is an associate professor at the University of Pennsylvania and a co-director of Penn Research in Machine Learning. Prior to joining Penn, he received his Ph.D. from Stanford University in 2016 under the supervision of Emmanuel Candès and his bachelor’s degree from Peking University in 2011. His research interests include deep learning theory, privacy-preserving data analysis, optimization, and high-dimensional statistics. He received the Stanford Theodore W. Anderson Dissertation Award in 2016, an NSF CAREER Award in 2019, a Sloan Research Fellowship in 2020, and the IMS Peter Gavin Hall Prize in 2022.

Q: Why are you excited to receive the SIAG/DATA Early Career Prize?

A: I’m very excited and humbled to receive the SIAG/DATA Early Career Prize. Data science is an exciting field where we can leverage data to solve problems and inform decisions in increasingly complex environments. This raises many interesting mathematical problems for us to solve. I feel lucky that I joined this emerging line of research early on, having the opportunity to work with many wonderful collaborators. In particular, I’m grateful to my Ph.D. advisor Emmanuel Candès and my mentors Cynthia Dwork and Stephen Boyd for sharing their research interests and viewpoints with me, which I believe are reflected in the work recognized by this prize.

On a personal note, I am bringing my family with me to attend MDS22. Winning this prize gives me the best opportunity to convince my eight-year-old son that it is fun to do math, as he will see me receive the prize and give the prize lecture.

Q: Could you tell us a bit about the research that won you the prize?

A: I received the prize for my work on the connection between optimization and ordinary differential equations, and on private data analysis, among other things. One paper of mine, “A differential equation for modeling Nesterov’s accelerated gradient method: theory and insights,” brings continuous-time formulations to the analysis of accelerated optimization methods. This approach can often significantly simplify the analysis of optimization methods and guide their design, by applying tools such as Lyapunov functions to their continuous-time counterparts: ordinary differential equations. In the case of Nesterov’s accelerated gradient method, it yields a simple second-order differential equation whose damping ratio sheds light on how this celebrated algorithm achieves acceleration.
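The correspondence can be illustrated concretely. The sketch below (my own illustration, not code from the paper) runs Nesterov’s accelerated gradient method on a simple quadratic objective and, alongside it, a forward-Euler discretization of the limiting ODE derived in that paper, x''(t) + (3/t) x'(t) + ∇f(x(t)) = 0; the choice of objective f(x) = x²/2 and all step sizes are assumptions made for illustration only.

```python
def grad_f(x):
    # Gradient of the illustrative objective f(x) = 0.5 * x^2.
    return x

def nesterov(x0, step, n_iters):
    # Nesterov's accelerated gradient method with the classical
    # (k - 1) / (k + 2) momentum coefficient.
    x_prev, x = x0, x0
    for k in range(1, n_iters + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)  # momentum extrapolation
        x_prev, x = x, y - step * grad_f(y)       # gradient step at y
    return x

def ode_trajectory(x0, dt, n_steps):
    # Forward-Euler discretization of x'' + (3/t) x' + grad_f(x) = 0,
    # written as a first-order system in (x, v). Start slightly after
    # t = 0 to avoid the 3/t singularity.
    x, v, t = x0, 0.0, dt
    for _ in range(n_steps):
        x, v = x + dt * v, v + dt * (-(3.0 / t) * v - grad_f(x))
        t += dt
    return x

print(nesterov(1.0, 0.1, 200))        # approaches the minimizer 0
print(ode_trajectory(1.0, 0.01, 2000))  # the ODE solution also decays to 0
```

Both trajectories decay toward the minimizer with the same characteristic oscillatory pattern, which is the sense in which the ODE models the discrete algorithm.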

With collaborators, I developed an extension of differential privacy for reasoning about privacy considerations in machine learning. This framework allows one to losslessly track privacy costs for some basic primitives, and as a result, it is especially convenient and sharp when applied to private deep learning. I am now working with some industry collaborators to try to implement it in practice.
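As a concrete example of the kind of basic primitive such a framework tracks privacy costs for, here is a minimal sketch (my own, not from Su’s papers) of the standard Gaussian mechanism, which releases a statistic after adding noise scaled to the statistic’s sensitivity; the function name and parameters are illustrative assumptions.

```python
import random

def gaussian_mechanism(value, sensitivity, sigma, rng=random):
    """Release `value` with Gaussian noise scaled to its sensitivity.

    `sensitivity` bounds how much `value` can change when one
    individual's record changes; a larger `sigma` gives stronger
    privacy at the cost of a noisier output.
    """
    return value + rng.gauss(0.0, sensitivity * sigma)

# Privatize a mean over records in [0, 1]: replacing one of n records
# changes the mean by at most 1/n, so the sensitivity is 1/n.
data = [0.2, 0.4, 0.6, 0.8]
true_mean = sum(data) / len(data)
noisy_mean = gaussian_mechanism(true_mean, sensitivity=1 / len(data), sigma=1.0)
print(noisy_mean)  # close to the true mean 0.5, perturbed by noise
```

Composing many such noisy releases is where a sharp accounting framework matters: loose bookkeeping of the accumulated privacy cost forces more noise than necessary, which is what a lossless analysis avoids.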

Q: What does your work mean to the public?

A: Optimization is a workhorse in many successful applications of machine learning and deep learning, and it is always crucial to accelerate optimization methods if possible, in order to save computational costs. My work on optimization can often provide a simple perspective on these methods when one wants to analyze them.

One concern about deploying machine learning technologies, however, is privacy; specifically, that our data on location, web search histories, media consumption, and social networks can be exploited by adversarial attackers. Theoretical analysis shows that my work can improve the predictive performance of machine learning models, though, as is often the case, substantial modifications are needed to carry the improvement over to internet applications. I find this direction exciting.

Q: What does being a member of SIAM mean to you?

A: I very much enjoy being a member of the SIAM community. It offers many opportunities and resources for me to interact with people of similar research interests and to learn about various viewpoints from different but related disciplines.
