jjzhu

I am the head of an independent research group at the Weierstrass Institute for Applied Analysis and Stochastics, Berlin. Previously, I worked as a postdoctoral researcher in machine learning at the Max Planck Institute for Intelligent Systems in Tübingen, Germany. I received my Ph.D. in optimization and numerical analysis from the University of Florida. See here for a short bio. I also write a non-research blog here, though the update frequency depends on how busy I am at the moment.

Overall, I am interested in computational algorithms and dynamical systems. My group focuses on research in state-of-the-art machine learning and optimization. I started my research career in optimization and subsequently became interested in robust probabilistic machine learning and kernel methods. This line of work requires computational optimization tools that can manipulate probability distributions, which are inherently infinite-dimensional objects. It led me to my current interests in variational methods for machine learning and optimization over probability distributions, rooted in the theory of gradient flows and optimal transport.

For example, in some of my previous work, I developed robust probabilistic ML algorithms that protect against distribution shifts using principled kernel methods. These optimization algorithms have deep theoretical roots in dynamical systems theory, such as PDEs. Since moving to Berlin, I have dedicated my research to interfacing large-scale computational algorithms in machine learning and optimization with dynamical systems theory, such as (PDE) gradient flows and optimal transport. Recently, I became interested in the Hellinger geometry (a.k.a. the Fisher-Rao space) and have collaborated with Alexander Mielke on kernel methods and (Wasserstein-)Fisher-Rao, a.k.a. (spherical) Hellinger-Kantorovich, gradient flows.

To get in touch, click the icon at the bottom of the page.

Open positions

News and updates
