Jason D. Lee
“firstname”+“lastname”@princeton.edu or WeChat
Google Scholar and Twitter

About Me

I am an assistant professor of Electrical Engineering, with a secondary appointment in Computer Science, at Princeton University and a member of the Theoretical Machine Learning Group. Previously, I was a member of the IAS and an assistant professor at USC for three years. Before that, I was a postdoc in the Computer Science Department at UC Berkeley, working with Michael I. Jordan and collaborating with Ben Recht. I received my PhD in Applied Math, advised by Trevor Hastie and Jonathan Taylor, and a BS in Mathematics from Duke University, advised by Mauro Maggioni. I am a native of Cupertino, CA.

My research interests are broadly in:

  • Foundations of Deep Learning (slides)(video)

  • Representation Learning (slides)(video)

  • Foundations of Deep Reinforcement Learning (slides)(video 1)(video 2)

Princeton PhD students interested in machine learning, statistics, or optimization research should contact me; I advise students in Computer Science, Electrical Engineering, Math, ORFE, and PACM. I am recruiting PhD students and postdoctoral scholars starting in 2021 at Princeton University; please email me a CV to apply.

My current research focuses on the foundations of deep learning, representation learning, and deep reinforcement learning. See my talk at MIT (slides and video) or my tutorial at the Simons Institute (tutorial slides and video).

I am also happy to host remote visitors. Summer visitors should contact me around February to schedule their visit. See a list of past visitors here.

Awards

  • Sloan Research Fellow in Computer Science, Alfred P. Sloan Foundation

  • Finalist for the Best Paper Prize for Young Researchers in Continuous Optimization

  • Best Paper Award at the ICML 2018 Workshop on Nonconvex Optimization for ML, for "Algorithmic Regularization in Learning Deep Homogeneous Models: Layers are Automatically Balanced"

  • NIPS 2016 Best Student Paper Award for "Matrix Completion has No Spurious Local Minimum"

Selected Publications

  • Sanity-Checking Pruning Methods: Random Tickets can Win the Jackpot.
    Jingtong Su, Yihang Chen, Tianle Cai, Tianhao Wu, Ruiqi Gao, Liwei Wang, and Jason D. Lee.
    NeurIPS 2020.

  • Predicting What You Already Know Helps: Provable Self-Supervised Learning.
    Jason D. Lee, Qi Lei, Nikunj Saunshi, and Jiacheng Zhuo.

  • Towards Understanding Hierarchical Learning: Benefits of Neural Representations.
    Minshuo Chen, Yu Bai, Jason D. Lee, Tuo Zhao, Huan Wang, Caiming Xiong, and Richard Socher.
    NeurIPS 2020.

  • Few-Shot Learning via Learning the Representation, Provably.
    Simon S. Du, Wei Hu, Sham M. Kakade, Jason D. Lee, and Qi Lei.

  • Shape Matters: Understanding the Implicit Bias of the Noise Covariance.
    Jeff Z. HaoChen, Colin Wei, Jason D. Lee, and Tengyu Ma.

  • Beyond Linearization: On Quadratic and Higher-Order Approximation of Wide Neural Networks.
    Yu Bai and Jason D. Lee.
    ICLR 2020.

  • On the Theory of Policy Gradient Methods: Optimality, Approximation, and Distribution Shift.
    Alekh Agarwal, Sham M. Kakade, Jason D. Lee, and Gaurav Mahajan.
    JMLR.

  • Gradient Descent Finds Global Minima of Deep Neural Networks.
    Simon S. Du, Jason D. Lee, Haochuan Li, Liwei Wang, and Xiyu Zhai.
    ICML 2019.

  • Gradient Descent Converges to Minimizers.
    Jason D. Lee, Max Simchowitz, Michael I. Jordan, and Benjamin Recht.
    COLT 2016.

  • Matrix Completion has No Spurious Local Minimum.
    Rong Ge, Jason D. Lee, and Tengyu Ma.
    Best Student Paper Award at NeurIPS 2016.

  • Exact Post-Selection Inference with the Lasso.
    Jason D. Lee, Dennis L. Sun, Yuekai Sun, and Jonathan Taylor.
    Annals of Statistics 2016.
