Jinchi Lv

McAlister Associate Professor in Business Administration
Data Sciences and Operations Department
Marshall School of Business
University of Southern California
Los Angeles, CA 90089

Associate Fellow
USC Dornsife Institute for New Economic Thinking (INET)


jinchilv (at) marshall.usc.edu
Office: BRI 400A
Phone: (213) 740-6603
Fax: (213) 740-7313

 

Short bio [CV]

Jinchi Lv is the McAlister Associate Professor in Business Administration in the Data Sciences and Operations Department of the Marshall School of Business at the University of Southern California, and an Associate Fellow of the USC Dornsife Institute for New Economic Thinking (INET). He received his Ph.D. in Mathematics from Princeton University in 2007 under the supervision of Professor Jianqing Fan. His research interests include deep learning, personalized medicine and choices, selective inference and false discovery rate control, networks, high-dimensional statistics, big data problems, statistical machine learning, neuroscience and business applications, and financial econometrics.

His papers have appeared in journals in statistics, economics, information theory, biology, and computer science, and one of them was published as a Discussion Paper in the Journal of the Royal Statistical Society Series B (2008). He serves as an associate editor of the Annals of Statistics (2013-present) and Statistica Sinica (2008-present). He is the recipient of the Royal Statistical Society Guy Medal in Bronze (2015), the NSF Faculty Early Career Development (CAREER) Award (2010), the USC Marshall Dean's Award for Research Excellence (2009), and a Zumberge Individual Award from USC's James H. Zumberge Faculty Research and Innovation Fund (2008).


 
Representative Publications
  • Candès, E. J., Fan, Y., Janson, L. and Lv, J. (2016). Panning for gold: Model-free knockoffs for high-dimensional controlled variable selection. Manuscript. [PDF]

    [Finding the key causal factors in large-scale applications goes well beyond the task of prediction. Quantifying the variability, reliability, and reproducibility of a set of discovered factors is central to enabling valid and credible scientific discoveries and investigations. How can we design a variable selection procedure for high-dimensional nonlinear models with a statistical guarantee that the fraction of false discoveries is controlled? This paper provides some surprising insights into this open question.]
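
    A minimal sketch in Python of the selection step, assuming the knockoff
    statistics W_j have already been computed (constructing the model-free
    knockoff variables themselves is the subject of the paper): the
    data-dependent knockoff+ style threshold below keeps the estimated
    fraction of false discoveries under a target level q.

      import numpy as np

      def knockoff_select(W, q=0.1):
          """Select features from knockoff statistics W at target FDR q.

          W[j] > 0 suggests feature j beats its knockoff copy; we pick
          the smallest threshold whose estimated FDP is at most q.
          """
          for t in np.sort(np.abs(W[W != 0])):  # candidate thresholds
              fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
              if fdp_hat <= q:
                  return np.where(W >= t)[0]
          return np.array([], dtype=int)  # no threshold works: select none

      # toy usage: 50 null statistics plus 10 strong signals
      rng = np.random.default_rng(0)
      W = np.concatenate([rng.normal(0, 1, 50), rng.normal(5, 1, 10)])
      print(knockoff_select(W, q=0.2))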

  • Ren, Z., Kang, Y., Fan, Y. and Lv, J. (2016). Tuning-free heterogeneity pursuit in massive networks. Manuscript. [PDF]

    [Heterogeneity is a major feature of large-scale data sets in the big data era, powering meaningful scientific discoveries through the understanding of important differences among subpopulations of interest. How can we uncover the heterogeneity among a large collection of networks in a tuning-free yet statistically optimal fashion? This paper provides some surprising insights into this question.]

  • Fan, Y., Kong, Y., Li, D. and Lv, J. (2016). Interaction pursuit with feature screening and selection. Manuscript. [PDF]

    [Understanding how features interact with each other is of paramount importance in many scientific discoveries and contemporary applications. To discover important interactions among features in high dimensions, it has been conventional to resort to structural constraints such as the heredity assumption. Yet some key causal factors become active only when acting jointly, not when acting alone. How can we go beyond such structural assumptions for better flexibility in real applications? This paper provides some surprising insights into this question.]
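
    As a toy illustration of heredity-free interaction screening in this
    spirit (the ranking statistic here is a generic placeholder, not the
    paper's procedure), one can score every pairwise product by its marginal
    correlation with the response, so an interaction can surface even when
    neither parent variable has a main effect.

      import numpy as np
      from itertools import combinations

      def screen_interactions(X, y, top_k=10):
          """Rank all pairwise products X_j * X_k by |corr| with y.

          No heredity constraint: a pair can be kept even if neither
          variable alone correlates with the response.
          """
          n, p = X.shape
          scored = []
          for j, k in combinations(range(p), 2):
              z = X[:, j] * X[:, k]
              if z.std() > 0:
                  scored.append((abs(np.corrcoef(z, y)[0, 1]), (j, k)))
          scored.sort(reverse=True)
          return [pair for _, pair in scored[:top_k]]

      # toy example: y depends only on the product X_0 * X_1
      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 20))
      y = X[:, 0] * X[:, 1] + 0.5 * rng.normal(size=200)
      print(screen_interactions(X, y, top_k=3))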

  • Chen, K., Uematsu, Y., Lin, W., Fan, Y. and Lv, J. (2016). Sparse orthogonal factor regression. Manuscript. [PDF]

    [How are memory states with different time constants encoded in different brain regions? How can we determine the number of key memory components? Understanding the meaningful associations among a large number of responses and predictors is key to many such contemporary scientific studies and investigations. This paper provides a unified framework that enables us to probe the large-scale response-predictor association networks through different layers of latent factors with interpretability and orthogonality.]

  • Fan, Y. and Lv, J. (2016). Innovated scalable efficient estimation in ultra-large Gaussian graphical models. The Annals of Statistics 44, 2098–2126. [PDF]

    [Large precision matrix estimation has long been perceived as fundamentally different from large covariance matrix estimation. What if we could innovate the data matrix and convert the former into the latter? This paper provides a surprisingly simple procedure for this purpose that comes with extreme scalability and statistical guarantees.]
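
    A numerical sanity check of the innovation idea, using a known Sigma
    purely for illustration (the paper's contribution is achieving this
    scalably when the precision matrix Omega = inverse of Sigma is unknown):
    multiplying the data matrix by Omega yields innovated data whose
    covariance is Omega itself, so precision matrix estimation becomes
    covariance matrix estimation.

      import numpy as np

      rng = np.random.default_rng(2)
      p, n = 5, 100000

      # an AR(1)-style covariance and its precision matrix
      Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
      Omega = np.linalg.inv(Sigma)

      # X has covariance Sigma; the innovated data X @ Omega has covariance
      # Omega, since Cov(X Omega) = Omega' Sigma Omega = Omega
      X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
      X_tilde = X @ Omega

      print(np.round(np.cov(X_tilde, rowvar=False), 2))  # close to Omega
      print(np.round(Omega, 2))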

  • Lv, J. and Liu, J. S. (2014). Model selection principles in misspecified models. Journal of the Royal Statistical Society Series B 76, 141–167. [PDF]

    [There has been a long debate on whether the AIC or BIC principle dominates the other for model selection (in correctly specified models). It is well known that “all models are wrong, but some are more useful than others.” How does model misspecification impact model selection? This paper provides some surprising insights into this fundamental question and unveils that both principles can work in harmony through a Bayesian view.]
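
    For reference, the two principles in their textbook form for a Gaussian
    linear model; this generic sketch is not the paper's generalized
    criteria for misspecified models.

      import numpy as np

      def aic_bic(X, y):
          """AIC and BIC for an OLS fit with Gaussian errors."""
          n, k = X.shape
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          rss = np.sum((y - X @ beta) ** 2)
          loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
          aic = 2 * (k + 1) - 2 * loglik          # +1 for the error variance
          bic = (k + 1) * np.log(n) - 2 * loglik  # heavier penalty as n grows
          return aic, bic

      # compare nested candidate models; the truth uses only column 0
      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 3))
      y = X[:, 0] + 0.1 * rng.normal(size=200)
      for cols in ([0], [0, 1], [0, 1, 2]):
          print(cols, aic_bic(X[:, cols], y))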

  • Fan, Y. and Lv, J. (2013). Asymptotic equivalence of regularization methods in thresholded parameter space. Journal of the American Statistical Association 108, 1044–1061. [PDF]

    [There has been a long debate on whether convex or nonconvex regularization methods dominate one another. What if both classes of methods are close to each other when viewed from a new angle? This paper unveils a surprising small-world phenomenon that sheds light on this question.]

  • Lv, J. (2013). Impacts of high dimensionality in finite samples. The Annals of Statistics 41, 2236–2262. [PDF]

    [What are the fundamental impacts of high dimensionality in finite samples? This paper provides both a probabilistic view and a nonprobabilistic (geometric) view that are in harmony.]

  • Fan, J. and Lv, J. (2008). Sure independence screening for ultrahigh dimensional feature space (with discussion). Journal of the Royal Statistical Society Series B 70, 849–911. [PDF]

    [Independence learning has long been applied widely and routinely when the scale of the data set becomes extremely large. What are the theoretical foundations for such a class of methods? This paper provides some surprising insights into this question and initiates a new line of statistical thinking on large-scale statistical learning.]
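
    A minimal sketch of sure independence screening as the paper describes
    it: rank features by the magnitude of their marginal correlation with
    the response and keep the top d (d on the order of n/log n is a choice
    discussed in the paper) before applying a refined method to the
    survivors.

      import numpy as np

      def sis(X, y, d=None):
          """Sure independence screening: keep the d features whose
          marginal correlation with y is largest in magnitude."""
          n, p = X.shape
          if d is None:
              d = int(n / np.log(n))
          Xs = (X - X.mean(0)) / X.std(0)
          ys = (y - y.mean()) / y.std()
          omega = np.abs(Xs.T @ ys) / n  # componentwise marginal correlations
          return np.argsort(omega)[::-1][:d]

      # toy usage with p >> n; the survivors should include 7 and 42
      rng = np.random.default_rng(4)
      X = rng.normal(size=(100, 1000))
      y = 2 * X[:, 7] - X[:, 42] + rng.normal(size=100)
      print(sis(X, y, d=10))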

  • Fan, J., Fan, Y. and Lv, J. (2008). High dimensional covariance matrix estimation using a factor model. Journal of Econometrics 147, 186–197. [PDF]

    [The simplest low-rank-plus-sparse structure on the covariance matrix is the one induced by a factor model. What are the fundamental differences between large covariance matrix estimation and large precision matrix estimation in this context? This paper provides some surprising insights into this question.]
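
    A compact sketch of a factor-model covariance estimator in the spirit
    of this paper, assuming observable factors: regress each response on
    the factors, then combine the implied low-rank piece with a diagonal
    residual covariance.

      import numpy as np

      def factor_cov(Y, F):
          """Covariance estimate B Cov(F) B' + diag(residual variances),
          with loadings B from OLS of each column of Y on factors F."""
          Fc = F - F.mean(0)
          Yc = Y - Y.mean(0)
          B = np.linalg.lstsq(Fc, Yc, rcond=None)[0].T  # p x K loadings
          resid = Yc - Fc @ B.T
          Sigma_f = np.cov(Fc, rowvar=False)
          return B @ Sigma_f @ B.T + np.diag(resid.var(0))

      # toy usage: 50 responses driven by 3 observed factors
      rng = np.random.default_rng(5)
      n, p, K = 500, 50, 3
      F = rng.normal(size=(n, K))
      B_true = rng.normal(size=(p, K))
      Y = F @ B_true.T + 0.3 * rng.normal(size=(n, p))
      print(np.round(factor_cov(Y, F)[:3, :3], 2))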