
Software
ISEE 

Implements the innovated scalable efficient estimation (ISEE) procedure (Fan and Lv, 2016) for scalable estimation of both the innovated data matrix and the precision matrix at ultra-large scales. [R code]
The ISEE framework has a range of applications involving the precision (innovated data) matrix, including:
Linear and nonlinear classifications;
Dimension reduction;
Portfolio management;
Feature screening and simultaneous inference.
Reference:
Fan, Y. and Lv, J. (2016).
Innovated scalable efficient estimation in ultralarge Gaussian graphical models.
The Annals of Statistics 44, 2098–2126.
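The object ISEE estimates can be illustrated in a few lines. This Python sketch computes the innovated transformation X·Omega directly from a known precision matrix Omega; it does not implement the ISEE algorithm itself (whose point is estimating Omega scalably when p is ultra-large), but it shows why the innovated data matrix is useful: its population covariance is Omega.

```python
import numpy as np

# Hedged illustration only: the innovated data matrix is X_tilde = X @ Omega,
# where Omega = Sigma^{-1} is the precision matrix. Here Omega is taken as
# known; ISEE's contribution is estimating it in ultra-large scales.

rng = np.random.default_rng(0)
p = 5
# A simple banded precision matrix (a sparse Gaussian graphical model).
Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Omega)

n = 100000
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
X_tilde = X @ Omega  # innovated data matrix

# Cov(Omega x) = Omega Sigma Omega = Omega, so the empirical covariance of
# the innovated data recovers the precision matrix itself.
emp_cov = X_tilde.T @ X_tilde / n
print(np.max(np.abs(emp_cov - Omega)))  # small for large n
```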


IIS 

Implements the innovated interaction screening (IIS) followed by sparse quadratic discriminant analysis (SQDA), a two-step procedure IIS-SQDA (Fan, Kong, Li and Zheng, 2015) for interaction screening and selection in high-dimensional nonlinear classification, with the innovated transformation of the feature vector estimated by ISEE (Fan and Lv, 2016). [R code]
References:
Fan, Y., Kong, Y., Li, D. and Zheng, Z. (2015).
Innovated interaction screening for highdimensional nonlinear classification.
The Annals of Statistics 43, 1243–1272.
Fan, Y. and Lv, J. (2016).
Innovated scalable efficient estimation in ultralarge Gaussian graphical models.
The Annals of Statistics 44, 2098–2126.
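A small Python sketch of why interactions matter in this classification setting (this is not the IIS screening statistic itself): in quadratic discriminant analysis with class precision matrices Omega1 and Omega2, the quadratic part of the Bayes rule is -0.5 x'(Omega1 - Omega2)x, so the interaction terms x_j x_k that enter the classifier correspond exactly to the nonzero entries of Omega1 - Omega2. Here we simply read those pairs off known matrices; IIS-SQDA's job is to find them in high dimensions from data.

```python
import numpy as np

# Hedged sketch: locate the interaction pairs implied by the difference of
# two known class precision matrices. Not the IIS procedure, just the
# population quantity it targets.

p = 6
Omega1 = np.eye(p)
Omega2 = np.eye(p)
# Classes differ only in the (0, 1) interaction and the variance term of x_2.
Omega2[0, 1] = Omega2[1, 0] = 0.3
Omega2[2, 2] = 2.0

D = Omega1 - Omega2  # drives the quadratic part of the QDA Bayes rule
pairs = [(j, k) for j in range(p) for k in range(j, p) if abs(D[j, k]) > 1e-12]
print(pairs)  # -> [(0, 1), (2, 2)]
```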

RTPS 

Fits high-dimensional sparse generalized linear models (linear, logistic, and Poisson) with regularization methods in the thresholded parameter space (RTPS, Fan and Lv, 2013), where the connections and differences among a spectrum of regularization methods are established. [Matlab code]
Reference:
Fan, Y. and Lv, J. (2013).
Asymptotic equivalence of regularization methods in thresholded parameter space.
Journal of the American Statistical Association 108, 1044–1061.
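A minimal Python sketch of the thresholded-parameter-space idea: estimators are restricted to coefficient vectors whose nonzero entries exceed a threshold in magnitude. Here we simply hard-threshold an ordinary least-squares fit in a low-dimensional linear model; the paper studies general regularized GLM estimators, and the threshold tau below is purely illustrative, not the paper's tuning.

```python
import numpy as np

# Hedged sketch: hard-thresholding confines the estimate to the thresholded
# parameter space {b : b_j = 0 or |b_j| >= tau}. tau = 0.5 is illustrative.

rng = np.random.default_rng(1)
n, p = 200, 10
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]           # sparse truth
X = rng.standard_normal((n, p))
y = X @ beta + 0.5 * rng.standard_normal(n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # unpenalized fit
tau = 0.5                              # illustrative threshold
beta_thr = np.where(np.abs(beta_hat) >= tau, beta_hat, 0.0)

print(np.flatnonzero(beta_thr))        # recovered support
```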


SICA 

Fits high-dimensional sparse linear regression and also produces solutions to sparse recovery using the iterative coordinate ascent (ICA) algorithm (Fan and Lv, 2011), with the smooth integration of counting and absolute deviation (SICA) method (Lv and Fan, 2009). SICA is a regularization method for high-dimensional sparse modeling and sparse recovery. It provides a family of concave penalties that give a smooth homotopy between the L0- and L1-penalties, where the former is the target penalty for sparse recovery and the latter is used in L1-regularization methods such as the Lasso. [Matlab code]
References:
Lv, J. and Fan, Y. (2009).
A unified approach to model selection and sparse recovery using regularized least squares.
The Annals of Statistics 37, 3498–3528.
Fan, J. and Lv, J. (2011).
Nonconcave penalized likelihood with NP-dimensionality.
IEEE Transactions on Information Theory 57, 5467–5484.
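The homotopy between the L0- and L1-penalties is easy to see numerically. The SICA penalty family of Lv and Fan (2009) is rho_a(t) = (a + 1)|t| / (a + |t|) for a > 0; this short Python sketch checks the two limits.

```python
import numpy as np

# SICA penalty (Lv and Fan, 2009): rho_a(t) = (a + 1) * |t| / (a + |t|).
# As a -> infinity it approaches the L1 penalty |t|; as a -> 0+ it approaches
# the (scaled) L0 penalty 1{t != 0}, a smooth homotopy between the two.

def sica(t, a):
    t = np.abs(t)
    return (a + 1.0) * t / (a + t)

t = 2.0
print(sica(t, 1e6))     # ~ |t| = 2.0 (L1 limit)
print(sica(t, 1e-6))    # ~ 1.0 (L0 limit)
print(sica(0.0, 1e-6))  # 0.0 at the origin for every a
```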


SIRS 

Implements the sequentially and iteratively reweighted squares (SIRS) algorithm (Lv and Fan, 2009), which applies SICA for sparse recovery. [Matlab code]
Reference:
Lv, J. and Fan, Y. (2009).
A unified approach to model selection and sparse recovery using regularized least squares.
The Annals of Statistics 37, 3498–3528.
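A Python sketch in the spirit of reweighted squares with the SICA penalty, using a generic local quadratic approximation: weights come from the SICA derivative rho_a'(t) = a(a + 1)/(a + t)^2, and each step solves a weighted ridge system. This is an illustrative scheme, not necessarily the exact SIRS updates of the paper; the choices of a, lam, and the number of iterations are assumptions for the demo.

```python
import numpy as np

# Hedged sketch: iteratively reweighted least squares with weights from the
# SICA penalty derivative rho_a'(t) = a * (a + 1) / (a + t)^2 (local
# quadratic approximation). Not the paper's exact SIRS algorithm.

rng = np.random.default_rng(2)
n, p = 100, 20
beta = np.zeros(p)
beta[0], beta[1] = 3.0, -2.0
X = rng.standard_normal((n, p))
y = X @ beta + 0.3 * rng.standard_normal(n)

a, lam, eps = 0.1, 5.0, 1e-8          # illustrative tuning choices
b, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS start
XtX, Xty = X.T @ X, X.T @ y
for _ in range(20):
    t = np.abs(b)
    # Quadratic-approximation weights lam * rho_a'(t) / t for each coordinate.
    w = lam * a * (a + 1.0) / (a + t) ** 2 / (t + eps)
    b = np.linalg.solve(XtX + np.diag(w), Xty)

print(np.round(b[:2], 2))   # close to the true signal (3, -2)
print(np.max(np.abs(b[2:])))  # noise coefficients shrunk toward zero
```

Because the SICA derivative blows up near zero for small a, small coefficients get ever-larger weights across iterations and collapse toward zero, while large coefficients are nearly unpenalized.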


Note: Please contact me if you are interested in optimizing these codes or developing packages based on them.

