Backward Elimination Algorithm for High Dimensional Variable Screening
Variable selection in high-dimensional data has become a challenging problem. We investigate a popular but classical variable screening method, Backward Elimination (BE), in a high-dimensional (small-n-large-P) setup. As a variable screening method, BE reduces small-n-large-P data to a lower-dimensional dataset, after which established shrinkage methods such as LASSO, SCAD, and MCP can be applied directly. To address the difficulties of high-dimensional data, Chen and Chen (2008) developed a family of Extended Bayesian Information Criteria (EBIC) that is selection-consistent and has good finite-sample properties; we use EBIC in this study to select the best candidate model from the models generated by the proposed BE method. We compare BE with other screening methods, namely Sure Independence Screening (SIS), Iterative Sure Independence Screening (ISIS), and Forward Regression (FR), in simulation studies and real-data analyses to illustrate the selection consistency of the proposed BE method. Our numerical analysis reveals that BE with EBIC can identify all important variables with high coverage probability, a low false discovery rate, and a very good model size when the signal-to-noise ratio is high.
Foli, Sophia Korkor, "Backward Elimination Algorithm for High Dimensional Variable Screening" (2018). ETD Collection for University of Texas, El Paso. AAI10840281.