A Bootstrap Based Neyman-Pearson Test for Identifying Variable Importance

Gregory Ditzler, Robi Polikar, Gail Rosen

    Research output: Contribution to journal › Article

    17 Scopus citations

    Abstract

    Selection of the most informative features, those that lead to a small loss on future data, is arguably one of the most important steps in classification, data analysis, and model selection. Several feature selection (FS) algorithms are available; however, due to the noise present in any data set, FS algorithms are typically accompanied by an appropriate cross-validation scheme. In this brief, we propose a statistical hypothesis test derived from the Neyman-Pearson lemma for determining whether a feature is statistically relevant. The proposed approach can be applied as a wrapper to any FS algorithm, regardless of the FS criterion used by that algorithm, to determine whether a feature belongs in the relevant set. Perhaps more importantly, this procedure efficiently determines the number of relevant features given an initial starting point. We provide freely available software implementations of the proposed methodology.
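    The abstract does not spell out the test's mechanics. As a hedged illustration of the general idea it describes, bootstrapping an arbitrary base feature selector and flagging features whose selection counts exceed a Neyman-Pearson style binomial threshold, a minimal Python sketch follows. The function names (bootstrap_np_test, corr_fs), the choice of k/K as the null selection probability, and the NumPy/SciPy calls are assumptions made for illustration; this is not the authors' released software.

    import numpy as np
    from scipy.stats import binom


    def bootstrap_np_test(X, y, base_fs, k, n_bootstraps=100, alpha=0.01, rng=None):
        """Return a boolean mask of features judged relevant.

        X : (n_samples, K) data matrix
        y : (n_samples,) labels
        base_fs : callable(X, y, k) -> indices of the k selected features
        k : number of features the base selector returns per run
        """
        rng = np.random.default_rng(rng)
        n_samples, K = X.shape
        counts = np.zeros(K, dtype=int)

        # Run the base feature selector on bootstrap resamples of the data
        # and count how often each feature is selected.
        for _ in range(n_bootstraps):
            idx = rng.integers(0, n_samples, size=n_samples)
            selected = base_fs(X[idx], y[idx], k)
            counts[selected] += 1

        # Under the null hypothesis that a feature is irrelevant, it is selected
        # with probability p0 = k / K per run, so its count is Binomial(n, p0).
        # The critical value is chosen to hold the false-positive rate at alpha,
        # in the spirit of a Neyman-Pearson test.
        p0 = k / K
        crit = binom.ppf(1.0 - alpha, n_bootstraps, p0)
        return counts > crit


    if __name__ == "__main__":
        # Toy usage: a correlation-based selector on synthetic data in which
        # only the first three of twenty features carry signal.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 20))
        y = (X[:, 0] + X[:, 1] - X[:, 2] + 0.5 * rng.normal(size=500) > 0).astype(int)

        def corr_fs(X, y, k):
            scores = np.abs(np.corrcoef(X.T, y)[-1, :-1])
            return np.argsort(scores)[-k:]

        relevant = bootstrap_np_test(X, y, corr_fs, k=5, n_bootstraps=100)
        print("Features flagged as relevant:", np.flatnonzero(relevant))

    The wrapper only needs the base selector's output indices, which mirrors the abstract's claim that the test is agnostic to the FS criterion; the binomial threshold is one plausible way to realize the bootstrap-based hypothesis test.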

    Original language: English (US)
    Article number: 6823119
    Pages (from-to): 880-886
    Number of pages: 7
    Journal: IEEE Transactions on Neural Networks and Learning Systems
    Volume: 26
    Issue number: 4
    DOIs
    State: Published - Apr 1 2015

    All Science Journal Classification (ASJC) codes

    • Software
    • Computer Science Applications
    • Computer Networks and Communications
    • Artificial Intelligence

