An Information Bottleneck Approach for Multi-View Feature Selection
36 Pages · Posted: 27 Jan 2023
Abstract
Feature selection has been studied extensively over the past few decades. Among the many approaches, information-theoretic feature selection methods have attracted considerable attention for their interpretability and strong performance. From an information-theoretic perspective, the golden rule of feature selection is to maximize the mutual information I(Xs; Y) between the selected feature subset Xs and the class labels Y. Despite its simplicity, explicitly optimizing this objective is non-trivial. In this work, we propose a novel global, neural network-based feature selection framework built on the information bottleneck (IB) principle and establish its connection to the rule of maximizing I(Xs; Y). Using the matrix-based Rényi's α-order entropy functional, our framework enjoys a simple and tractable objective that requires neither variational approximation nor distributional assumptions. We further extend the framework to multi-view scenarios and validate it on two large-scale, high-dimensional real-world biomedical applications. Comprehensive experimental results demonstrate the superior performance of our framework, not only in classification accuracy but also in interpretability within and across views, indicating that the proposed framework is trustworthy.
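For orientation, the quantity I(Xs; Y) above can be estimated directly from samples with the matrix-based Rényi's α-order entropy functional mentioned in the abstract. The sketch below is a minimal NumPy illustration, not the authors' implementation: the RBF kernel, the kernel width sigma, and the default α = 1.01 (chosen to approximate Shannon entropy) are assumptions. It computes S_α from the eigenvalues of a trace-normalized Gram matrix and combines marginal and joint entropies to obtain a mutual information estimate without variational approximation or distributional assumptions.

```python
import numpy as np

def gram_matrix(x, sigma=1.0):
    """Trace-normalized RBF Gram matrix of the samples in x (shape: n x d)."""
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=1.01):
    """Matrix-based Renyi alpha-order entropy S_alpha(A) from eigenvalues of A."""
    eigvals = np.clip(np.linalg.eigvalsh(A), 0.0, None)
    return (1.0 / (1.0 - alpha)) * np.log2(np.sum(eigvals ** alpha))

def joint_entropy(A, B, alpha=1.01):
    """Joint entropy via the trace-normalized Hadamard product of Gram matrices."""
    AB = A * B
    return renyi_entropy(AB / np.trace(AB), alpha)

def mutual_information(x, y, alpha=1.01, sigma=1.0):
    """I(X; Y) = S_alpha(A) + S_alpha(B) - S_alpha(A, B)."""
    A, B = gram_matrix(x, sigma), gram_matrix(y, sigma)
    return renyi_entropy(A, alpha) + renyi_entropy(B, alpha) - joint_entropy(A, B, alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 5))            # stand-in for a selected feature subset Xs
    y = np.eye(2)[rng.integers(0, 2, 100)]   # one-hot class labels Y (hypothetical data)
    print(mutual_information(x, y))
```

Because the estimator needs only Gram matrices over mini-batches, it can serve as a differentiable training signal in a neural feature selection objective; the class labels are passed in one-hot form so the same kernel applies to both variables.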
Keywords: Feature selection, neural network, information bottleneck, interpretability, multi-view learning