Open Access

Supposed Maximum Mutual Information for Improving Generalization and Interpretation of Multi-Layered Neural Networks

31 Dec 2018


The present paper aims to propose a new information-theoretic method for maximizing mutual information between inputs and outputs. The importance of mutual information in neural networks is well known, but maximizing it directly has proven difficult to implement in practice. As a consequence, mutual information has not been used extensively in neural networks, and its applicability has remained limited. To overcome this shortcoming, we simplify the procedure considerably by supposing that mutual information is already maximized before learning, or at least at the beginning of learning. The method was applied to three data sets (the crab, wholesale, and human resources data sets) and examined in terms of generalization performance and connection weights. The results showed that, by disentangling connection weights, maximizing mutual information made it possible to interpret the relations between inputs and outputs explicitly.
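For context, the mutual information between inputs X and outputs Y referred to here is the standard information-theoretic quantity below; the paper's specific "supposed maximization" scheme is described in the full text, not in this abstract.

I(X; Y) = H(Y) - H(Y \mid X) = \sum_{x} \sum_{y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}

Maximizing I(X; Y) thus amounts to making the outputs high-entropy overall (large H(Y)) while keeping each output strongly determined by its input (small H(Y | X)).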

Language:
English
Publication timeframe:
4 times per year
Journal subjects:
Computer Sciences, Databases and Data Mining, Artificial Intelligence