Volume 8 (2018): Issue 1 (January 2018)
Journal Details
License
Format
Journal
eISSN
2449-6499
First Published
30 Dec 2014
Publication Frequency
4 times per year
Languages
English
Open Access

Self-Assimilation for Solving Excessive Information Acquisition in Potential Learning

Published Online: 01 Nov 2017
Volume & Issue: Volume 8 (2018) - Issue 1 (January 2018)
Page range: 5 - 29
Received: 31 Mar 2017
Accepted: 19 Apr 2017

[1] R. Linsker, Self-organization in a perceptual network, Computer, vol. 21, no. 3, pp. 105–117, 1988. doi:10.1109/2.36

[2] R. Linsker, How to generate ordered maps by maximizing the mutual information between input and output signals, Neural Computation, vol. 1, no. 3, pp. 402–411, 1989. doi:10.1162/neco.1989.1.3.402

[3] R. Linsker, Local synaptic learning rules suffice to maximize mutual information in a linear network, Neural Computation, vol. 4, no. 5, pp. 691–702, 1992. doi:10.1162/neco.1992.4.5.691

[4] R. Linsker, Improved local learning rule for information maximization and related applications, Neural Networks, vol. 18, no. 3, pp. 261–265, 2005. doi:10.1016/j.neunet.2005.01.002

[5] G. Deco, W. Finnoff, and H. Zimmermann, Unsupervised mutual information criterion for elimination of overtraining in supervised multilayer networks, Neural Computation, vol. 7, no. 1, pp. 86–107, 1995. doi:10.1162/neco.1995.7.1.86

[6] G. Deco and D. Obradovic, An Information-Theoretic Approach to Neural Computing, Springer Science & Business Media, 2012.

[7] H. B. Barlow, Unsupervised learning, Neural Computation, vol. 1, no. 3, pp. 295–311, 1989. doi:10.1162/neco.1989.1.3.295

[8] H. B. Barlow, T. P. Kaushal, and G. J. Mitchison, Finding minimum entropy codes, Neural Computation, vol. 1, no. 3, pp. 412–423, 1989. doi:10.1162/neco.1989.1.3.412

[9] J. J. Atick, Could information theory provide an ecological theory of sensory processing?, Network: Computation in Neural Systems, vol. 3, no. 2, pp. 213–251, 1992. doi:10.1088/0954-898X_3_2_009

[10] Z. Nenadic, Information discriminant analysis: Feature extraction with an information-theoretic objective, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 8, pp. 1394–1407, 2007.

[11] J. C. Principe, D. Xu, and J. Fisher, Information theoretic learning, Unsupervised Adaptive Filtering, vol. 1, pp. 265–319, 2000.

[12] J. C. Principe, Information Theoretic Learning: Renyi's Entropy and Kernel Perspectives, Springer Science & Business Media, 2010. doi:10.1007/978-1-4419-1570-2

[13] K. Torkkola, Feature extraction by non-parametric mutual information maximization, Journal of Machine Learning Research, vol. 3, pp. 1415–1438, 2003.

[14] R. Kamimura, Simple and stable internal representation by potential mutual information maximization, in International Conference on Engineering Applications of Neural Networks, pp. 309–316, Springer, 2016. doi:10.1007/978-3-319-44188-7_23

[15] R. Kamimura, Self-organizing selective potentiality learning to detect important input neurons, in 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 1619–1626, IEEE, 2015.

[16] R. Kamimura, Collective interpretation and potential joint information maximization, in Intelligent Information Processing VIII: 9th IFIP TC 12 International Conference, IIP 2016, Melbourne, VIC, Australia, November 18-21, 2016, Proceedings, pp. 12–21, Springer, 2016. doi:10.1007/978-3-319-48390-0_2

[17] R. Kamimura, Repeated potentiality assimilation: Simplifying learning procedures by positive, independent and indirect operation for improving generalization and interpretation, in Proceedings of IJCNN-2016, Vancouver, 2016. doi:10.1109/IJCNN.2016.7727282

[18] R. Kamimura and T. Kamimura, Structural information and linguistic rule extraction, in Proceedings of ICONIP, pp. 720–726, 2000.

[19] R. Kamimura, T. Kamimura, and O. Uchida, Flexible feature discovery and structural information control, Connection Science, vol. 13, no. 4, pp. 323–347, 2001. doi:10.1080/09540090110108679

[20] R. Kamimura, Information-theoretic competitive learning with inverse Euclidean distance output units, Neural Processing Letters, vol. 18, no. 3, pp. 163–204, 2003. doi:10.1023/B:NEPL.0000011136.78760.22
