
Dendrite Morphological Neurons Trained by

Stochastic Gradient Descent


Erik Zamora1 & Humberto Sossa2
Instituto Politécnico Nacional
UPIITA1, CIC2, Mexico City
ezamorag@ipn.mx

Grants:
SIP-IPN [20160945, 20161116]
CONACYT [155014, 65]

Goal

To develop a theoretical and practical framework


for morphological neurons

Classical Perceptrons

Monolayer

Multilayer

Dendrite Morphological Neuron (DMN)

Decision Boundary for DMN
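A minimal sketch of the DMN forward pass (not the authors' code; see the GitHub repository on the architecture slide). It assumes the hyperbox dendrite model used in the SGD formulation [11]: each dendrite holds box limits (w_min, w_max) and responds with the minimum signed distance to the box faces, which is positive exactly when the input lies inside the box; a class score is the maximum response over that class's dendrites. All function names here are illustrative.

```python
import numpy as np

def dendrite_activations(x, w_min, w_max):
    """x: (n,) input; w_min, w_max: (K, n) box limits for K dendrites.
    Returns d_k(x) = min_i min(x_i - w_min_ki, w_max_ki - x_i)."""
    low = x[None, :] - w_min           # distances to lower faces, (K, n)
    high = w_max - x[None, :]          # distances to upper faces, (K, n)
    return np.minimum(low, high).min(axis=1)   # (K,)

def dmn_scores(x, boxes_per_class):
    """boxes_per_class: one (w_min, w_max) pair of arrays per class.
    The class score is the max response among its dendrites."""
    return np.array([dendrite_activations(x, wmin, wmax).max()
                     for wmin, wmax in boxes_per_class])

# Toy example: two classes, one hyperbox dendrite each.
boxes = [(np.array([[0., 0.]]), np.array([[1., 1.]])),   # class 0: unit box
         (np.array([[2., 2.]]), np.array([[3., 3.]]))]   # class 1
x = np.array([0.5, 0.5])
print(dmn_scores(x, boxes).argmax())   # -> 0 (x lies inside the class-0 box)
```

The decision boundary is piecewise axis-aligned: it follows the faces of the winning hyperboxes, which is why a DMN can carve closed regions with a single layer.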

Motivation

Elimination method (Ritter et al., 2003)

Motivation

Sossa & Guevara (2014)

DMN-SGD: Architecture

https://github.com/ezamorag/DMN_by_SGD

Cost Function
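A hedged sketch of the cost attached to the network output: the SGD formulation [11] pairs the per-class dendrite scores with a smooth classification loss so gradients exist; softmax cross-entropy is assumed here as that loss, and the names are illustrative.

```python
import numpy as np

def softmax_cross_entropy(s, y):
    """s: (C,) per-class dendrite scores; y: true class index.
    Returns -log softmax(s)[y], computed stably."""
    s = s - s.max()                        # shift for numerical stability
    log_probs = s - np.log(np.exp(s).sum())
    return -log_probs[y]

scores = np.array([2.0, -1.0, 0.5])        # e.g. output of dmn_scores(...)
print(round(softmax_cross_entropy(scores, 0), 4))
```

Because the loss is differentiable in the scores, all the non-smoothness of the network is confined to the min/max operations inside the dendrites.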

DMN-SGD: Initialization and Gradient

Initialization Methods

Gradient
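An illustrative single SGD step under the same hyperbox assumption (a sketch, not the authors' implementation). Since min and max are piecewise linear, the (sub)gradient of the loss routes only to the winning dendrite of each class and, within it, to the single active box face; with the lower face active the score is x_i - w_min (derivative -1 in w_min), with the upper face active it is w_max - x_i (derivative +1 in w_max).

```python
import numpy as np

def sgd_step(x, y, w_min, w_max, lr=0.1):
    """One-dendrite-per-class DMN; w_min, w_max: (C, n) box limits.
    Performs one in-place SGD update on the box limits."""
    low, high = x[None, :] - w_min, w_max - x[None, :]
    faces = np.minimum(low, high)          # (C, n) face distances
    scores = faces.min(axis=1)             # per-class dendrite response
    # Softmax cross-entropy gradient w.r.t. the scores
    p = np.exp(scores - scores.max()); p /= p.sum()
    g = p.copy(); g[y] -= 1.0              # dL/dscores
    # Route each class's gradient to its single active face
    i_star = faces.argmin(axis=1)          # winning coordinate per class
    for c in range(len(scores)):
        i = i_star[c]
        if low[c, i] <= high[c, i]:        # lower face active: ds/dw_min = -1
            w_min[c, i] += lr * g[c]
        else:                              # upper face active: ds/dw_max = +1
            w_max[c, i] -= lr * g[c]
    return w_min, w_max
```

The subgradient routing mirrors how max-pooling is backpropagated in convolutional networks: only the argmin/argmax coordinates receive updates on each step.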

DMN-SGD: Comparison with Other DMN Training Methods

DMN-SGD: Comparison with Other Machine Learning Methods

DMN vs MLP

Conclusion and Future Work

Better classification accuracy than training methods based solely on heuristics
These heuristics remain useful for initializing dendrite parameters before optimizing them
DMNs trained by SGD can compete with popular machine learning algorithms such as SVM, MLP and RBFN

Advantages over MLP:

Smarter parameter initialization
Similar performance without a layer for feature extraction
Solves some problems much better

References
[1] Ritter, G. X., Sussner, P., 1996. An introduction to morphological neural networks. In: Proceedings of the 13th International Conference on Pattern Recognition, Vol. 4, pp. 709-717.
[2] Sussner, P., 1998. Morphological perceptron learning. In: Intelligent Control (ISIC), 1998, held jointly with IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA) and Intelligent Systems and Semiotics (ISAS), Proceedings, pp. 477-482.
[3] Pessoa, L. F., Maragos, P., 2000. Neural networks with hybrid morphological/rank/linear nodes: a unifying framework with applications to handwritten character recognition. Pattern Recognition 33 (6), 945-960.
[4] Ritter, G. X., Urcid, G., 2003. Lattice algebra approach to single neuron computation. IEEE Transactions on Neural Networks 14 (2), 282-295.
[5] Barmpoutis, A., Ritter, G. X., 2006. Orthonormal basis lattice neural networks. In: IEEE International Conference on Fuzzy Systems, pp. 331-336.
[6] Ritter, G. X., Urcid, G., Juan-Carlos, V. N., 2014. Two lattice metrics dendritic computing for pattern recognition. In: IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), pp. 45-52.
[7] Sussner, P., Esmi, E. L., 2011. Morphological perceptrons with competitive learning: lattice-theoretical framework and constructive learning algorithm. Information Sciences 181 (10), 1929-1950, special issue on Information Engineering Applications Based on Lattices.
[8] Sossa, H., Guevara, E., 2014. Efficient training for dendrite morphological neural networks. Neurocomputing 131, 132-142.
[9] Ritter, G. X., Urcid, G., 2007. Learning in lattice neural networks that employ dendritic computing. In: Computational Intelligence Based on Lattice Theory. Springer, Berlin, Heidelberg, pp. 25-44.
[10] Zamora, E., Sossa, H., 2017. Regularized divide and conquer training for dendrite morphological neurons. In: Mecatrónica y Robótica de Servicio: Teoría y Aplicaciones.
[11] Zamora, E., Sossa, H., 2016. Dendrite morphological neurons trained by stochastic gradient descent. In: IEEE Symposium Series on Computational Intelligence, Greece, December 2016.
[12] Arce, F., Zamora, E., Barron, R., Sossa, H., 2016. Dendrite morphological neurons trained by differential evolution. In: IEEE Symposium Series on Computational Intelligence, Greece, December 2016.
[13] Zamora, E., Sossa, H. Dendrite morphological neurons trained by stochastic gradient descent (submitted to Neurocomputing).

Thanks

Comments or Questions?
Erik Zamora

ezamora1981@gmail.com
