Research Article

Growing Neural Networks using Soft Competitive Learning

by Vikas Chaudhary, Dr. Anil K. Ahlawat, Dr. R.S. Bhatia
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 21 - Number 3
Year of Publication: 2011
Authors: Vikas Chaudhary, Dr. Anil K. Ahlawat, Dr. R.S. Bhatia
DOI: 10.5120/2495-3372

Vikas Chaudhary, Dr. Anil K. Ahlawat, Dr. R.S. Bhatia. Growing Neural Networks using Soft Competitive Learning. International Journal of Computer Applications 21, 3 (May 2011), 1-6. DOI=10.5120/2495-3372

@article{ 10.5120/2495-3372,
author = { Vikas Chaudhary and Anil K. Ahlawat and R.S. Bhatia },
title = { Growing Neural Networks using Soft Competitive Learning },
journal = { International Journal of Computer Applications },
issue_date = { May 2011 },
volume = { 21 },
number = { 3 },
month = { May },
year = { 2011 },
issn = { 0975-8887 },
pages = { 1-6 },
numpages = {6},
url = { https://ijcaonline.org/archives/volume21/number3/2495-3372/ },
doi = { 10.5120/2495-3372 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Vikas Chaudhary
%A Dr. Anil K. Ahlawat
%A Dr. R.S. Bhatia
%T Growing Neural Networks using Soft Competitive Learning
%J International Journal of Computer Applications
%@ 0975-8887
%V 21
%N 3
%P 1-6
%D 2011
%I Foundation of Computer Science (FCS), NY, USA
Abstract

This paper gives an overview of some classical Growing Neural Networks (GNN) that use soft competitive learning. In soft competitive learning, each input signal is processed by adapting not only the winning neuron but also some of the other neurons of the network. A GNN is also called an ANN with incremental learning. The mapping capability of an artificial neural network (ANN) depends on the number of layers and the number of neurons in its hidden layers. There is no formal way of computing a suitable network structure; it is usually selected by trial and error, which is a time-consuming process. Basically, two mechanisms may modify the structure of the network: growth and pruning. In this paper, competitive learning is introduced first; secondly, the SOM topology and the limitations of the SOM are illustrated. Thirdly, a class of classical GNNs with soft competitive learning is reviewed, including the Neural Gas Network (NGN), Growing Neural Gas (GNG), Self-Organizing Surfaces (SOS), Incremental Grid Growing (IGG), Evolving Self-Organizing Maps (ESOM), the Growing Hierarchical Self-Organizing Map (GHSOM), and Growing Cell Structures (GCS).
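
As an illustration of the soft competitive adaptation described above, the following minimal NumPy sketch performs one Neural Gas style update step, in which every unit (not only the winner) moves toward the input with a strength that decays with its distance rank. The parameter names epsilon and lam and their values are assumptions for illustration and are not taken from the paper.

import numpy as np

# Minimal sketch of one soft competitive learning step in the Neural Gas style:
# every unit is adapted toward the input, with a strength that decays with its
# distance rank. Parameter names and values are illustrative assumptions.

rng = np.random.default_rng(0)
weights = rng.random((10, 2))   # 10 reference vectors in a 2-D input space

def neural_gas_step(weights, x, epsilon=0.1, lam=2.0):
    dists = np.linalg.norm(weights - x, axis=1)   # distance of every unit to the input x
    ranks = np.argsort(np.argsort(dists))         # rank 0 = winner, 1 = runner-up, ...
    h = np.exp(-ranks / lam)                      # soft neighborhood: all units adapt, the winner most
    return weights + epsilon * h[:, None] * (x - weights)

for _ in range(1000):                             # adapt the codebook to uniformly random inputs
    weights = neural_gas_step(weights, rng.random(2))

In a growing variant such as GNG, a step of this kind would be combined with the growth and pruning mechanisms mentioned above, periodically inserting a new unit near the unit with the largest accumulated error and removing units whose topological edges have aged out.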

References
  1. Xinjian Qiang, Guojian Cheng, Zheng Wang. An Overview of Some Classical Growing Neural Networks and New Developments. 2nd International Conference on Education Technology and Computer (ICETC), IEEE, 2010.
  2. Guojian Cheng, Ziqi Song, Jinquan Yang, Rongfang Gao. On Growing Self-Organizing Neural Networks without Fixed Dimensionality. International Conference CIMCA-IAWTIC'06, IEEE, 2006.
  3. B. Fritzke. Growing cell structures - a self-organizing network for unsupervised and supervised learning. Neural Networks, 7(9):1441-1460, 1994.
  4. Guojian Cheng, Tianshi Liu, Jiaxin Han, Zheng Wang. Towards Growing Self-Organizing Neural Networks with Fixed networks. Press of Xi'an Jiaotong University, October 2008, ISBN 978-7-5605-2979-0 (in Chinese).
  5. B. Fritzke. Some competitive learning methods. http://www.neuroinformatik.ruhr-uni-bochum.de/ini/VDM/research/gsn/JavaPaper/.
  6. Self-Organizing Neural Networks without Fixed Dimensionality. Proceedings of CIMCA 2006 (International Conference on Computational Intelligence for Modeling, Control and Automation), IEEE Computer Society Press.
  7. T. Martinetz, K. Schulten. "Neural-Gas" network learns topologies. In Proc. International Conference on Artificial Neural Networks (Espoo, Finland), volume I, pages 397-402, Amsterdam, Netherlands, 1991.
  8. T. Martinetz, K. Schulten. Topology representing networks. Neural Networks, 7(2), 1994.
  9. T. Martinetz. Competitive Hebbian learning rule forms perfectly topology preserving maps. Int. Conf. on ANN, pages 427–434, London, UK, 1993, Springer.
  10. A.K. Qin, P.N. Suganthan. Robust growing neural gas algorithm with application in cluster analysis. Neural Networks, vol. 17, pp. 1135-1148, 2004.
  11. Y. Prudent, A. Ennaji. A new learning algorithm for incremental self-organizing maps, Verleysen M. (Eds), In Proceedings of the European Symposium on Artificial Neural Networks (ESANN), 2005.
  12. S. Furao, O. Hasegawa, A Self-organized Growing Network for On-line Unsupervised Learning, IEEE International Joint Conference on Neural Networks (IJCNN2004), CD-ROM ISBN 0-7803-8360-5, Vol.1, pp.11-16 (2004).
  13. K. Doherty, R. Adams, N. Davey. TreeGNG - Hierarchical Topological Clustering. Verleysen M. (Eds), In Proceedings of the European Symposium on Artificial Neural Networks (ESANN), 2005.
  14. A. Zell, H. Bayer, and H. Bauknecht. Similarity analysis of molecules with self-organizing surfaces: an extension of the self-organizing map. In Proc. ICNN'94, International Conference on Neural Networks, pages 719-724, Piscataway, 1994, IEEE Computer Society.
  15. J. Blackmore. Visualizing high-dimensional structure with the incremental grid growing neural network. Technical Report AI 95-238, University of Texas, Austin, August 1, 1995.
  16. D. Deng and N. Kasabov. ESOM: An algorithm to evolve self-organizing maps from on-line data streams. In Proc. of the International Joint Conference on Neural Networks (IJCNN 2000), volume VI, pages 3-8, Como, Italy, July 24-27, 2000, IEEE Computer Society.
  17. M. Dittenbach, D. Merkl and A. Rauber. The growing hierarchical self-organizing map. In Proc. of the International Joint Conference on Neural Networks (IJCNN 2000), volume VI, pages 15-19, Como, Italy, July 24-27, 2000, IEEE Computer Society.
  18. Guojian Cheng, Tianshi Liu, Jiaxin Han, and Zheng Wang, Towards Growing Self-Organizing Neural Networks with Fixed Dimensionality, World Academy of Science, Engineering and Technology 22, 2006.
  19. V. Hodge, J. Austin. Hierarchical growing cell structures: TreeGCS. In IEEE TKDE Special Issue on Connectionist Models for Learning in Structured Domains.
  20. N. Vlassis, A. Dimopoulos, G. Papakonstantinou. The probabilistic growing cell structures algorithm. Lecture Notes in Computer Science, 1327:649, 1997.
  21. J. Pakkanen, J. Iivarinen, E. Oja. The evolving tree - a novel self-organizing network for data analysis. Neural Processing Letters, 20(3):199-211, 2004.
  22. Y. Wang, C. Yang, K. Mathee, G. Narasimhan. Clustering using Adaptive Self-Organizing Maps (ASOM) and Applications. Proceedings of International Workshop on Bioinformatics Research and Applications, pp. 944-951, Atlanta, Georgia, May 2005.
  23. R. Freeman and H. Yin, "Adaptive Topological Tree Structure (ATSS) for document organization and visualization," Neural Networks, Vol. 17, pp. 1255-1271, 2004.
Index Terms

Computer Science
Information Sciences

Keywords

Growing Neural Networks (GNN), Soft Competitive Learning, Self-Organizing Maps (SOM)