
Weight Optimize by Automatic Unsupervised Clustering using Computation Intelligence

International Journal of Computer Applications
© 2012 by IJCA Journal
Volume 50 - Number 21
Year of Publication: 2012
C. Lowongtrakool
N. Hiransakolwong

C. Lowongtrakool and N. Hiransakolwong. Article: Weight Optimize by Automatic Unsupervised Clustering using Computation Intelligence. International Journal of Computer Applications 50(21):37-41, July 2012.

@article{lowongtrakool2012,
	author = {C. Lowongtrakool and N. Hiransakolwong},
	title = {Article: Weight Optimize by Automatic Unsupervised Clustering using Computation Intelligence},
	journal = {International Journal of Computer Applications},
	year = {2012},
	volume = {50},
	number = {21},
	pages = {37-41},
	month = {July},
	note = {Full text available}
}


Several techniques are applied to unsupervised clustering data analysis. The input is a dataset without answer classes; in addition, the initial weights and the number of cluster groups must be defined. Among these three factors (unsupervised clustering, weights, and the number of clusters), the most important parameter is the choice of initial weights for the system. If the weights are well chosen at the starting point, the system can track and converge on answers more rapidly and precisely. Therefore, this paper proposes a method to optimize the system's weights by applying a computational intelligence technique to unsupervised clustering data analysis. The experiment starts by finding initial weight values, which are then processed using sample datasets from the UCI Machine Learning Repository: iris, balance, and wine. The results show that classification efficiency increases to 99.3%, 83.6%, and 47.0%, respectively, and that the initial number of clusters k is found automatically. Consequently, the method also reduces the number of candidate clusterings that must be examined to discover an approximate answer.
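The abstract describes unsupervised clustering in which the number of clusters k is discovered automatically rather than fixed in advance. The paper's computational-intelligence weight-optimization algorithm is not reproduced on this page, so the sketch below is only a rough, self-contained illustration of the general idea it builds on: run plain k-means for increasing k and stop via a crude elbow rule on the clustering error. All names, the synthetic data, and the 50% improvement threshold are illustrative assumptions, not taken from the paper.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: sample k initial centroids, then alternate assign/update."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster; keep the old
        # centroid if a cluster ends up empty.
        centroids = [
            tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

def sse(centroids, clusters):
    """Within-cluster sum of squared errors."""
    return sum(math.dist(p, c) ** 2
               for c, cl in zip(centroids, clusters) for p in cl)

def pick_k(points, k_max=5):
    """Crude elbow rule: stop when one more cluster improves SSE by < 50%."""
    prev = None
    for k in range(1, k_max + 1):
        cur = sse(*kmeans(points, k))
        if prev is not None and prev > 0 and (prev - cur) / prev < 0.5:
            return k - 1
        prev = cur
    return k_max

# Two tight, well-separated groups; the elbow rule should settle on k = 2.
data = [(0, 0), (0, 1), (1, 0), (1, 1),
        (10, 10), (10, 11), (11, 10), (11, 11)]
print(pick_k(data))  # 2
```

The paper reports instead optimizing the initial weights with computational intelligence so that both the cluster assignments and k are reached faster; the elbow rule above merely stands in for that automatic k-selection step.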

