
Artificial Neural Network

IJCA Proceedings on National Conference on Recent Trends in Computing
© 2012 by IJCA Journal
NCRTC - Number 2
Year of Publication: 2012
Authors:
Kshirsagar A. P.
Rathod M. N.

Kshirsagar A. P. and Rathod M. N. Article: Artificial Neural Network. IJCA Proceedings on National Conference on Recent Trends in Computing NCRTC(2):12-16, May 2012. Full text available.

BibTeX

@article{key:article,
	author = {Kshirsagar A. P. and Rathod M. N.},
	title = {Article: Artificial Neural Network},
	journal = {IJCA Proceedings on National Conference on Recent Trends in Computing},
	year = {2012},
	volume = {NCRTC},
	number = {2},
	pages = {12-16},
	month = {May},
	note = {Full text available}
}

Abstract

A neural network is a powerful data-modeling tool that can capture and represent complex input/output relationships. Imagine a machine that combines the strengths of computers and humans; it would be a remarkable thing indeed. A neural network typically involves a large number of processing elements operating in parallel, each with its own small sphere of knowledge and access to data in its local memory. The computing world has much to gain from neural networks. Their ability to learn by example makes them flexible and powerful, and their parallel architecture gives them the fast response and computation times that suit real-time systems. With the right implementation, neural networks lend themselves naturally to online learning and large-dataset applications. If the 21st century is to be the age of intelligent machines, then neural networks will become an integral part of life. This paper examines the many aspects of neural networks, their past, present and future, and explores what they hold in store for the 'Generation Next'.
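The abstract's central claim is that neural networks "learn by example" from input/output pairs rather than from hand-coded rules. As an illustration (not taken from the paper), the following is a minimal sketch of a tiny feedforward network trained by backpropagation to learn the XOR mapping; the hidden-layer size, learning rate, and epoch count are illustrative assumptions.

# Minimal sketch (illustrative, not from the paper): a 2-4-1 feedforward
# network with sigmoid units, trained by backpropagation on the XOR examples.
import numpy as np

rng = np.random.default_rng(0)

# Training examples: XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases for one hidden layer of 4 units and one output unit.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate (an assumption)
for epoch in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)    # hidden activations
    out = sigmoid(h @ W2 + b2)  # network outputs

    # Backward pass: gradients of the squared-error loss.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 3))  # typically approaches [0, 1, 1, 0] after training

No rule for XOR is programmed in; repeated exposure to the four examples alone shapes the weights, which is the "learning by example" the abstract refers to.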
