Research Article

Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC)

by Gaurang Panchal, Amit Ganatra, Y.P.Kosta, Devyani Panchal
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 1 - Number 5
Year of Publication: 2010
DOI: 10.5120/126-242

Gaurang Panchal, Amit Ganatra, Y.P.Kosta, Devyani Panchal. Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC). International Journal of Computer Applications 1, 5 (February 2010), 41-44. DOI=10.5120/126-242

@article{10.5120/126-242,
  author     = {Gaurang Panchal and Amit Ganatra and Y. P. Kosta and Devyani Panchal},
  title      = {Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC)},
  journal    = {International Journal of Computer Applications},
  issue_date = {February 2010},
  volume     = {1},
  number     = {5},
  month      = {February},
  year       = {2010},
  issn       = {0975-8887},
  pages      = {41-44},
  numpages   = {4},
  url        = {https://ijcaonline.org/archives/volume1/number5/126-242/},
  doi        = {10.5120/126-242},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Gaurang Panchal
%A Amit Ganatra
%A Y.P.Kosta
%A Devyani Panchal
%T Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC)
%J International Journal of Computer Applications
%@ 0975-8887
%V 1
%N 5
%P 41-44
%D 2010
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Model selection is of considerable importance for achieving a high level of generalization in supervised learning. Neural networks are widely used in engineering applications because of their good generalization properties. We propose an ensemble neural network algorithm based on the Akaike information criterion (AIC). Ecologists have long relied on hypothesis testing to include or exclude variables in models, although the conclusions often depend on the approach used. The advent of methods based on information theory, also known as information-theoretic approaches, has changed the way we look at model selection. The Akaike information criterion (AIC) has been used successfully for model selection. Because of a neural network's strong nonlinearity, it is not easy to decide its optimal size. We discuss problems with widely used information criteria and propose a model selection method.
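
To make the selection idea concrete, the sketch below (a minimal illustration, not the authors' experimental setup) scores one-hidden-layer networks of increasing size with AIC and keeps the smallest-AIC candidate. It assumes scikit-learn's MLPRegressor, a synthetic regression task, and the least-squares form AIC = n ln(RSS/n) + 2k, where k counts the network's free weights and biases.

# Minimal sketch: rank candidate hidden-layer sizes of a one-hidden-layer MLP
# by AIC and keep the smallest-AIC model. Uses scikit-learn's MLPRegressor and
# the least-squares AIC, n*ln(RSS/n) + 2k, assuming Gaussian errors; the data
# set here is synthetic and purely illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))                       # toy inputs
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.1, 200)   # toy target

def aic_of(model, X, y):
    n = len(y)
    rss = float(np.sum((y - model.predict(X)) ** 2))             # residual sum of squares
    # k = all weights plus all biases of the fitted network
    k = sum(w.size for w in model.coefs_) + sum(b.size for b in model.intercepts_)
    return n * np.log(rss / n) + 2 * k

best = None
for hidden in (2, 4, 8, 16, 32):                                 # candidate architectures
    net = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=5000,
                       random_state=0).fit(X, y)
    score = aic_of(net, X, y)
    print(f"hidden neurons = {hidden:2d}  AIC = {score:8.1f}")
    if best is None or score < best[0]:
        best = (score, hidden)

print("selected architecture: one hidden layer,", best[1], "neurons")

Larger networks lower the residual error but pay a 2k penalty for their extra parameters, so the criterion trades fit against network size rather than rewarding complexity alone.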

Index Terms

Computer Science
Information Sciences

Keywords

Neural Network, Hidden Neurons, Akaike's Information Criterion (AIC), Correct Classification Rate (CRR)