
Classical Analysis over Credit Allotment Scheme using Classification under Machine Learning Domain

International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Year of Publication: 2018
Authors:
Satyam S. Sundaram, Pradeep Pant
DOI: 10.5120/ijca2018916864

Satyam S. Sundaram and Pradeep Pant. Classical Analysis over Credit Allotment Scheme using Classification under Machine Learning Domain. International Journal of Computer Applications 180(32):35-41, April 2018.

BibTeX

@article{10.5120/ijca2018916864,
	author = {Satyam S. Sundaram and Pradeep Pant},
	title = {Classical Analysis over Credit Allotment Scheme using Classification under Machine Learning Domain},
	journal = {International Journal of Computer Applications},
	issue_date = {April 2018},
	volume = {180},
	number = {32},
	month = {Apr},
	year = {2018},
	issn = {0975-8887},
	pages = {35-41},
	numpages = {7},
	url = {http://www.ijcaonline.org/archives/volume180/number32/29253-2018916864},
	doi = {10.5120/ijca2018916864},
	publisher = {Foundation of Computer Science (FCS), NY, USA},
	address = {New York, USA}
}

Abstract

As the amount of data grows at a tremendous rate, intelligent analysis becomes essential. Machine learning optimizes a performance criterion using examples drawn from present and past situations, and learning plays a vital role in making predictions from the properties of a data set. Learning is applied to a training data set, and the collected data may contain irrelevant features that do not contribute to learning and should be removed from the process. We also ensure that the selected data set suits the purpose of predicting future events and unseen samples. When dealing with classification problems in machine learning, observations relevant to the problem statement must be drawn from a disjoint set of training data. Mining information from data encompasses classification, clustering, and related methodologies as its subsets. This paper presents a descriptive procedure that compares various classification schemes under a single roof and identifies the best scorer in terms of accuracy for predicting credit allotment to customers. The examined schemes can filter various data sets to support decision making in both binary and multi-valued classification, and the paper ranks one scheme over another to determine the best choice in terms of performance measures.
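The comparison the abstract describes, training several classifiers on the same data and ranking them by accuracy, can be sketched as follows. This is a minimal illustration, not the paper's actual experiment: the two classifiers (a majority-class baseline and a 1-nearest-neighbour rule), the toy credit-approval features (income, existing-debt ratio), and all data values are assumptions made for the sketch.

```python
# Toy comparison of two classification schemes on a synthetic binary
# credit-approval data set; all features and labels are illustrative.

def majority_baseline(train_y):
    """Return a predictor that always outputs the most frequent training label."""
    label = max(set(train_y), key=train_y.count)
    return lambda x: label

def one_nn(train_X, train_y):
    """Return a 1-nearest-neighbour predictor using squared Euclidean distance."""
    def predict(x):
        dists = [sum((a - b) ** 2 for a, b in zip(x, row)) for row in train_X]
        return train_y[dists.index(min(dists))]
    return predict

def accuracy(predict, X, y):
    """Fraction of test samples the predictor labels correctly."""
    return sum(predict(x) == t for x, t in zip(X, y)) / len(y)

# Hypothetical training data: (income, debt ratio) -> approve (1) / reject (0)
train_X = [(8.0, 0.10), (6.5, 0.20), (7.2, 0.15), (9.0, 0.05),
           (2.0, 0.80), (1.5, 0.90), (3.0, 0.70)]
train_y = [1, 1, 1, 1, 0, 0, 0]
test_X  = [(7.0, 0.10), (2.5, 0.85)]
test_y  = [1, 0]

# Score every scheme on the same held-out samples and rank by accuracy.
scores = {
    "majority": accuracy(majority_baseline(train_y), test_X, test_y),
    "1-NN":     accuracy(one_nn(train_X, train_y), test_X, test_y),
}
best = max(scores, key=scores.get)
print(scores, "best:", best)
```

The same loop extends to any number of schemes: each one only has to expose a predict function, and the dictionary of accuracies gives the ranking the paper draws its conclusions from.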


Keywords

Learning, Features, Classification