Research Article

Coupled Kernel Ensemble Regression

by Dickson Keddy Wornyo, Elias Nii Noi Ocquaye, Bright Bediako-Kyeremeh
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 181 - Number 34
Year of Publication: 2018
Authors: Dickson Keddy Wornyo, Elias Nii Noi Ocquaye, Bright Bediako-Kyeremeh
10.5120/ijca2018918278

Dickson Keddy Wornyo, Elias Nii Noi Ocquaye, Bright Bediako-Kyeremeh. Coupled Kernel Ensemble Regression. International Journal of Computer Applications 181, 34 (Dec 2018), 1-8. DOI=10.5120/ijca2018918278

@article{ 10.5120/ijca2018918278,
author = { Dickson Keddy Wornyo, Elias Nii Noi Ocquaye, Bright Bediako-Kyeremeh },
title = { Coupled Kernel Ensemble Regression },
journal = { International Journal of Computer Applications },
issue_date = { Dec 2018 },
volume = { 181 },
number = { 34 },
month = { Dec },
year = { 2018 },
issn = { 0975-8887 },
pages = { 1-8 },
numpages = { 8 },
url = { https://ijcaonline.org/archives/volume181/number34/30207-2018918278/ },
doi = { 10.5120/ijca2018918278 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Journal Article
%A Dickson Keddy Wornyo
%A Elias Nii Noi Ocquaye
%A Bright Bediako-Kyeremeh
%T Coupled Kernel Ensemble Regression
%J International Journal of Computer Applications
%@ 0975-8887
%V 181
%N 34
%P 1-8
%D 2018
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In this paper, the kernel ensemble regression scheme is enhanced by absorbing multiple kernel regressors into a unified ensemble regression framework simultaneously, coupled by minimizing the total loss of the ensembles in a reproducing kernel Hilbert space. In this way, a kernel regressor that fits the data more accurately automatically receives a larger weight, which leads to better overall ensemble performance. Comparing the proposed method with several single and ensemble regression methods such as Gradient Boosting, Support Vector Regression, Ridge Regression, Tree Regression and Random Forest, the experimental results indicate that the proposed model achieves the highest performance on both regression and classification tasks over several UCI datasets.
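As a rough illustration of the weighting idea described in the abstract — a kernel regressor that fits the data more accurately receives a larger ensemble weight — the sketch below combines several scikit-learn kernel ridge regressors with weights set inversely proportional to each one's training loss. This is a simplified stand-in, not the paper's coupled RKHS formulation: the particular kernels, their hyperparameters, and the inverse-loss weighting rule are all illustrative assumptions.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Synthetic 1-D regression problem: noisy sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

# Base regressors with different kernels (choices are illustrative).
models = [
    KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-2),
    KernelRidge(kernel="polynomial", degree=3, alpha=1e-2),
    KernelRidge(kernel="laplacian", gamma=0.5, alpha=1e-2),
]

# Fit each regressor and record its mean squared training loss.
losses = []
for m in models:
    m.fit(X, y)
    losses.append(np.mean((m.predict(X) - y) ** 2))

# Weight each regressor inversely to its loss, normalized to sum to 1,
# so the better-fitting kernel dominates the ensemble prediction.
w = 1.0 / np.array(losses)
w /= w.sum()

def ensemble_predict(X_new):
    preds = np.stack([m.predict(X_new) for m in models])  # (n_models, n)
    return w @ preds

X_test = np.linspace(-3, 3, 50).reshape(-1, 1)
mse = np.mean((ensemble_predict(X_test) - np.sin(X_test).ravel()) ** 2)
```

In the paper's formulation the weights and regressors are optimized jointly in the RKHS rather than set after fitting, but the effect sketched here is the same: accurate base learners end up with larger weights.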

References
  1. Dickson Keddy Wornyo, Xiang-Jun Shen, Yong Dong, Liangjun Wang, and Shu-Cheng Huang. World Wide Web, pages 1–18, 2018. Springer.
  2. Xiangyu Chang, Shao-Bo Lin, and Ding-Xuan Zhou. Distributed semi-supervised learning with kernel ridge regression. Journal of Machine Learning Research, 18(46):1–22, 2017.
  3. Luefeng Chen, Mengtian Zhou, Min Wu, Jinhua She, Zhentao Liu, Fangyan Dong, and Kaoru Hirota. Three-layer weighted fuzzy support vector regression for emotional intention understanding in human-robot interaction. IEEE Transactions on Fuzzy Systems, 2018.
  4. Kai Cheng and Zhenzhou Lu. Adaptive sparse polynomial chaos expansions for global sensitivity analysis based on support vector regression. Computers & Structures, 194:86–96, 2018.
  5. R Dennis Cook and Liliana Forzani. Big data and partial leastsquares prediction. Canadian Journal of Statistics, 46(1):62– 78, 2018.
  6. Kamalika Das and Ashok N Srivastava. Sparse inverse kernel gaussian process regression. Statistical Analysis and Data Mining: The ASA Data Science Journal, 6(3):205–220, 2013.
  7. Harris Drucker, Christopher JC Burges, Linda Kaufman, Alex J Smola, and Vladimir Vapnik. Support vector regression machines. In Advances in neural information processing systems, pages 155–161, 1997.
  8. Charles W Edmunds, Choo Hamilton, Keonhee Kim, Nicolas Andre, and Nicole Labbe. Rapid detection of ash and inorganics in bioenergy feedstocks using fourier transform infrared spectroscopy coupled with partial least-squares regression. Energy & Fuels, 31(6):6080–6088, 2017.
  9. Siddharth Hariharan, Siddhesh Tirodkar, Alok Porwal, Avik Bhattacharya, and Aurore Joly. Random forest-based prospectivity modelling of greenfield terrains using sparse deposit data: an example from the Tanami region, Western Australia. Natural Resources Research, 26(4):489–507, 2017.
  10. Md Al Mehedi Hasan, Mohammed Nasser, Biprodip Pal, and Shamim Ahmad. Support vector machine and random forest modeling for intrusion detection system (ids). Journal of Intelligent Learning Systems and Applications, 6(01):45, 2014.
  11. Justin Heinermann and Oliver Kramer. Precise wind power prediction with svm ensemble regression. In International Conference on Artificial Neural Networks, pages 797–804. Springer, 2014.
  12. Kristoffer H Hellton and Nils Lid Hjort. Fridge: Focused finetuning of ridge regression for personalized predictions. Statistics in medicine, 37(8):1290–1303, 2018.
  13. Masayuki Hirukawa and Artem Prokhorov. Consistent estimation of linear regression models using matched data. Journal of Econometrics, 203(2):344–358, 2018.
  14. Achin Jain, Francesco Smarra, Madhur Behl, and Rahul Mangharam. Data-driven model predictive control with regression treesan application to building energy management. ACM Transactions on Cyber-Physical Systems, 2(1):4, 2018.
  15. Aman Mohammad Kalteh. Monthly river flow forecasting using artificial neural network and support vector regression models coupled with wavelet transform. Computers & Geosciences, 54:1–8, 2013.
  16. Zhen Lei and Stan Z Li. Coupled spectral regression for matching heterogeneous faces. In Computer Vision and Pattern Recognition, 2009. CVPR 2009. IEEE Conference on, pages 1123–1128. IEEE, 2009.
  17. Yantao Li, Hailong Hu, Gang Zhou, and Shaojiang Deng. Sensor-based continuous authentication using cost-effective kernel ridge regression. IEEE Access, 2018.
  18. Jiajun Liu, Shuo Shang, Kai Zheng, and Ji-Rong Wen. Multiview ensemble learning for dementia diagnosis from neuroimaging: an artificial neural network approach. Neurocomputing, 195:112–116, 2016.
  19. Sujith Mangalathu, Jong-Su Jeon, and Reginald DesRoches. Critical uncertainty parameters influencing seismic performance of bridges using lasso regression. Earthquake Engineering & Structural Dynamics, 47(3):784–801, 2018.
  20. Santosh Singh Rathore and Sandeep Kumar. A decision tree regression based approach for the number of software faults prediction. ACM SIGSOFT Software Engineering Notes, 41(1):1–6, 2016.
  21. Celine B Santiago, Jing-Yao Guo, and Matthew S Sigman. Predictive and mechanistic multivariate linear regression models for reaction development. Chemical science, 9(9):2398–2412, 2018.
  22. Bernhard Schölkopf, Alexander J Smola, Francis Bach, et al. Learning with kernels: support vector machines, regularization, optimization, and beyond. MIT press, 2002.
  23. Ingo Steinwart, Don Hush, and Clint Scovel. An explicit description of the reproducing kernel hilbert spaces of gaussian rbf kernels. IEEE Transactions on Information Theory, 52(10):4635–4643, 2006.
  24. Shengzheng Wang, Baoxian Ji, Jiansen Zhao, Wei Liu, and Tie Xu. Predicting ship fuel consumption based on lasso regression. Transportation Research Part D: Transport and Environment, 2017.
  25. Yaozheng Wang, Dawei Feng, Dongsheng Li, Xinyuan Chen, Yunxiang Zhao, and Xin Niu. A mobile recommendation system based on logistic regression and gradient boosting decision trees. In IJCNN, pages 1896–1902, 2016.
  26. Hongyan Wu, Yunpeng Cai, Yongsheng Wu, Ren Zhong, Qi Li, Jing Zheng, Denan Lin, and Ye Li. Time series analysis of weekly influenza-like illness rate using a one-year period of factors in random forest regression. Bioscience trends, 11(3):292–296, 2017.
  27. Junbo Zhang, Zejing Wang, Xiangtian Zheng, Lin Guan, and CY Chung. Locally weighted ridge regression for power system online sensitivity identification considering data collinearity. IEEE Transactions on Power Systems, 33(2):1624–1634, 2018.
Index Terms

Computer Science
Information Sciences

Keywords

Ensemble regression, Multi-kernel learning, Kernel regression