Regression Analysis Using Core Vector Machine Technique Based on Kernel Function Optimization
Subject Areas: Computer Engineering
Babak Afshin 1, Mohammad Ebrahim Shiri 2, Kamran Layeghi 3, Hamid HajSeyyedJavadi 4
1 - Department of Computer Engineering, North Tehran Branch, Islamic Azad University, Tehran, Iran
2 - Department of Mathematics and Computer Sciences, Amirkabir University of Technology, Tehran, Iran
3 - Department of Computer Engineering, North Tehran Branch, Islamic Azad University, Tehran, Iran
4 - Department of Mathematics and Computer Sciences, Shahed University, Tehran, Iran
Keywords: Kernel function, Core vector regression, Grid algorithm, Parameter selection
Abstract:
Core vector regression (CVR) is an extension of the core vector machine algorithm to regression estimation, obtained by generalizing the minimum enclosing ball (MEB) problem. Both the kernel function and its parameters can significantly affect the prediction accuracy of the CVR estimator. In this paper, a method is proposed to improve CVR performance using pre-processing based on data feature extraction and a Grid algorithm to obtain appropriate parameter values for the main formulation and its kernel function. The mean absolute error of the CVR estimate is the evaluation criterion of the proposed method and is to be minimized. In addition, several benchmark datasets from different repositories were used to evaluate the proposed parameter optimization approach. The numerical results show that the proposed method can reduce the CVR error with acceptable time and space complexity; it is therefore able to handle very large data and real-world regression problems.
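The following is a minimal sketch of the kind of pipeline the abstract describes: feature-extraction pre-processing followed by a grid search over the regressor and kernel parameters, with mean absolute error as the selection criterion. It assumes scikit-learn's epsilon-SVR as a stand-in for a CVR solver (no public Python CVR implementation is used here), PCA as an illustrative feature-extraction step, and a toy dataset and grid values chosen only for demonstration.

```python
# Hedged sketch: grid search over pre-processing and kernel parameters,
# scored by mean absolute error, in the spirit of the proposed approach.
# SVR stands in for the CVR estimator; PCA stands in for the feature
# extraction pre-processing; dataset and grid values are illustrative.
from sklearn.datasets import load_diabetes
from sklearn.decomposition import PCA
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

pipeline = Pipeline([
    ("scale", StandardScaler()),    # standardize features before the kernel
    ("extract", PCA()),             # feature-extraction pre-processing (illustrative)
    ("reg", SVR(kernel="rbf")),     # stand-in for the CVR estimator
])

# Grid over the pre-processing, regularization, and kernel parameters.
param_grid = {
    "extract__n_components": [5, 8, 10],
    "reg__C": [1, 10, 100],
    "reg__gamma": [0.01, 0.1, 1.0],
    "reg__epsilon": [0.05, 0.1, 0.5],
}

# Mean absolute error (negated, since scikit-learn maximizes scores) drives the search.
search = GridSearchCV(pipeline, param_grid,
                      scoring="neg_mean_absolute_error", cv=5)
search.fit(X_train, y_train)

print("best parameters:", search.best_params_)
print("test MAE:", mean_absolute_error(y_test, search.predict(X_test)))
```

In this sketch the grid search jointly selects the pre-processing dimensionality and the kernel/regularization parameters against the same MAE criterion used for evaluation, which mirrors the parameter-selection role the Grid algorithm plays in the proposed method.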