Improving Localized Multiple Kernel Learning via Radius-Margin Bound


Mathematical Problems in Engineering, Volume 2017, Article ID 4579214, 12 pages

Research Article

School of Computer and Software Engineering, Xihua University, Chengdu 610039, China

Robotics Research Center, Xihua University, Chengdu 610039, China

Correspondence should be addressed to Xiaoming Wang

Received 28 July 2016; Accepted 21 November 2016; Published 9 January 2017

Academic Editor: Wanquan Liu

Copyright © 2017 Xiaoming Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Localized multiple kernel learning (LMKL) is an effective method of multiple kernel learning (MKL). It learns the optimal kernel from a set of predefined basic kernels by directly applying the maximum margin principle embodied in the support vector machine (SVM). However, LMKL does not consider the radius of the minimum enclosing ball (MEB), which affects the error bound of SVM together with the separating margin. In this paper, we propose an improved version of LMKL, named ILMKL. The proposed method explicitly takes both the margin and the radius into consideration and thereby achieves better performance than its counterpart. Moreover, it automatically tunes the regularization parameter while learning the optimal kernel, avoiding the time-consuming cross-validation procedure otherwise needed to choose this parameter. Comprehensive experiments are conducted, and the results demonstrate the effectiveness and efficiency of the proposed method.
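The page carries only the abstract, so the following is a small illustrative sketch rather than the authors' implementation. It shows the radius ingredient that distinguishes ILMKL from margin-only LMKL: a standard kernel-space upper bound on the squared MEB radius (the largest squared feature-space distance from any mapped point to the kernel mean), computed here for a convex combination of basic RBF kernels as in MKL. All function names, the kernel choices, and the weights are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def combined_kernel(kernels, weights):
    """Convex combination of basic kernels, as used in MKL."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize to the simplex
    return sum(w * K for w, K in zip(weights, kernels))

def meb_radius_sq_bound(K):
    """Upper bound on the squared MEB radius in feature space:
    the maximum squared distance from a mapped point to the
    kernel mean of all points, computed via the kernel trick."""
    n = K.shape[0]
    mean_all = K.sum() / n ** 2
    dist_sq = np.diag(K) - 2.0 * K.mean(axis=1) + mean_all
    return float(dist_sq.max())

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
K = combined_kernel([rbf_kernel(X, 0.5), rbf_kernel(X, 2.0)], [0.7, 0.3])
print(meb_radius_sq_bound(K))  # a small positive value (< 1 for RBF kernels)
```

A radius-margin objective in the spirit of the abstract would then minimize a quantity of the form R²·‖w‖² instead of ‖w‖² alone, so the learned kernel weights cannot inflate the margin simply by inflating the data's enclosing ball.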

Authors: Xiaoming Wang, Zengxi Huang, and Yajun Du


