Low-rank matrix decomposition and kernel learning are two useful techniques for building advanced learning systems. Low-rank decomposition can greatly reduce the computational cost of manipulating large kernel matrices. However, existing approaches are mostly unsupervised and do not incorporate side information such as class labels, making the decomposition less effective for a specific learning task. On the other hand, kernel learning techniques aim at constructing kernel matrices whose structure is well aligned with the learning target, which improves the generalization performance of kernel methods. However, most kernel learning approaches are computationally very expensive. To obtain the advantages of both techniques while addressing their limitations, in this paper we propose a novel kernel low-rank decomposition formulation called the generalized Nyström method. Our approach inherits the linear time and space complexity of matrix decomposition while fully exploiting (partial) label information to compute a task-dependent decomposition. In addition, the resulting low-rank factors generalize to arbitrary new samples, providing great flexibility in inductive learning scenarios. We further extend the algorithm to a multiple kernel learning setup. Experimental results on semi-supervised classification demonstrate the usefulness of the proposed method.
- Kernel learning;
- Nyström low-rank decomposition;
- Large-scale learning algorithms;
- Multiple kernel learning
© 2017 Elsevier B.V. All rights reserved.
Authors: Liang Lan, Kai Zhang, Hancheng Ge, Wei Cheng, Jun Liu, Andreas Rauber, Xiao-Li Li, Jun Wang, Hongyuan Zha
Publication: Artificial Intelligence (ScienceDirect), http://ift.tt/2pHrTmi