doi: 10.4304/jsw.9.6.1494-1502
Building a Biased Least Squares Support Vector Machine Classifier for Positive and Unlabeled Learning
Abstract—Learning from positive and unlabeled examples (PU learning) is a special case of semi-supervised binary classification. The key feature of PU learning is that there is no labeled negative training data, which makes traditional classification techniques inapplicable. Following the idea of Biased-SVM, one of the best-known classifiers for this setting, a biased least squares support vector machine classifier (Biased-LSSVM) is proposed for PU learning in this paper. More specifically, we treat unlabeled examples as negative examples with noise and build a least squares support vector machine classifier with two penalty parameters Cp and Cn that weight the misclassification errors of positive and negative examples, respectively. Since PU learning places more emphasis on classifying as many positive examples as possible correctly, the parameters satisfy Cp ≥ Cn. Compared with Biased-SVM, the proposed classifier has three advantages. First, Biased-LSSVM reflects the class labels of all examples more fully and accurately than Biased-SVM. Second, Biased-LSSVM is more stable than Biased-SVM: its performance varies less over a wide range of ratios of positive examples in the unlabeled set. Finally, the time complexity of Biased-LSSVM is lower than that of Biased-SVM, since Biased-LSSVM only needs to solve a system of linear equations whereas Biased-SVM requires solving a quadratic programming problem. Experiments on two real applications, text classification and bioinformatics classification, verify these claims and show that Biased-LSSVM is more effective than Biased-SVM and other popular methods such as EB-SVM, ROC-SVM and S-EM.
Index Terms—positive and unlabeled learning, least squares support vector machine, text classification, bioinformatics classification
Cite: Ting Ke, Lujia Song, Bing Yang, Xinbin Zhao, Ling Jing, "Building a Biased Least Squares Support Vector Machine Classifier for Positive and Unlabeled Learning," Journal of Software vol. 9, no. 6, pp. 1494-1502, 2014.
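The paper itself is not reproduced on this page, but the abstract describes the training procedure in enough detail to sketch: a least squares SVM whose squared error terms are weighted by Cp for labeled positives and Cn for unlabeled examples (treated as noisy negatives), so that fitting reduces to one linear system instead of a quadratic program. The snippet below is a minimal illustration under those assumptions; the RBF kernel, its parameters, and the function names are ours for illustration, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2.
    sq = (X1 ** 2).sum(axis=1)[:, None] + (X2 ** 2).sum(axis=1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-gamma * sq)

def train_biased_lssvm(X, y, Cp=10.0, Cn=1.0, gamma=0.5):
    # y is +1 for labeled positives and -1 for unlabeled examples treated
    # as noisy negatives; Cp >= Cn biases the fit toward classifying the
    # labeled positives correctly.
    n = y.shape[0]
    Omega = np.outer(y, y) * rbf_kernel(X, X, gamma)
    C = np.where(y > 0, Cp, Cn)            # per-example penalty weight
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.diag(1.0 / C)   # weighted LS-SVM linear system
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)          # one linear solve, no QP
    return sol[1:], sol[0]                 # (alpha, b)

def predict(X_train, y_train, alpha, b, X_test, gamma=0.5):
    # Decision function: sign(sum_i alpha_i * y_i * K(x, x_i) + b).
    K = rbf_kernel(X_test, X_train, gamma)
    return np.sign(K @ (alpha * y_train) + b)
```

With Cp larger than Cn, the solver tolerates errors on the unlabeled (presumed negative) examples more readily than on the labeled positives, which is the bias the abstract refers to.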
General Information
ISSN: 1796-217X (Online)
Abbreviated Title: J. Softw.
Frequency: Quarterly
APC: 500 USD
DOI: 10.17706/JSW
Editor-in-Chief: Prof. Antanas Verikas
Executive Editor: Ms. Cecilia Xie
Abstracting/Indexing: DBLP, EBSCO, CNKI, Google Scholar, ProQuest, INSPEC (IET), ULRICH's Periodicals Directory, WorldCat, etc.
E-mail: jsweditorialoffice@gmail.com