JSW 2019 Vol.14(2): 65-91 ISSN: 1796-217X
doi: 10.17706/jsw.14.2.65-91
PLAC: Partitioning Based Lazy Classification
Wei Song1*, He Jiang2, Fan Ma3, Qinbao Song3, Guangtao Wang3
1 Department of Electrical Engineering and Computer Sciences, University of California at Berkeley, USA
2 School of Software Technology, Dalian University of Technology, China
3 Department of Computer Science and Technology, Xi’an Jiaotong University, China
Abstract— Traditional classification methods cannot adequately capture the characteristics of complex problems, which leads to poor performance. In this paper, we propose a new framework named Partition-based LAzy Classification (PLAC) that better characterizes complex problems by dividing the training data space into smaller, easier-to-learn partitions. In PLAC, only the partition nearest to a new instance is used to train a local classifier, which is then used to classify that instance. Because the partitioning is performed based on information gain before any new instance is received, the resulting partitions are groups of similar instances, and the chance that the nearest instances of a new instance come from different regions by accident is reduced. Moreover, our method uses only one partition per prediction and employs a caching mechanism to avoid repeated work during classification, which improves efficiency. An extensive experimental evaluation on 40 real-world data sets shows that PLAC effectively improves the performance of base classifiers and outperforms existing mainstream ensemble methods.
Index Terms— Classification, eager learning, lazy learning, data partitioning, ensemble learning.
Cite: Wei Song, He Jiang, Fan Ma, Qinbao Song, Guangtao Wang, "PLAC: Partitioning Based Lazy Classification," Journal of Software vol. 14, no. 2, pp. 65-91, 2019.
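As a rough illustration of the workflow the abstract describes, the sketch below partitions the training data with information-gain (entropy-based) splits and then lazily fits and caches one local classifier per partition, reusing the cached classifier for later queries that fall into the same partition. It is a minimal approximation built on scikit-learn, not the authors' implementation: the class name PLACSketch, the GaussianNB base learner, and the min_partition_size parameter are illustrative assumptions, and the paper's own partitioning and nearest-partition search may differ.

```python
# Minimal sketch of the PLAC idea: eager information-gain partitioning,
# lazy per-partition training with caching. Illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB


class PLACSketch:
    def __init__(self, base_learner=GaussianNB, min_partition_size=30):
        self.base_learner = base_learner
        self.min_partition_size = min_partition_size
        self._cache = {}  # partition (leaf) id -> fitted local classifier

    def fit(self, X, y):
        # Eager step: partition the instance space with entropy-based splits,
        # done once before any new instance is received.
        self._partitioner = DecisionTreeClassifier(
            criterion="entropy", min_samples_leaf=self.min_partition_size
        ).fit(X, y)
        self._X, self._y = np.asarray(X), np.asarray(y)
        self._leaf_of_train = self._partitioner.apply(self._X)
        return self

    def predict(self, X):
        X = np.asarray(X)
        leaves = self._partitioner.apply(X)  # partition of each query instance
        preds = np.empty(len(X), dtype=self._y.dtype)
        for i, leaf in enumerate(leaves):
            clf = self._cache.get(leaf)
            if clf is None:
                # Lazy step: train a local classifier on this partition only,
                # then cache it so later queries in the same partition reuse it.
                mask = self._leaf_of_train == leaf
                clf = self.base_learner().fit(self._X[mask], self._y[mask])
                self._cache[leaf] = clf
            preds[i] = clf.predict(X[i : i + 1])[0]
        return preds
```

In this sketch a "partition" is simply a leaf of the entropy-based tree, so locating the partition of a new instance is a single tree traversal rather than an explicit nearest-partition search.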