Abstract

Existing supervised learning techniques can support product recommendations but are ineffective in scenarios characterized by single-class learning; i.e., training samples that consist of some positive examples and a much greater number of unlabeled examples. To address the limitations of existing single-class learning techniques, we develop COst-sensitive Learning-based Positive Example Learning (COLPEL), which constructs an automated classifier from a single-class training sample. Our method employs cost-proportionate rejection sampling to derive, from the unlabeled examples, a subset likely to contain negative examples, according to their respective misclassification costs. COLPEL follows a committee machine strategy, constructing a set of automated classifiers whose combined predictions reduce the biases common to a single classifier. We use customers’ book ratings from the Amazon.com Web site to evaluate COLPEL, with PNB and PEBL as benchmarks. Our results show that COLPEL outperforms both PNB and PEBL, as measured by accuracy, positive F1 score, and negative F1 score.
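The cost-proportionate rejection sampling step mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes only the standard formulation of the technique, in which each example is kept with probability equal to its cost divided by an upper bound on the costs, so higher-cost unlabeled examples are more likely to enter the candidate-negative subset. The example data and cost values are hypothetical.

```python
import random

def cost_proportionate_rejection_sample(examples, costs, z=None, seed=0):
    """Keep each example with probability cost / z, where z >= max(costs).

    Higher-cost examples (those more costly to misclassify) are retained
    with higher probability; the result is a biased subsample of the input.
    """
    rng = random.Random(seed)
    if z is None:
        z = max(costs)  # any upper bound on the costs is valid
    return [x for x, c in zip(examples, costs) if rng.random() < c / z]

# Hypothetical unlabeled examples with assumed misclassification costs.
examples = list(range(10))
costs = [0.1, 0.2, 0.9, 0.8, 0.1, 0.95, 0.3, 0.7, 0.05, 0.6]
subset = cost_proportionate_rejection_sample(examples, costs)
```

In a committee-machine setting such as COLPEL's, this sampling can be repeated with different seeds to obtain several candidate-negative subsets, each training one member of the classifier committee.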
