
A user churn prediction method based on multi-model fusion
Cited by: 5
Abstract: Accurate user churn prediction helps enterprises improve user retention, grow their user base, and increase profitability. Most existing churn prediction models are either single models or simple combinations of multiple models, and so do not fully exploit the advantages of multi-model ensembles. Drawing on the Bootstrap Sampling idea of random forests, this paper proposes an improved Stacking ensemble method and applies it to a real data set to predict user churn. Experimental comparison on the validation set shows that the proposed method outperforms the classical Stacking ensemble of the same structure on all three metrics: churn-class F1-score, recall, and prediction accuracy. With an appropriate ensemble structure, its performance can surpass the best performance of the base classifiers.
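The abstract's core idea, injecting random-forest-style Bootstrap Sampling into the Stacking framework, can be sketched as follows. The base classifiers, data set, and two-level structure below are illustrative assumptions, since this record does not give the paper's exact sampling scheme or model choices; it is a minimal sketch, not the authors' implementation.

```python
# Sketch: a two-level Stacking ensemble in which each level-0 base
# classifier is trained on its own bootstrap sample of the training
# set (sampling with replacement, as in random forests). The concrete
# estimators and data are illustrative assumptions.
import numpy as np
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
# Imbalanced synthetic data standing in for a churn data set
# (minority class = churned users).
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, stratify=y,
                                            random_state=0)

base_learners = [
    DecisionTreeClassifier(max_depth=5, random_state=0),
    RandomForestClassifier(n_estimators=50, random_state=0),
    LogisticRegression(max_iter=1000),
]

# Level 0: fit each base learner on a bootstrap sample, then record
# its predicted churn probability as one meta-feature column.
# (A production Stacking setup would use out-of-fold predictions here
# to avoid leaking training labels into the meta-classifier.)
fitted, meta_cols = [], []
for est in base_learners:
    idx = rng.randint(0, len(X_tr), len(X_tr))  # sample with replacement
    est = clone(est).fit(X_tr[idx], y_tr[idx])
    fitted.append(est)
    meta_cols.append(est.predict_proba(X_tr)[:, 1])
Z_tr = np.column_stack(meta_cols)

# Level 1: a meta-classifier combines the base predictions.
meta = LogisticRegression(max_iter=1000).fit(Z_tr, y_tr)

Z_val = np.column_stack([e.predict_proba(X_val)[:, 1] for e in fitted])
print("stacked validation accuracy: %.3f" % meta.score(Z_val, y_val))
```

The bootstrap step decorrelates the base classifiers, which is the variance-reduction benefit the abstract borrows from random forests.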
Authors: YE Cheng; ZHENG Hong; CHENG Yun-hui (School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China)
Source: Computer Engineering & Science (《计算机工程与科学》), 2019, No. 11, pp. 2027-2032 (6 pages); indexed in CSCD and the Peking University Core Journals list.
Funding: National Natural Science Foundation of China (61103115, 61103172); Shanghai Municipal Science and Technology Commission Innovation Action Plan, High and New Technology Field Project (16511101000).
Keywords: Stacking ensemble learning; user churn prediction; Bootstrap Sampling; machine learning
