Abstract
Class imbalance, where the numbers of samples differ across classes, is a ubiquitous and widely studied problem in machine learning. The information gain (IG) algorithm is widely used for feature selection, yet its behavior on imbalanced problems has seldom been studied. After examining the performance of IG on data sets with different degrees of class balance, this paper proposes Im-IG (imbalanced information gain), a new feature selection algorithm for imbalanced problems. Im-IG raises the weight of the minority-class distribution in the entropy calculation, so that features favoring the correct separation of the minority class are selected first; it aims to improve minority-class accuracy while also improving overall classification performance. Experimental results on several imbalanced data sets show that Im-IG resolves the inadequacy of IG on imbalanced problems and is an effective feature selection algorithm for them.
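The core idea described above, giving the minority class a larger weight in the entropy computation so that IG favors features that separate it, can be sketched as follows. This is a minimal illustration, not the paper's exact Im-IG formulation: the inverse-frequency weights and the renormalization step are assumptions made for the sketch.

```python
from collections import Counter
import math

def weighted_entropy(labels, class_weights):
    """Entropy of a label list with per-class weights (assumed scheme).

    Each class count is scaled by its weight and the scaled counts are
    renormalized so they sum to 1 before taking the usual entropy.
    """
    counts = Counter(labels)
    weighted = {c: class_weights[c] * n for c, n in counts.items()}
    total = sum(weighted.values())
    return -sum((w / total) * math.log2(w / total)
                for w in weighted.values() if w > 0)

def im_ig(feature_values, labels):
    """Minority-weighted information gain of one discrete feature.

    Inverse-frequency weights (a hypothetical choice, not from the paper)
    give the minority class more influence on the entropy, so features
    that cleanly separate the minority class receive higher scores.
    """
    counts = Counter(labels)
    weights = {c: len(labels) / n for c, n in counts.items()}
    base = weighted_entropy(labels, weights)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        cond += len(subset) / len(labels) * weighted_entropy(subset, weights)
    return base - cond
```

With an 8:2 imbalanced label set, a feature that exactly isolates the two minority samples scores the full weighted gain, while a feature uncorrelated with the class scores near zero, which is the selection behavior the abstract attributes to Im-IG.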
Source
Journal of Shandong University (Engineering Science) (《山东大学学报(工学版)》)
Indexed in: CAS; Peking University Core Journals (北大核心)
2010, No. 5, pp. 123-128 (6 pages)
Funding
National Natural Science Foundation of China (60873129, 30901897)
Shanghai Rising-Star Program for young scientists (08QA1403200)