[1] ZHOU T, LU H L, WANG W W, et al. GA-SVM based feature selection and parameter optimization in hospitalization expense modeling[J]. Applied Soft Computing, 2019, 75: 323-332.
[2] LI J D, CHENG K W, WANG S H, et al. Feature selection: a data perspective[J]. ACM Computing Surveys, 2017, 50(6): 1-45.
[3] ZHOU Z H. Machine learning[M]. Beijing: Tsinghua University Press, 2016.
[4] LIU H, YU L. Toward integrating feature selection algorithms for classification and clustering[J]. IEEE Transactions on Knowledge and Data Engineering, 2005, 17(4): 491-502.
[5] ALMUALLIM H, DIETTERICH T G. Learning boolean concepts in the presence of many irrelevant features[J]. Artificial Intelligence, 1994, 69(1-2): 279-305.
[6] KAMATH U, DE J K, SHEHU A. Effective automated feature construction and selection for classification of biological sequences[J]. PLoS ONE, 2014, 9(7): e99982.
[7] GUYON I, ELISSEEFF A. An introduction to variable and feature selection[J]. Journal of Machine Learning Research, 2003, 3: 1157-1182.
[8] ZAKERI A, HOKMABADI A. Efficient feature selection method using real-valued grasshopper optimization algorithm[J]. Expert Systems with Applications, 2019, 119: 61-72.
[9] XUE B, ZHANG M, BROWNE W N, et al. A survey on evolutionary computation approaches to feature selection[J]. IEEE Transactions on Evolutionary Computation, 2016, 20(4): 606-626.
[10] GHAEMI M, FEIZI-DERAKHSHI M R. Feature selection using forest optimization algorithm[J]. Pattern Recognition, 2016, 60: 121-129.
[11] ZHANG Y, SONG X, GONG D. A return-cost-based binary firefly algorithm for feature selection[J]. Information Sciences, 2017, 418-419: 561-574.
[12] CHEN T, GUESTRIN C. XGBoost: a scalable tree boosting system[C]// Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2016: 785-794.
[13] FRIEDMAN J, HASTIE T, TIBSHIRANI R. Additive logistic regression: a statistical view of boosting[J]. The Annals of Statistics, 2000, 28(2): 337-374.
[14] LU Y, LIU L, LUAN S, et al. The diagnostic value of texture analysis in predicting WHO grades of meningiomas based on ADC maps: an attempt using decision tree and decision forest[J]. European Radiology, 2019, 29(3): 1318-1328.
[15] PANG L, WANG J, ZHAO L, et al. A novel protein subcellular localization method with CNN-XGBoost model for Alzheimer's disease[J]. Frontiers in Genetics, 2019, 9: 1-7.
[16] PAN B. Application of XGBoost algorithm in hourly PM2.5 concentration prediction[J]. IOP Conference Series: Earth and Environmental Science, 2018, 113: 012127.
[17] MACEDO F, ROSÁRIO OLIVEIRA M, PACHECO A, et al. Theoretical foundations of forward feature selection methods based on mutual information[J]. Neurocomputing, 2019, 325: 67-89.
[18] VERGARA J R, ESTÉVEZ P A. A review of feature selection methods based on mutual information[J]. Neural Computing and Applications, 2014, 24(1): 175-186.
[19] SHI F, YAO Y, BIN Y, et al. Computational identification of deleterious synonymous variants in human genomes using a feature-based approach[J]. BMC Medical Genomics, 2019, 12(1): 81-88.
[20] XUE B, ZHANG M, BROWNE W N. Novel initialisation and updating mechanisms in PSO for feature selection in classification[C]// Applications of Evolutionary Computation (LNCS 7835). Berlin: Springer, 2013: 428-438.
[21] GHOSH A, DATTA A, GHOSH S. Self-adaptive differential evolution for feature selection in hyperspectral image data[J]. Applied Soft Computing, 2013, 13(4): 1969-1977.
[22] DUA D, KARRA TANISKIDOU E. UCI machine learning repository[DB/OL]. Irvine: University of California, School of Information and Computer Sciences, 2019.
[23] BROWN G, POCOCK A C, ZHAO M J, et al. Conditional likelihood maximisation: a unifying framework for information theoretic feature selection[J]. Journal of Machine Learning Research, 2012, 13: 27-66.
[24] CADENAS J M, GARRIDO M C, MARTÍNEZ R. Feature subset selection filter-wrapper based on low quality data[J]. Expert Systems with Applications, 2013, 40(16): 6241-6252.
[25] MAFARJA M M, MIRJALILI S. Hybrid whale optimization algorithm with simulated annealing for feature selection[J]. Neurocomputing, 2017, 260: 302-312.
[26] EMARY E, ZAWBAA H M, HASSANIEN A E. Binary grey wolf optimization approaches for feature selection[J]. Neurocomputing, 2016, 172: 371-381.