I have an imbalanced dataset (6% positives) and I am training an xgboost model through the caret package.
Here is my code:
gbmGrid <- expand.grid(nrounds = 50,
                       eta = 0.4,
                       max_depth = 2,
                       gamma = 0,
                       colsample_bytree = 0.8,
                       min_child_weight = 1,
                       subsample = 1)
ctrl <- trainControl(method = "cv",
                     number = 10,
                     search = "grid",
                     fixedWindow = TRUE,
                     verboseIter = TRUE,
                     returnData = TRUE,
                     returnResamp = "final",
                     savePredictions = "all",
                     classProbs = TRUE,
                     summaryFunction = twoClassSummary,
                     sampling = "smote",
                     selectionFunction = "best",
                     trim = FALSE,
                     allowParallel = TRUE)
classifier <- train(x = training_set[, -1],
                    y = training_set[, 1],
                    method = "xgbTree",
                    metric = "ROC",
                    trControl = ctrl,
                    tuneGrid = gbmGrid)
The problem is that every time I run the train() call, I get different ROC, sensitivity, and specificity values:
            ROC       Sens       Spec
Run 1       0.696084  0.8947368  0.2736111
Run 2       0.6655806 0.8917293  0.2444444
** The expand.grid values are set to the best tuned model.
Does anyone know why the model is unstable?
set.seed() :) - Vivek Kumar
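The comment above points at the cause: caret's CV fold assignment, the SMOTE resampling, and xgboost's own subsampling all draw from R's random number generator, so each run partitions and resamples the data differently. A minimal sketch of the fix, reusing the ctrl, gbmGrid, and training_set objects from the question (the seed value 123 is arbitrary):

```r
library(caret)

# Fixing the RNG state immediately before train() makes the fold splits,
# the SMOTE synthetic samples, and the model fits identical across runs.
set.seed(123)
classifier <- train(x = training_set[, -1],
                    y = training_set[, 1],
                    method = "xgbTree",
                    metric = "ROC",
                    trControl = ctrl,
                    tuneGrid = gbmGrid)

# With allowParallel = TRUE, set.seed() alone may not control the RNG on the
# worker processes. trainControl() has a `seeds` argument for this: a list of
# length B + 1, where B is the number of resamples (10 CV folds here). Each of
# the first B elements is an integer vector with one seed per tuning-parameter
# combination (the grid above has a single row, so length 1), and the last
# element is a single seed for the final model fit.
set.seed(123)
seeds <- vector(mode = "list", length = 11)
for (i in 1:10) seeds[[i]] <- sample.int(10000, 1)
seeds[[11]] <- sample.int(10000, 1)
ctrl$seeds <- seeds
```

Note that the resampled ROC will still be an estimate with fold-to-fold variance; seeding only makes the estimate reproducible, it does not make it less noisy. A repeated-CV scheme (method = "repeatedcv") averages over more resamples if the spread itself is the concern.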