Keras: confusion matrix at every epoch


I can record the loss at every epoch using a Keras Callback, as shown here. Is there a way to compute the confusion matrix and use it as a metric?

Update: I tried defining the following function to return the confusion matrix, but it still does not work properly.

from keras import backend as K

def con_mat(y_true, y_pred):
    total_correct_true = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))  # true positives
    total_true = K.sum(y_true)                                          # actual positives (TP + FN)
    predicted_true = K.sum(K.round(y_pred))                             # predicted positives (TP + FP)
    return total_correct_true / (total_true + predicted_true)

I think the true-positive logic is correct, but it does not work as expected.
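A quick numeric sanity check of this formula (my own sketch, using plain NumPy in place of the backend ops) shows why its values look off: the denominator counts the true positives twice, so even a perfect prediction evaluates to 0.5 rather than 1.

import numpy as np

# Stand-in for con_mat above, evaluated on a perfect prediction.
y_true = np.array([1., 1., 0., 0.])
y_pred = np.array([1., 1., 0., 0.])

tp = np.sum(np.round(np.clip(y_true * y_pred, 0, 1)))   # 2
total_true = np.sum(y_true)                              # 2 (TP + FN)
predicted_true = np.sum(np.round(y_pred))                # 2 (TP + FP)
print(tp / (total_true + predicted_true))                # 0.5, not 1.0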


You can implement your own callback. - Eli Korvigo
Did you ever solve this? I'm running into the same problem and was wondering whether you had figured it out. - W. Sam
1 Answer


In short, just pass the following functions as metrics to model.compile:

from keras import backend as K

def recall_m(y_true, y_pred): # TPR
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1))) # TP
    possible_positives = K.sum(K.round(K.clip(y_true, 0, 1))) # P
    recall = true_positives / (possible_positives + K.epsilon())
    return recall

def precision_m(y_true, y_pred):
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1))) # TP
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1))) # TP + FP
    precision = true_positives / (predicted_positives + K.epsilon())
    return precision

def f1_m(y_true, y_pred):
    precision = precision_m(y_true, y_pred)
    recall = recall_m(y_true, y_pred)
    return 2*((precision*recall)/(precision+recall+K.epsilon()))

def TP(y_true, y_pred):
    tp = K.sum(K.round(K.clip(y_true * y_pred, 0, 1))) # TP
    y_pos = K.round(K.clip(y_true, 0, 1))
    n_pos = K.sum(y_pos)
    y_neg = 1 - y_pos
    n_neg = K.sum(y_neg)
    n = n_pos + n_neg
    return tp/n

def TN(y_true, y_pred):
    y_pos = K.round(K.clip(y_true, 0, 1))
    n_pos = K.sum(y_pos)
    y_neg = 1 - y_pos
    n_neg = K.sum(y_neg)
    n = n_pos + n_neg
    y_pred_pos = K.round(K.clip(y_pred, 0, 1))
    y_pred_neg = 1 - y_pred_pos
    tn = K.sum(K.round(K.clip(y_neg * y_pred_neg, 0, 1))) # TN
    return tn/n

def FP(y_true, y_pred):
    y_pos = K.round(K.clip(y_true, 0, 1))
    n_pos = K.sum(y_pos)
    y_neg = 1 - y_pos
    n_neg = K.sum(y_neg)
    n = n_pos + n_neg
    fp = K.sum(K.round(K.clip(y_neg * y_pred, 0, 1))) # FP
    return fp/n

def FN(y_true, y_pred):
    y_pos = K.round(K.clip(y_true, 0, 1))
    n_pos = K.sum(y_pos)
    y_neg = 1 - y_pos
    n_neg = K.sum(y_neg)
    n = n_pos + n_neg
    y_pred_pos = K.round(K.clip(y_pred, 0, 1))
    y_pred_neg = 1 - y_pred_pos
    fn = K.sum(K.round(K.clip(y_true * y_pred_neg, 0, 1))) # FN
    return fn/n

Then:

from keras import optimizers

model.compile(loss='binary_crossentropy',
              optimizer=optimizers.RMSprop(lr=lr),
              metrics=['accuracy', f1_m, precision_m, recall_m, TP, TN, FP, FN])
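These metrics report each confusion-matrix cell as a scalar rate, averaged over batches. If you want the full confusion matrix once per epoch, as the question asks (and as Eli Korvigo's comment suggests), a custom callback is a more direct fit. The sketch below is my own illustration, not part of the original answer; it assumes binary classification with a sigmoid output and a held-out validation set (X_val and y_val are placeholder names), and uses sklearn.metrics.confusion_matrix.

import numpy as np
from sklearn.metrics import confusion_matrix
from keras.callbacks import Callback

class ConfusionMatrixLogger(Callback):
    """Compute and print the confusion matrix on held-out data after each epoch."""

    def __init__(self, X_val, y_val):
        super(ConfusionMatrixLogger, self).__init__()
        self.X_val = X_val
        self.y_val = y_val

    def on_epoch_end(self, epoch, logs=None):
        # Binary case: threshold predicted probabilities at 0.5.
        y_prob = self.model.predict(self.X_val)
        y_pred = (y_prob > 0.5).astype(int).ravel()
        cm = confusion_matrix(self.y_val, y_pred)
        print('Epoch %d confusion matrix:\n%s' % (epoch + 1, cm))

# Usage (placeholder data names):
# model.fit(X_train, y_train, epochs=10,
#           callbacks=[ConfusionMatrixLogger(X_val, y_val)])

Because the callback sees the whole validation set at once, it avoids the batch-averaging that affects metrics passed to model.compile.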
