A confusion matrix (Kohavi and Provost, 1998) contains information about actual and predicted classifications done by a classification system. Performance of such systems is commonly evaluated using the data in the matrix. The following table shows the confusion matrix for a two-class classifier.
The entries in the confusion matrix have the following meaning in the context of our study:
- a is the number of correct predictions that an instance is negative,
- b is the number of incorrect predictions that an instance is positive,
- c is the number of incorrect predictions that an instance is negative, and
- d is the number of correct predictions that an instance is positive.
| | Predicted Negative | Predicted Positive |
|---|---|---|
| **Actual Negative** | a | b |
| **Actual Positive** | c | d |
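To make the entries concrete, the following is a minimal sketch (not part of the original text) that counts a, b, c, and d from lists of actual and predicted labels; the names `y_actual` and `y_predicted`, and the 0/1 label encoding, are illustrative assumptions.

```python
def confusion_entries(y_actual, y_predicted):
    """Count the four entries of a two-class confusion matrix.

    Labels are assumed to be 0 = negative, 1 = positive.
    """
    a = b = c = d = 0
    for actual, predicted in zip(y_actual, y_predicted):
        if actual == 0 and predicted == 0:
            a += 1  # correct prediction that the instance is negative
        elif actual == 0 and predicted == 1:
            b += 1  # incorrect prediction that the instance is positive
        elif actual == 1 and predicted == 0:
            c += 1  # incorrect prediction that the instance is negative
        else:
            d += 1  # correct prediction that the instance is positive
    return a, b, c, d

# Example usage with six instances
y_actual    = [0, 0, 1, 1, 0, 1]
y_predicted = [0, 1, 1, 0, 0, 1]
print(confusion_entries(y_actual, y_predicted))  # (2, 1, 1, 2)
```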
Several standard terms have been defined for the two-class matrix (a code sketch computing them follows this list):
- The accuracy (AC) is the proportion of the total number of predictions that were correct. It is determined using the equation:
$AC = \frac{a + d}{a + b + c + d}$  [1]
- The recall or true positive rate (TP) is the proportion of positive cases that were correctly identified, as calculated using the equation:
$TP = \frac{d}{c + d}$  [2]
- The false positive rate (FP) is the proportion of negative cases that were incorrectly classified as positive, as calculated using the equation:
$FP = \frac{b}{a + b}$  [3]
- The true negative rate (TN) is defined as the proportion of negative cases that were classified correctly, as calculated using the equation:
$TN = \frac{a}{a + b}$  [4]
- The false negative rate (FN) is the proportion of positive cases that were incorrectly classified as negative, as calculated using the equation:
$FN = \frac{c}{c + d}$  [5]
- Finally, precision (P) is the proportion of the predicted positive cases that were correct, as calculated using the equation:
$P = \frac{d}{b + d}$  [6]
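Putting equations 1 through 6 together, here is a small sketch, assuming the entry counts a, b, c, and d from the matrix above; the zero-denominator guard is an added convenience, not part of the original definitions.

```python
def two_class_metrics(a, b, c, d):
    """Compute the six standard measures (equations 1-6) from the matrix entries."""
    def safe_div(num, den):
        return num / den if den else 0.0  # guard against empty denominators
    return {
        "AC": safe_div(a + d, a + b + c + d),  # equation 1: accuracy
        "TP": safe_div(d, c + d),              # equation 2: recall / true positive rate
        "FP": safe_div(b, a + b),              # equation 3: false positive rate
        "TN": safe_div(a, a + b),              # equation 4: true negative rate
        "FN": safe_div(c, c + d),              # equation 5: false negative rate
        "P":  safe_div(d, b + d),              # equation 6: precision
    }

# Example usage with the counts from the earlier sketch (a=2, b=1, c=1, d=2)
print(two_class_metrics(2, 1, 1, 2))
```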
The accuracy determined using equation 1 may not be an adequate performance measure when the number of negative cases is much greater than the number of positive cases (Kubat et al., 1998). Suppose there are 1000 cases, 995 of which are negative cases and 5 of which are positive cases. If the system classifies them all as negative, the accuracy would be 99.5%, even though the classifier missed all positive cases. Other performance measures account for this by including TP in a product: for example, geometric mean (g-mean) (Kubat et al., 1998), as defined in equations 7 and 8, and F-Measure (Lewis and Gale, 1994), as defined in equation 9.
$g\text{-}mean_1 = \sqrt{TP \times P}$  [7]

$g\text{-}mean_2 = \sqrt{TP \times TN}$  [8]

$F = \frac{(\beta^2 + 1) \times P \times TP}{\beta^2 \times P + TP}$  [9]
In equation 9, β has a value from 0 to infinity and is used to control the weight assigned to TP and P. Any classifier evaluated using equations 7, 8, or 9 will have a measure value of 0 if all positive cases are classified incorrectly.
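As an illustration of that point, the sketch below (an added example, not from the original text) applies equations 7 through 9 to the skewed scenario described above: 995 negative and 5 positive cases, all classified as negative, so a=995, b=0, c=5, d=0. The zero-denominator guards are added assumptions.

```python
import math

a, b, c, d = 995, 0, 5, 0  # every case predicted negative

accuracy = (a + d) / (a + b + c + d)           # equation 1
tp = d / (c + d)                               # equation 2: recall
tn = a / (a + b)                               # equation 4
p  = d / (b + d) if (b + d) else 0.0           # equation 6, guarded against 0/0

g_mean1 = math.sqrt(tp * p)                    # equation 7
g_mean2 = math.sqrt(tp * tn)                   # equation 8

beta = 1.0                                     # beta = 1 weights TP and P equally
denom = beta**2 * p + tp
f_measure = (beta**2 + 1) * p * tp / denom if denom else 0.0  # equation 9

print(accuracy)                     # 0.995 -- accuracy looks excellent
print(g_mean1, g_mean2, f_measure)  # 0.0 0.0 0.0 -- every positive case was missed
```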