confusionmatrix

Purpose

Create a confusion matrix showing classification rates from a classification model or from a list of actual classes and a list of predicted classes.

Synopsis

[misclassed, classids, texttable] = confusionmatrix(model);  % create confusion matrix from classifier model
[misclassed, classids, texttable] = confusionmatrix(model, usecv);  % create confusion matrix from model using CV results
[misclassed, classids, texttable] = confusionmatrix(trueClass, predClass); % create confusion matrix from vectors of true and pred classes

Description

Confusionmatrix creates a table of results showing True Positive, False Positive, True Negative and False Negative rates (TPR FPR TNR FNR) for each class modeled in an input model. The 'most probable' predicted class is used when a model is input. Input models must be of type PLSDA, SVMDA, KNN, or SIMCA.

The optional second parameter "usecv" specifies use of the cross-validation based classifications in "model.detail.cvmisclassification" instead of the default self-prediction classifications in "model.classification".
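
For example, assuming a previously built classification model stored in the variable model (and that cross-validation results are available in it), the two model-based calls look like:

  confusionmatrix(model)      % default: confusion matrix from self-prediction classifications
  confusionmatrix(model, 1)   % usecv = 1: confusion matrix from cross-validation classifications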

Alternatively, the input can consist of a vector of true classes and a vector of predicted classes instead of a model.
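
For instance, using two hypothetical numeric class vectors (values chosen only for illustration):

  trueClass = [1 1 1 1 2 2 2 2];    % actual class of each sample (hypothetical)
  predClass = [1 1 1 2 2 2 2 1];    % predicted class of each sample (hypothetical)
  confusionmatrix(trueClass, predClass)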

Classification rates are defined as:

TPR: proportion of positive cases that were correctly identified (Sensitivity), = TP/(TP+FN)
FPR: proportion of negative cases that were incorrectly classified as positive, = FP/(FP+TN)
TNR: proportion of negative cases that were correctly classified (Specificity), = TN/(TN+FP)
FNR: proportion of positive cases that were incorrectly classified as negative, = FN/(FN+TP)
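
For example, with hypothetical counts for a single class of TP = 8, FN = 2, FP = 4 and TN = 36, these rates work out as:

  TPR = 8/(8+2)   = 0.80
  FPR = 4/(4+36)  = 0.10
  TNR = 36/(36+4) = 0.90
  FNR = 2/(2+8)   = 0.20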

Four additional fields are also shown for each class:

N:   Number of samples belonging to each class
Err: Misclassification error = proportion of samples which were incorrectly classified, 
        = 1-accuracy, = (FP+FN)/(TP+TN+FP+FN)
P:   Precision, = TP/(TP+FP)
F1:  F1 Score, = 2*TP/(2*TP+FP+FN)

where TP/TN/FP/FN refer to the counts rather than the rates for these quantities.
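
Continuing the hypothetical counts above (TP = 8, FN = 2, FP = 4, TN = 36), the additional fields work out as:

  N   = 8 + 2            = 10
  Err = (4+2)/(8+36+4+2) = 0.12
  P   = 8/(8+4)          = 0.6667
  F1  = 2*8/(2*8+4+2)    = 0.7273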


Inputs

  • model = previously generated classifier model or pred structure,
  • usecv = 0 or 1. 0 indicates the confusion matrix should be based on self-prediction results; 1 indicates it should be based on cross-validation results (assuming they are available in the model),
  • trueClass = vector of numeric values indicating the true sample classes,
  • predClass = vector of numeric values indicating the predicted sample classes.

Outputs

  • misclassed = confusion matrix, nclasses x 4 array, one row per class; columns are the True Positive, False Positive, True Negative and False Negative rates (TPR FPR TNR FNR),
  • classids = class names (identifiers),
  • texttable = cell array containing a text representation of the confusion matrix. The i-th element of the cell array, texttable{i}, is the i-th line of the text table. If there are only two classes then the Matthews Correlation Coefficient value is included as the last line. Note that this text representation of the confusion matrix is displayed if the function is called with no output assignment (see the Example below).

Example

Calling confusionmatrix with no output variables assigned, for example 'confusionmatrix(model)', displays the output:

>> confusionmatrix(model)
Confusion Matrix:

   Class:      TPR         FPR         TNR         FNR         N      Err         P           F1     
       K       0.58824     0.09091     0.90909     0.41176     17     0.20000     0.76923     0.66667
       BL      0.83333     0.11364     0.88636     0.16667      6     0.12000     0.50000     0.62500
       SH      0.72727     0.10256     0.89744     0.27273     11     0.14000     0.66667     0.69565
       AN      0.87500     0.02941     0.97059     0.12500     16     0.06000     0.93333     0.90323
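
The outputs can also be captured and the text table printed programmatically; a minimal sketch, assuming the same model variable as above:

  [misclassed, classids, texttable] = confusionmatrix(model);  % capture results instead of displaying them
  fprintf('%s\n', texttable{:});                               % print the text table one line per cell element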

See Also

confusiontable, plsda, svmda, knn, simca