BinaryConfusionMatrix Class

Namespace: FinMath.MachineLearning

[SerializableAttribute]
public class BinaryConfusionMatrix
The BinaryConfusionMatrix type exposes the following members.
Constructors

| Name | Description |
|---|---|
| BinaryConfusionMatrix | Initializes a new instance of the BinaryConfusionMatrix class. |
Properties

| Name | Description |
|---|---|
| Accuracy | The ratio of correct predictions to the total number of tests. |
| DiagnosticOddsRatio | The diagnostic odds ratio, a measure of the effectiveness of a test. |
| F1 | The harmonic mean of Precision and Recall. |
| FallOut | The ratio of negative events wrongly categorized as positive to the total number of actual negative events (aka FalsePositiveRate). |
| FalseDiscoveryRate | The ratio of negative events wrongly categorized as positive to the total number of positive predictions. |
| FalseNegative | Number of positive values incorrectly predicted as negative (miss). |
| FalseNegativeRate | The ratio of positive events wrongly categorized as negative to the total number of actual positive events (aka MissRate). |
| FalseOmissionRate | The ratio of positive events wrongly categorized as negative to the total number of negative predictions. |
| FalsePositive | Number of negative values incorrectly predicted as positive (false alarm). |
| FalsePositiveRate | The ratio of negative events wrongly categorized as positive to the total number of actual negative events (aka FallOut). |
| MatthewsCorrelationCoefficient | The Matthews correlation coefficient, a correlation between the observed and predicted classifications. |
| MissRate | The ratio of positive events wrongly categorized as negative to the total number of actual positive events (aka FalseNegativeRate). |
| NegativeLikelihoodRatio | The probability of a miss (FalseNegativeRate) divided by the probability of a correct negative prediction (TrueNegativeRate). |
| NegativePredictiveValue | The proportion of negative predictions that are true negatives. |
| PositiveLikelihoodRatio | The probability of detection (TruePositiveRate) divided by the probability of false alarm (FalsePositiveRate). |
| PositivePredictiveValue | The proportion of positive predictions that are true positives (aka Precision). |
| Precision | The proportion of positive predictions that are true positives (aka PositivePredictiveValue). |
| Recall | The proportion of actual positives that are correctly identified as such (aka TruePositiveRate, Sensitivity, or probability of detection). |
| Selectivity | The proportion of actual negatives that are correctly identified as such (aka TrueNegativeRate, Specificity). |
| Sensitivity | The proportion of actual positives that are correctly identified as such (aka TruePositiveRate, Recall, or probability of detection). |
| Specificity | The proportion of actual negatives that are correctly identified as such (aka TrueNegativeRate, Selectivity). |
| TrueNegative | Number of correctly predicted negative values. |
| TrueNegativeRate | The proportion of actual negatives that are correctly identified as such (aka Specificity, Selectivity). |
| TruePositive | Number of correctly predicted positive values. |
| TruePositiveRate | The proportion of actual positives that are correctly identified as such (aka Sensitivity, Recall, or probability of detection). |
| TypeIError | Number of negative values incorrectly predicted as positive (false alarm); equivalent to FalsePositive. |
| TypeIIError | Number of positive values incorrectly predicted as negative (miss); equivalent to FalseNegative. |

All of these measures are defined as in http://en.wikipedia.org/wiki/Confusion_matrix.
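Most of these properties are simple ratios of the four cell counts (TruePositive, TrueNegative, FalsePositive, FalseNegative). A minimal sketch of the standard definitions, using plain variables rather than the FinMath API; the counts are made up for illustration:

```csharp
using System;

// Hypothetical counts for illustration: 100 observations in total.
double tp = 40, tn = 45, fp = 5, fn = 10;

double accuracy  = (tp + tn) / (tp + tn + fp + fn);               // 0.85
double precision = tp / (tp + fp);                                // PositivePredictiveValue
double recall    = tp / (tp + fn);                                // TruePositiveRate / Sensitivity
double f1        = 2 * precision * recall / (precision + recall); // harmonic mean
double fallOut   = fp / (fp + tn);                                // FalsePositiveRate
double tnr       = tn / (tn + fp);                                // Specificity / Selectivity
double lrPlus    = recall / fallOut;                              // PositiveLikelihoodRatio
double lrMinus   = (fn / (fn + tp)) / tnr;                        // NegativeLikelihoodRatio
double dor       = lrPlus / lrMinus;                              // DiagnosticOddsRatio
double mcc       = (tp * tn - fp * fn)
                 / Math.Sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn));

Console.WriteLine($"Accuracy={accuracy:F2} F1={f1:F3} MCC={mcc:F3}");
```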
Methods

| Name | Description |
|---|---|
| Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.) |
| GetHashCode | Serves as the default hash function. (Inherited from Object.) |
| GetType | Gets the Type of the current instance. (Inherited from Object.) |
| ToString | Returns a string that represents the current object. (Inherited from Object.) |
| Update(Boolean, Boolean) | Updates the confusion matrix from a single pair of observed values. |
| Update(IEnumerable<Boolean>, IEnumerable<Boolean>) | Updates the confusion matrix from sequences of actual and predicted values. |
| Update<T>(IEnumerable<Boolean>, IEnumerable<T>, Func<T, Boolean>) | Updates the confusion matrix from sequences of actual values and arbitrary predictions, converted to Boolean by a selector function. |
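A minimal usage sketch, assuming the signatures listed above; the argument order (actual values first, predictions second) and the 0.5 decision threshold are assumptions made for illustration, not confirmed by this reference:

```csharp
using System;
using System.Collections.Generic;
using FinMath.MachineLearning;

var cm = new BinaryConfusionMatrix();

// Single observation; the (actual, predicted) argument order is an assumption.
cm.Update(true, true);

// Whole sequences of actual outcomes and Boolean predictions.
IEnumerable<bool> actual    = new[] { true, false, true,  true };
IEnumerable<bool> predicted = new[] { true, false, false, true };
cm.Update(actual, predicted);

// Generic overload: raw scores mapped to Boolean by a selector
// (the 0.5 threshold is a hypothetical choice).
IEnumerable<double> scores = new[] { 0.9, 0.2, 0.4, 0.7 };
cm.Update(actual, scores, s => s >= 0.5);

Console.WriteLine($"Accuracy = {cm.Accuracy:F2}, F1 = {cm.F1:F2}");
```

Since the Update overloads accumulate observations into the matrix, the ratio properties presumably reflect all data seen so far, so the same instance can be refreshed incrementally as new predictions arrive.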