Complex Metrics¶
The following metrics accept only real-valued y_true.
If y_pred is real, they reduce to the corresponding TensorFlow implementations.
If y_pred is complex, it is cast to real as y_pred = (tf.math.real(y_pred) + tf.math.imag(y_pred)) / 2.
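The cast above can be illustrated with a minimal numpy sketch (the complex prediction values here are made up for illustration):

```python
import numpy as np

# Hypothetical complex predictions from a complex-valued network
y_pred = np.array([0.9 + 0.7j, 0.1 + 0.3j])

# The cast described above: average of the real and imaginary parts
y_pred_real = (np.real(y_pred) + np.imag(y_pred)) / 2

print(y_pred_real)  # [0.8 0.2]
```

After this cast the real-valued TensorFlow metric logic applies unchanged.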
Available metrics

- ComplexAccuracy: Complex implementation of Accuracy
- ComplexCategoricalAccuracy: Complex implementation of CategoricalAccuracy
- ComplexPrecision: Complex implementation of Precision
- ComplexRecall: Complex implementation of Recall
- ComplexCohenKappa: Complex implementation of CohenKappa
- ComplexF1Score: Complex implementation of F1Score

update_state(self, y_true, y_pred, sample_weight=None, ignore_unlabeled=True)¶

Parameters:

- y_true – Ground truth label values.
- y_pred – The predicted probability values.
- sample_weight – Optional. sample_weight acts as a coefficient for the metric. If a scalar is provided, the metric is simply scaled by the given value. If sample_weight is a tensor of size [batch_size], the metric for each sample of the batch is rescaled by the corresponding element in the sample_weight vector. If the shape of sample_weight is [batch_size, d0, .. dN-1] (or can be broadcast to this shape), each metric element of y_pred is scaled by the corresponding value of sample_weight. (Note on dN-1: all metric functions reduce by one dimension, usually the last axis (-1).)
- ignore_unlabeled – Default True. Ignore cases where the label vector is all zeros. Internally, the sample_weight parameter is used to mask unlabeled data, so enabling this option supersedes the sample_weight parameter.
Warning

ignore_unlabeled takes precedence over sample_weight, so make sure to set it to False when using sample_weight.
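A minimal numpy sketch of what ignore_unlabeled does conceptually (the label and prediction values here are invented for illustration, and this is an emulation of the masking idea, not the library's internal code):

```python
import numpy as np

# Hypothetical one-hot labels; the second sample is unlabeled (all zeros)
y_true = np.array([[1., 0.], [0., 0.], [0., 1.]])
y_pred = np.array([[0.8, 0.2], [0.6, 0.4], [0.3, 0.7]])

# Emulate ignore_unlabeled=True: keep only samples whose label row is non-zero
mask = np.any(y_true != 0, axis=-1)
correct = np.argmax(y_true[mask], axis=-1) == np.argmax(y_pred[mask], axis=-1)
accuracy = correct.mean()
print(accuracy)  # 1.0 -- the unlabeled sample is excluded from the metric
```

Without the mask, the unlabeled row would be compared against an arbitrary argmax and pollute the result.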
Complex Average Accuracy¶
Average Accuracy (AA) is defined as the average of the individual class accuracies. It is used for unbalanced datasets to reveal the actual accuracy per class.
For example:
# Unbalanced dataset with 90% cases of one class and 10% of the other
y_true = np.array([[1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [0., 1.] ])
# Dummy classifier has learned to just predict always the first class
y_pred = np.array([ [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.], [1., 0.] ])
m = ComplexCategoricalAccuracy()
m.update_state(y_true, y_pred)
print(m.result().numpy()) # The dummy classifier achieves a high accuracy of 90%
>>> 0.9
m = ComplexAverageAccuracy()
m.update_state(y_true, y_pred)
print(m.result().numpy()) # But an average accuracy of just 50%
>>> 0.5
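The 0.5 result can be verified by hand: compute each class's accuracy separately, then average. A numpy sketch of that computation on the same data:

```python
import numpy as np

# Same unbalanced example: 9 samples of class 0, 1 of class 1,
# and a dummy classifier that always predicts class 0
y_true = np.array([[1., 0.]] * 9 + [[0., 1.]])
y_pred = np.array([[1., 0.]] * 10)

true_cls = np.argmax(y_true, axis=-1)
pred_cls = np.argmax(y_pred, axis=-1)

# Per-class accuracy: class 0 -> 1.0, class 1 -> 0.0
per_class_acc = [np.mean(pred_cls[true_cls == c] == c)
                 for c in np.unique(true_cls)]
print(np.mean(per_class_acc))  # 0.5
```

Averaging over classes rather than samples removes the advantage the majority class gives the dummy classifier.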