libauc.metrics

libauc.metrics.metrics

auc_prc_score(y_true, y_pred, reduction='mean', **kwargs)[source]

Evaluation function for AUPRC (Area Under the Precision-Recall Curve).

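A minimal usage sketch (the toy arrays are illustrative; this assumes 1-D array-like inputs of binary labels and real-valued scores, with the function imported from libauc.metrics):

    import numpy as np
    from libauc.metrics import auc_prc_score

    # Imbalanced toy data, where AUPRC is typically more informative than AUROC.
    y_true = np.array([0, 0, 0, 0, 0, 0, 1, 1])
    y_pred = np.array([0.1, 0.3, 0.2, 0.4, 0.3, 0.2, 0.8, 0.6])
    print(auc_prc_score(y_true, y_pred))  # scalar AUPRC
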
auc_roc_score(y_true, y_pred, reduction='mean', **kwargs)[source]

Evaluation function for AUROC (Area Under the ROC Curve).

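The same call pattern applies to AUROC; a minimal sketch (the multi-task reduction behavior noted in the comment is an assumption based on the signature above):

    import numpy as np
    from libauc.metrics import auc_roc_score

    y_true = np.array([0, 1, 0, 1, 1, 0])
    y_pred = np.array([0.2, 0.7, 0.4, 0.9, 0.6, 0.1])
    print(auc_roc_score(y_true, y_pred))  # scalar AUROC
    # For 2-D (multi-task) inputs, reduction='mean' presumably averages the
    # per-task AUROC values (assumption; not spelled out in this listing).
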
evaluator(y_true, y_pred, metrics=['auroc', 'auprc', 'pauroc'], return_str=False, format='%.4f(%s)', **kwargs)[source]

Evaluation function that computes several metrics (by default AUROC, AUPRC, and pAUROC) in a single call; with return_str=True the results are returned as a string built from the format template.

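A sketch of computing several metrics in one call (assuming evaluator is importable from libauc.metrics like the other functions here; the exact return structure is not documented in this listing, so the result is simply printed):

    import numpy as np
    from libauc.metrics import evaluator

    y_true = np.array([0, 1, 0, 1, 1, 0])
    y_pred = np.array([0.2, 0.7, 0.4, 0.9, 0.6, 0.1])
    # Request a subset of the default metric names; return_str=True returns
    # the results as a string built from the `format` template above.
    print(evaluator(y_true, y_pred, metrics=['auroc', 'auprc'], return_str=True))
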
pauc_roc_score(y_true, y_pred, max_fpr=1.0, min_tpr=0.0, reduction='mean', **kwargs)[source]

Evaluation function for pAUROC (partial AUROC), restricted to the region defined by max_fpr and min_tpr.

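A sketch of evaluating partial AUROC over a restricted false-positive-rate range (the particular max_fpr value is illustrative):

    import numpy as np
    from libauc.metrics import pauc_roc_score

    y_true = np.array([0, 1, 0, 1, 1, 0, 0, 1])
    y_pred = np.array([0.2, 0.7, 0.4, 0.9, 0.6, 0.1, 0.3, 0.8])
    # Partial AUROC restricted to the region with FPR <= 0.3.
    print(pauc_roc_score(y_true, y_pred, max_fpr=0.3))
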
libauc.metrics.metrics_k

ap_at_k(y_true, y_pred, k=10)[source]

Evaluation function for Average Precision at K (AP@K).

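A minimal sketch (the expected array shapes are not spelled out in this listing; this assumes a single ranked list given as 1-D arrays of binary relevance labels and ranking scores, imported via the full module path shown above):

    import numpy as np
    from libauc.metrics.metrics_k import ap_at_k

    y_true = np.array([1, 0, 1, 0, 0, 1])              # binary relevance labels
    y_pred = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4])  # ranking scores
    print(ap_at_k(y_true, y_pred, k=3))                 # Average Precision over the top-3 items
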
check_array_shape(array, shape)[source]
check_array_type(array)[source]
map_at_k(y_true, y_pred, k=10)[source]

Evaluation function for Mean Average Precision at K (MAP@K).

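MAP@K is the mean of AP@K across queries (or users); a sketch assuming 2-D inputs with one row per query and one column per candidate item (the shape convention is an assumption, not documented in this listing):

    import numpy as np
    from libauc.metrics.metrics_k import map_at_k

    y_true = np.array([[1, 0, 1, 0],
                       [0, 1, 0, 0]])
    y_pred = np.array([[0.9, 0.2, 0.8, 0.1],
                       [0.3, 0.7, 0.5, 0.2]])
    print(map_at_k(y_true, y_pred, k=2))  # mean of the per-row AP@2 values
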
ndcg_at_k(y_true, y_pred, k=5)[source]

Evaluation function for Normalized Discounted Cumulative Gain at K (NDCG@K).

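A minimal sketch with binary relevance labels for a single ranked list (the expected shapes, and whether graded relevance labels are supported, are not documented in this listing and are assumptions here):

    import numpy as np
    from libauc.metrics.metrics_k import ndcg_at_k

    y_true = np.array([1, 0, 1, 0, 1])            # relevance labels
    y_pred = np.array([0.9, 0.6, 0.8, 0.3, 0.1])  # ranking scores
    print(ndcg_at_k(y_true, y_pred, k=5))          # NDCG over the top-5 items
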
precision_and_recall_at_k(y_true, y_pred, k, pos_label=1, **kwargs)[source]

Evaluation function for Precision@K and Recall@K computed together; a combined usage sketch follows the Recall@K entry below.

precision_at_k(y_true, y_pred, k, pos_label=1, **kwargs)[source]

Evaluation function for Precision@K.

recall_at_k(y_true, y_pred, k, pos_label=1, **kwargs)[source]

Evaluation function for Recall@K.

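A combined sketch for the three Precision/Recall@K functions above (1-D arrays of binary labels and scores are an assumption; pos_label keeps its default of 1):

    import numpy as np
    from libauc.metrics.metrics_k import (precision_at_k, recall_at_k,
                                          precision_and_recall_at_k)

    y_true = np.array([1, 0, 1, 0, 0, 1])
    y_pred = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4])
    print(precision_at_k(y_true, y_pred, k=3))             # fraction of the top-3 items that are positive
    print(recall_at_k(y_true, y_pred, k=3))                # fraction of all positives ranked in the top-3
    print(precision_and_recall_at_k(y_true, y_pred, k=3))  # both values from a single call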