R/network_evaluation.R
calculate_metrics.Rd
Calculates comprehensive performance metrics for evaluating a predicted network structure against a ground truth network, including classification and precision-recall metrics (AUROC, AUPRC, precision, recall, F1) and edge-overlap metrics (Set Intersection, Jaccard Index).
calculate_metrics(
network_table,
ground_truth,
type = "all",
plot = FALSE,
line_color = "#1563cc",
line_width = 1
)
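For reference, a minimal sketch of the expected inputs, using small hand-built data frames (the gene names and edge weights are invented for illustration; the column names follow the format documented below):

# Hypothetical toy predicted network and ground truth.
predicted <- data.frame(
  regulator = c("g1", "g1", "g2"),
  target = c("g2", "g3", "g3"),
  weight = c(0.9, 0.4, 0.7)
)
truth <- data.frame(
  regulator = c("g1", "g2"),
  target = c("g2", "g3"),
  weight = c(1, 1)
)
calculate_metrics(predicted, truth, type = "all")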
network_table: A data frame of the predicted network structure, containing three columns:
  regulator - Source nodes of the network edges
  target - Target nodes of the network edges
  weight - Edge weights representing prediction confidence
ground_truth: A data frame of the ground truth network, in the same format as network_table.
type: The type of metric to return, default is "all". This can take any of the following choices:
  all - Returns all available metrics
  auroc - Area Under the ROC Curve, evaluating overall classification ability
  auprc - Area Under the Precision-Recall Curve, focusing on positive class prediction
  precision - Proportion of correct predictions among positive predictions
  recall - Proportion of actual positives correctly identified
  f1 - Harmonic mean of precision and recall
  si - Set Intersection, the number of correctly predicted edges (see the sketch after the argument descriptions)
  ji - Jaccard Index, measuring the overlap between the predicted and true networks (see the same sketch)
plot: Logical value, default is FALSE, controlling whether to generate visualization plots. If TRUE, generates:
  ROC curve plot for AUROC evaluation
  Precision-Recall curve plot for AUPRC evaluation
  Confusion matrix heatmap for classification results
  Network comparison plot showing edge overlap
  Metrics summary bar plot
line_color: The color for plot lines, default is "#1563cc".
line_width: The width for plot lines, default is 1.
If type = "all": a data frame containing all metrics.
If a specific type is requested: a single numeric value for that metric.
If plot = TRUE: the relevant visualization plots are displayed in addition to the returned value.
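When only one value is needed, the type argument returns it directly as a numeric; a minimal sketch, assuming the toy data frames defined earlier:

# Single numeric value for the requested metric.
auroc_value <- calculate_metrics(predicted, truth, type = "auroc")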
data("example_matrix")
data("example_ground_truth")
inferCSN(example_matrix) |>
calculate_metrics(example_ground_truth, plot = TRUE)
#> ✔ Running for <dense matrix>.
#> ✔ Checking input parameters.
#> ✔ Using L0 sparse regression model.
#> ✔ Using 1 core.
#> ✔ Run done.
#> AUROC AUPRC ACC Precision Recall F1 JI SI
#> 1 0.963 0.495 0.944 1 0.514 0.679 0.5 18
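The data frame returned with type = "all" can also be used programmatically, for example to pull out a single metric. A minimal sketch, assuming the example data loaded above and that the returned columns carry the names shown in the printed output:

metrics <- inferCSN(example_matrix) |>
  calculate_metrics(example_ground_truth)
# Extract one metric by column name, e.g. 0.963 in the run above.
metrics$AUROC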