Access Compare Annotations
The Compare Annotations tool uses DeepView-Validator (Validator) to evaluate two annotation sets against each other. To access the Compare Annotations page, select the fifth icon in the left sidebar.
Filters
Ground Truth
Select one annotation set to serve as the ground truth for the comparison.
Target Set
Select one annotation set as the target to compare against the ground truth.
IoU Threshold
This is the validation IoU threshold used to determine true positives. Validator matches each prediction to the ground truth with the highest IoU. If a matched pair has the same label and an IoU greater than or equal to the set threshold, it is counted as a true positive. If the pair has the same label but an IoU below the threshold, it is counted as a localization false positive. Any match with mismatched labels is counted as a classification false positive. The default validation IoU threshold is 0.50.
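To make these matching rules concrete, here is a minimal sketch of how a single prediction-to-ground-truth match could be classified. The function names and the (x1, y1, x2, y2) box format are illustrative assumptions, not DeepView-Validator's actual API.

```python
# Minimal sketch of the matching rules above; not Validator's actual API.
# Boxes are assumed to be in (x1, y1, x2, y2) pixel coordinates.

def iou(box_a, box_b):
    """Intersection over Union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def classify_match(pred_label, gt_label, match_iou, iou_threshold=0.50):
    """Classify one prediction matched to a ground truth box."""
    if pred_label != gt_label:
        return "classification false positive"  # labels do not match
    if match_iou >= iou_threshold:
        return "true positive"                  # same label, enough overlap
    return "localization false positive"        # same label, poor overlap
```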
Confidence Threshold
This is the validation score threshold: only predictions with a score greater than or equal to the set threshold are validated. The default validation score threshold is 0.25.
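In code, this filter amounts to keeping only the predictions at or above the threshold; the `score` field name below is an assumption for illustration.

```python
# Illustrative confidence filtering; 'score' is an assumed field name.
def filter_predictions(predictions, score_threshold=0.25):
    """Keep only predictions whose score meets the threshold."""
    return [p for p in predictions if p["score"] >= score_threshold]
```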
Metric
This option specifies the metric used when matching detections to ground truth. By default, Validator uses 'iou', matching each detection to the ground truth with the highest calculated IoU. The other option is 'centerpoint', which measures the distance between the centers of the bounding boxes and matches the pairs with the smallest distances.
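As a rough sketch of the 'centerpoint' option, the distance between two box centers can be computed as below; the (x1, y1, x2, y2) box format is an assumption for illustration.

```python
import math

# Illustrative center-to-center distance for the 'centerpoint' metric.
def center_distance(box_a, box_b):
    """Euclidean distance between the centers of two (x1, y1, x2, y2) boxes."""
    cx_a, cy_a = (box_a[0] + box_a[2]) / 2, (box_a[1] + box_a[3]) / 2
    cx_b, cy_b = (box_b[0] + box_b[2]) / 2, (box_b[1] + box_b[3]) / 2
    return math.hypot(cx_a - cx_b, cy_a - cy_b)
```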
Ignore Boxes
This option ignores any detection or ground truth bounding boxes whose height or width is less than this value in pixels.
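A minimal sketch of this size filter, again assuming (x1, y1, x2, y2) boxes:

```python
# Illustrative size filter; drops boxes smaller than min_size in either dimension.
def drop_small_boxes(boxes, min_size):
    """Discard (x1, y1, x2, y2) boxes with width or height below min_size pixels."""
    return [
        b for b in boxes
        if (b[2] - b[0]) >= min_size and (b[3] - b[1]) >= min_size
    ]
```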
Generate the Report
Click 'Generate Report' on the right side of the banner to generate a report with the selected filters. A sample report is shown below.
Overview
This table summarizes the static fields of the ground truth set and the currently selected target set. The data updates only when a new ground truth or target set is selected.
Threshold Comparison
This table details the outputs that vary with the IoU and confidence thresholds. To see the evaluation report between the ground truth and target at a different IoU or confidence, adjust the slider for those filters. The new report is added as a new row under Threshold Comparison, as shown below. Select a row to view the Precision-Recall Curve and the Confusion Matrix for that specific report.