In tools/train.py, line 175, you create a cfg_u for the unlabeled files. This function uses X_U and all_anns.
Now, all_anns contains the annotation files for both the labeled set and the unlabeled set. Why do I need the annotations for the unlabeled set? Isn't this against the idea of active learning, where you dynamically create labels for the unlabeled set and use them to enhance your model?
You can have a look at the loss functions on lines 479 and 565 of mmdet/models/dense_head/MIAOD_head.py, which belong to the re-weighting step and the instance-uncertainty minimizing/maximizing steps. In these two steps, if y_loc_img is negative, the corresponding prediction of the model is set to 0, which prevents it from participating in network training. A negative y_loc_img means that the corresponding data comes from the unlabeled set X_U; see lines 74 and 92 in epoch_based_runner.py. A normal y_loc_img, i.e. a bounding-box annotation, is always positive in the original dataset. Therefore, we did not use any annotation of the unlabeled set X_U in model training.
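To illustrate the masking described above, here is a minimal sketch (not the actual MIAOD_head.py code) of how a loss term can be zeroed out whenever the target is flagged negative, so samples from X_U contribute nothing to training; the function name and the plain-list inputs are hypothetical simplifications:

```python
def masked_l1_loss(pred, y_loc_img):
    """Sketch of the masking trick: any target flagged with a negative
    value (i.e., data drawn from the unlabeled set X_U) is skipped, so
    its prediction contributes zero loss and zero gradient."""
    total, count = 0.0, 0
    for p, y in zip(pred, y_loc_img):
        if y < 0:          # unlabeled sample: masked out of the loss
            continue
        total += abs(p - y)  # L1 term for a labeled sample
        count += 1
    return total / max(count, 1)  # average over labeled samples only

# Only the second pair (2.0 vs. 1.0) is labeled, so the loss is 1.0:
loss = masked_l1_loss([1.0, 2.0], [-1.0, 1.0])
```

Because the masked terms never enter the sum, no annotation from X_U can influence the gradients, matching the behavior described for the re-weighting and uncertainty steps.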