
Running demo on custom images #347

Closed
turinaf opened this issue Jan 16, 2023 · 5 comments
turinaf commented Jan 16, 2023

Hi @layumi, thank you for the great repo.

I have run test.py and got extracted features saved as multi_query.mat and pytorch_result.mat
The following was the result of running test.py:

(screenshot: test.py output)

To use custom images (cropped detections, say from Yolo), do we need to have a big dataset and prepare it with prepare.py like we did for Market-1501? I was wondering if we can extract features and return the top matching images among the few detections we pass to the demo.
Thank you!


layumi commented Jan 26, 2023

Hi, @turinaf

  1. Yes. Usually, training the model on your own dataset achieves better performance, since the training data is closer to your test set.

  2. Yes. You need to extract the features on the fly, similar to what test.py does, and then compare and rank them.


turinaf commented Feb 1, 2023

Thank you @layumi ,
I encountered some problems, as my custom data (very small, fewer than 50 images) is not exactly like the Market-1501 dataset and doesn't have labels or camera numbers.
I modified the test.py file and saved the custom features to a .mat file with only gallery_f and query_f. I also wrote another demo script to view the result.
Now it works as I intended. Thank you!


nikky4D commented Feb 2, 2023

Would you have code to share? I would like to see how you set up the demo so I can do the same for my own test set.


turinaf commented Feb 3, 2023

First you need to modify test.py to save the result (the extracted features) the way you want. In my case, I organized my custom data as follows:

custom_data
    - gallery
         - 001
           - image0.jpg
    - query
        - 001
           - image0.jpg

The images don't have labels or camera IDs, so I saved only the gallery and query features by modifying the result lines in test.py as follows:

import scipy.io

result = {'gallery_f': gallery_feature.numpy(), 'query_f': query_feature.numpy()}
scipy.io.savemat('features.mat', result)

I removed the lines of code using query_cam and query_label, and likewise gallery_cam and gallery_label. My data also doesn't have class names, so I commented out the code that gets class_names.
I am not using multiple queries, so the multi-query code is commented out as well.

Another problem I encountered when running test.py, even with the Market-1501 dataset, was the use of options like use_NAS, use_swin, use_swinv2 and use_convnext, so I set them all to False:

opt.use_NAS = False
opt.use_swin = False
opt.use_swinv2 = False
opt.use_convnext = False

I used the ft_ResNet50 pretrained model; let me know if I am missing something about using those options.

For Demo

I copied demo.py, saved it as custom_demo.py, and modified it.
Load only the gallery and query features, dropping the gallery/query labels and camera IDs:

import scipy.io
import torch

result = scipy.io.loadmat('features.mat')
query_feature = torch.FloatTensor(result['query_f'])
gallery_feature = torch.FloatTensor(result['gallery_f'])

For the sort_img() function, I pass only qf and gf.
The modified function is as follows:

import numpy as np

def sort_img(qf, gf):
    query = qf.view(-1, 1)
    # similarity score: dot product of each gallery feature with the query
    score = torch.mm(gf, query)
    score = score.squeeze(1).cpu().numpy()
    # sort indices from most to least similar
    index = np.argsort(score)[::-1]
    return index

i = opts.query_index
index = sort_img(query_feature[i], gallery_feature)

For visualization, you can display the top 1, 2, or 3 matching images; I visualized the top 3.
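For example, a hedged sketch of such a top-k visualization with matplotlib (show_top_k and its arguments are illustrative names, not from the repo):

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen; drop this in a GUI session
import matplotlib.pyplot as plt
from PIL import Image

def show_top_k(query_path, gallery_paths, index, k=3, out='demo.png'):
    # One row: the query image followed by its top-k gallery matches
    fig, axes = plt.subplots(1, k + 1, figsize=(3 * (k + 1), 3))
    axes[0].imshow(Image.open(query_path))
    axes[0].set_title('query')
    for rank in range(k):
        axes[rank + 1].imshow(Image.open(gallery_paths[index[rank]]))
        axes[rank + 1].set_title('top-%d' % (rank + 1))
    for ax in axes:
        ax.axis('off')
    fig.savefig(out)
    plt.close(fig)
```

Here `index` is the array returned by sort_img(), and `gallery_paths` is the list of gallery image paths in the same order as the saved gallery features.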

I did this through trial and error. I hope this info helps, or at least gives you a hint for trying it on your own custom data. Modify the code to meet your needs.

Let me know if you have different suggestions.


a2082761 commented Feb 6, 2024

@turinaf
How do you run custom_demo.py? To run the demo file, we need an index, as below:

python demo.py --query_index 777

However, query_index refers to the gallery and query labels, I guess...
Please let me know how to call custom_demo.py without query_index.
Thanks in advance!!
