Doubts About the Sparsity of Detectors #38

Closed
noob-dqt opened this issue Dec 2, 2024 · 0 comments

noob-dqt commented Dec 2, 2024

Hello author, thank you for your excellent work. I have a question about the claimed "sparsity". I noticed in the source code that in the detection head, sparse features are converted into dense features, specifically `x_flatten = x.dense().view(batch_size, x.features.shape[1], -1)  # [B, C, H*W]`, and this `x_flatten` is also used in the decoder. If dense BEV feature maps are involved, does this still qualify as sparse? Which specific part of the pipeline does the term "sparse detection head" in the paper refer to? Does it simply mean query-based detection in the DETR paradigm?
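For context, here is a minimal sketch of what that line does, assuming the repository uses an spconv-style `SparseConvTensor` (the `.dense()` call matches that API). All names, shapes, and values below are illustrative, not taken from the actual codebase:

```python
import torch
import spconv.pytorch as spconv

batch_size = 2
num_active = 5              # number of non-empty voxels/pillars
channels = 64
spatial_shape = [128, 128]  # BEV grid (H, W)

# Sparse format: features [N, C] plus indices [N, 3] = (batch_idx, y, x).
# spconv generally expects int32 indices (and CUDA tensors for conv ops;
# add .cuda() if your build requires it).
features = torch.randn(num_active, channels)
yx = torch.randint(0, 128, (num_active, 2), dtype=torch.int32)
batch_idx = torch.randint(0, batch_size, (num_active, 1), dtype=torch.int32)
indices = torch.cat([batch_idx, yx], dim=1)

x = spconv.SparseConvTensor(features, indices, spatial_shape, batch_size)

# .dense() scatters the N active features into a dense [B, C, H, W] grid,
# zero-filling every empty BEV cell, so everything downstream of this line
# operates on a fully dense map regardless of how sparse the input was.
x_flatten = x.dense().view(batch_size, x.features.shape[1], -1)  # [B, C, H*W]
print(x_flatten.shape)  # torch.Size([2, 64, 16384])
```

That zero-filling step is exactly what the question is getting at: after `.dense()`, the per-cell sparsity of the voxel features is no longer exploited by the decoder.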

@noob-dqt noob-dqt closed this as completed Jan 7, 2025