Can I control the FLOPs of the searched models? #3

Open
zyc4me opened this issue May 6, 2019 · 3 comments

zyc4me commented May 6, 2019

Thank you for the code. I have a problem: how can I control the FLOPs? For example, I want a model with input size 224×224 and FLOPs in the 60~80M range. Can you teach me how to set the search parameters? Thanks!

chenxin061 (Owner) commented:

This is currently not an option in our code. Actually, most differentiable NAS methods do not support constraining the FLOPs to a specific range. However, you can adjust hyper-parameters for the evaluation stage: for example, use smaller values for --layers and --initial_channels to bring the FLOPs into the 60~80M range at an input size of 224×224.
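For a rough sense of how those two knobs scale the cost, here is a back-of-the-envelope sketch in Python. Everything in it is an assumption for illustration: the DARTS-style schedule (channels double and resolution halves at 1/3 and 2/3 of the depth) and the uniform conv stack are not this repository's actual model.

```python
# Back-of-the-envelope estimate of how layers and the initial channel
# count scale convolution FLOPs. Purely illustrative: the network shape
# here is an assumed DARTS-style stack, not the repo's model definition.

def conv_flops(h, w, c_in, c_out, k=3):
    # multiply-accumulates of a k x k convolution over an h x w feature map
    return h * w * c_in * c_out * k * k

def estimate_flops(input_size=224, init_channels=16, layers=8):
    h = w = input_size
    c = init_channels
    total = conv_flops(h, w, 3, c)  # stem: 3 input channels -> c
    for i in range(layers):
        # assumption: channels double and resolution halves at 1/3 and
        # 2/3 of the depth, as in DARTS-style networks
        if i in (layers // 3, 2 * layers // 3):
            c *= 2
            h //= 2
            w //= 2
        total += conv_flops(h, w, c, c)
    return total

for ch in (8, 16, 36):
    print(f"init_channels={ch}: ~{estimate_flops(init_channels=ch) / 1e6:.0f}M FLOPs")
```

Shrinking either knob cuts the FLOPs roughly quadratically in channels and linearly in depth, so a small grid search over the two is usually enough to land in a target range.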

198808xc (Collaborator) commented May 6, 2019

A nice question!

Here I provide another answer. Note that when you perform a weighted average over a few candidate operators, you can also compute the expected FLOPs of the overall network. If you hope to reduce the FLOPs, you can add this number as an additional loss term. This does not break the differentiability of the entire framework, so the only thing you have to do is set a proper balancing parameter.
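As a concrete (but hypothetical) illustration of that idea, here is a minimal PyTorch sketch for a single mixed edge; the per-operation FLOPs table and the loss weight are made-up assumptions, not values from this repository:

```python
import torch
import torch.nn.functional as F

# Hypothetical per-candidate-operation FLOPs for one mixed edge,
# precomputed offline (the numbers are made up for illustration).
op_flops = torch.tensor([0.0, 1.2e6, 3.4e6, 9.1e6])

# alpha: the learnable architecture parameters for that edge, DARTS-style.
alpha = torch.randn(4, requires_grad=True)

def expected_flops(alpha, op_flops):
    # The softmax mixing weights dotted with the per-op FLOPs give the
    # expected cost of the edge; the result is differentiable w.r.t. alpha.
    weights = F.softmax(alpha, dim=-1)
    return (weights * op_flops).sum()

task_loss = torch.tensor(0.0)  # stand-in for the usual cross-entropy term
lam = 1e-8                     # the balancing parameter mentioned above
loss = task_loss + lam * expected_flops(alpha, op_flops)
loss.backward()                # gradients flow into alpha as usual
```

Summing the expected cost over all edges gives the expected FLOPs of the whole super-network, and lam trades off accuracy against cost.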

Hope this helps.

Catosine commented Jun 1, 2019

You may try the paper below:

You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization

where the authors propose an interesting way to constrain FLOPs and memory.
