Fix benchmark #1155
Conversation
I now also added tests that cover the code used to obtain throughput from sweep.
Codacy generates one warning. This happens since in my tests …
@blaz-r thanks for addressing this. If you have to use a fixture across tests, it is advised to place it in `conftest.py`.
The thing is that …
I now added a conftest which contains the path to the file where the needed fixture is defined. I think this should be good now.
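(For reference, a conftest that points to the file defining a fixture usually does so via `pytest_plugins`; the file path and module path below are placeholders for illustration, not the ones used in this PR.)

```python
# Hypothetical tests/pre_merge/tools/benchmarking/conftest.py
# Reference the module that already defines the shared fixture instead of copying it.
# Note: recent pytest versions only allow pytest_plugins in the rootdir conftest,
# which is consistent with the failure reported in a later comment.
pytest_plugins = ["tests.helpers.dataset"]  # placeholder module path
```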
I am fine with these changes. Our entire test suite is going to undergo a refactor, so the conftest workaround is temporary anyway. Thanks again.
Well, as it turns out a conftest can't have `pytest_plugins` inside, so the tests are failing. From what I see this is not a good idea at all, so I think I should either make a conftest with this fixture inside the pre_merge module or just copy it into the conftest of sweep. I'm not sure which option would really be best?
I just put the slightly modified code into the conftest. This way it should work and pytest shouldn't have problems.
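A minimal sketch of the approach that ended up working, i.e. defining the (slightly adapted) fixture directly in the sweep tests' conftest; the path, fixture name, and body here are placeholders rather than the actual code from this PR:

```python
# Hypothetical tests/pre_merge/utils/sweep/conftest.py
from pathlib import Path

import pytest


@pytest.fixture(scope="session")
def dummy_config_path(tmp_path_factory) -> Path:
    """Create a minimal placeholder config once per session and return its path."""
    config_file = tmp_path_factory.mktemp("configs") / "config.yaml"
    config_file.write_text("dataset:\n  name: dummy\n")
    return config_file
```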
@samet-akcay all checks are now passing.
Description
There was a problem where `TorchInferencer` was changed to no longer take a config and a model, but rather a path to the exported model. This also affected the functions used to benchmark torch models. The required change was therefore to update the calling parameters of `get_torch_throughput`, and to add a step that exports the torch model before it is benchmarked.
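As a rough sketch of what the updated flow looks like, assuming an export helper, a directory layout, and a `get_torch_throughput` signature that are placeholders for illustration rather than the verified anomalib API:

```python
# Illustrative only: names and signatures are assumptions, not the anomalib API.
from pathlib import Path
from typing import Callable


def benchmark_torch_model(
    export_model: Callable[[Path], Path],        # assumed helper: exports the model, returns its path
    get_torch_throughput: Callable[..., float],  # throughput helper, assumed to now take a model path
    export_root: Path,
    test_dataset,
    device: str = "gpu",
) -> float:
    """Export the trained model to torch format, then benchmark it by path.

    Previously the throughput helper received the config and an in-memory model;
    after the TorchInferencer change it is assumed to receive the exported
    model's path instead.
    """
    model_path = export_model(export_root)
    return get_torch_throughput(model_path=model_path, test_dataset=test_dataset, device=device)
```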
Changes
Checklist