Update onednn #15632
Conversation
@EgorDuplensky We have two options here: I can directly merge this on top of oneDNN 2.7, or you can include this change in your oneDNN 3.0 migration PR. Up to you.
I will correct this on the oneDNN 3.0 migration branch by fixing the original commit then.
@dmitry-gorokhov Done
@dmitry-gorokhov @RICKIE777
Thanks @dmitry-gorokhov and @EgorDuplensky.
The idea is to add a single_layer_test instance which reproduces the issue you observe with the model. |
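A minimal sketch of what such an instance could look like, assuming the OpenVINO `LayerTestsDefinitions::ConvolutionLayerTest` harness and its usual parameter set; the kernel, shapes, and channel counts below are hypothetical placeholders and would need to be replaced with the parameters of the convolution that misbehaves in the model. Note also that to actually reach the int8 `jit_avx512_core_x8s8s32` kernel the convolution would have to be quantized (e.g. wrapped with FakeQuantize or run through an int8-specific test class); that part is not shown here.

```cpp
// Rough sketch only: parameter values are assumptions, not taken from the failing model.
#include "single_layer_tests/convolution.hpp"
#include "common_test_utils/test_constants.hpp"

using namespace LayerTestsDefinitions;

namespace {

// Convolution-specific parameters (kernel, strides, pads, dilations, output channels, pad type).
const auto convParams = ::testing::Combine(
        ::testing::Values(std::vector<size_t>{3, 3}),       // kernel (assumed)
        ::testing::Values(std::vector<size_t>{1, 1}),       // strides (assumed)
        ::testing::Values(std::vector<ptrdiff_t>{1, 1}),    // pads begin (assumed)
        ::testing::Values(std::vector<ptrdiff_t>{1, 1}),    // pads end (assumed)
        ::testing::Values(std::vector<size_t>{1, 1}),       // dilations (assumed)
        ::testing::Values(size_t{64}),                      // output channels (assumed)
        ::testing::Values(ngraph::op::PadType::EXPLICIT));

INSTANTIATE_TEST_SUITE_P(smoke_Conv_Repro, ConvolutionLayerTest,
        ::testing::Combine(
                convParams,
                ::testing::Values(InferenceEngine::Precision::FP32),         // net precision
                ::testing::Values(InferenceEngine::Precision::UNSPECIFIED),  // input precision
                ::testing::Values(InferenceEngine::Precision::UNSPECIFIED),  // output precision
                ::testing::Values(InferenceEngine::Layout::ANY),
                ::testing::Values(InferenceEngine::Layout::ANY),
                ::testing::Values(std::vector<size_t>{1, 64, 56, 56}),       // input shape (assumed)
                ::testing::Values(CommonTestUtils::DEVICE_CPU)),
        ConvolutionLayerTest::getTestCaseName);

}  // namespace
```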
Thanks @EgorDuplensky. I have a question about it. The kernel is jit_int8:avx512_core_vnni. The modified file is jit_avx512_core_x8s8s32_conv_kernel.cpp. |
The modification has been merged into the onednn v3.0_for_ie_master_migration branch. The oneDNN team has not yet merged int8 single-layer tests. For the test case of this PR, I have submitted a ticket: https://jira.devtools.intel.com/browse/MFDNN-9592. So this PR can be closed.