Add new blas extension and update dpnp.matmul func #1616
Merged
Conversation
View rendered docs @ https://intelpython.github.io/dpnp/index.html
vtavana force-pushed the update_matmul branch from 8b5f3b6 to b8f7f00 on November 19, 2023 at 19:53
npolina4 requested changes on Dec 7, 2023
The result does not match NumPy's:
import numpy, dpnp

def init():
    # Build a lower-triangular float32 test matrix on the host
    N = 5
    A = numpy.empty((N, N), dtype=numpy.float32)
    for i in range(N):
        A[i, : i + 1] = numpy.fromfunction(lambda j: (-j % N) / N + 1, (i + 1,))
        A[i, i + 1 :] = 0.0
    return A

def kernel(A):
    # In-place update; each step is a vector-vector matmul (A[i, :j] @ A[:j, j])
    for i in range(A.shape[0]):
        for j in range(i):
            A[i, j] -= A[i, :j] @ A[:j, j]

a = init()
b = dpnp.asarray(a)
kernel(a)
kernel(b)
print(a - b.asnumpy())
Output:

[[0.  0.  0.  0.  0. ]
 [0.  0.  0.  0.  0. ]
 [0.  0.  0.  0.  0. ]
 [0.  0.  0.  0.  0. ]
 [0.  0.  0.  1.6 0. ]]
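For reference, a quick pass/fail check of the same reproducer (rather than eyeballing the printed difference) could look like the sketch below; dpnp.asnumpy and numpy.allclose are standard APIs, and the tolerance is only illustrative:

import numpy, dpnp

# after running kernel(a) and kernel(b) as above
assert numpy.allclose(a, dpnp.asnumpy(b), atol=1e-6), "dpnp result diverges from NumPy"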
npolina4 approved these changes on Dec 14, 2023
antonwolfy reviewed on Dec 22, 2023
antonwolfy reviewed on Jan 11, 2024
antonwolfy approved these changes on Jan 12, 2024
Thank you @vtavana !
github-actions bot added a commit that referenced this pull request on Jan 13, 2024
* Add new blas extension and update matmul impl
* Add support for N-D array add N-dimension
* support more special cases + add new tests
* fix random behavior on cpu
* correct dtypes + support more keywords
* add strided support
* check input arrays
* address comments - first round
* address comments - second round
* address comments - third round
* fix pre-commit
* improve test coverage
* address comments
* update _gemm_res_dtype func
* fix a test for result_type
* fix minor issues
* skip tests for matmul

Co-authored-by: Vahid Tavanashad <[email protected]>
Co-authored-by: vtavana <[email protected]>
Co-authored-by: Anton <[email protected]>

f95ceb9
This PR updates the dpnp.matmul implementation with a gemm function call from oneapi::mkl::blas, added to dpnp.backend.extensions.blas. In addition, a gemm_batch function is added; it is the batched version of gemm, performing multiple gemm operations in a single call. The implementation is written as a pybind11 extension on top of the required BLAS functions.