Update to dpnp 0.9 #599

Merged · 13 commits · Oct 28, 2021
2 changes: 1 addition & 1 deletion README.md
@@ -20,7 +20,7 @@ https://intelpython.github.io/dpnp/

 * numba 0.54.* or 0.55.*
 * dpctl 0.11.*
-* dpnp 0.8.* (optional)
+* dpnp 0.9.* (optional)
 * llvm-spirv 11.* (SPIRV generation from LLVM IR)
 * spirv-tools
 * packaging
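For quick reference, a conda environment matching the updated pins might be created along the lines below; the environment name and the channel list are assumptions for illustration, not part of this change.

conda create -n numba-dppy-env numba=0.55 dpctl=0.11 dpnp=0.9 llvm-spirv=11 spirv-tools packaging -c intel -c conda-forge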
4 changes: 2 additions & 2 deletions conda-recipe/meta.yaml
@@ -20,15 +20,15 @@ requirements:
     - cython
     - numba 0.54*|0.55*
     - dpctl 0.10*|0.11*
-    - dpnp >=0.8* # [linux]
+    - dpnp 0.8*|0.9 # [linux]
     - wheel
   run:
     - python
     - numba 0.54*|0.55*
     - dpctl 0.10*|0.11*
     - spirv-tools
     - llvm-spirv 11.*
-    - dpnp >=0.8* # [linux]
+    - dpnp 0.8*|0.9* # [linux]
     - packaging

 test:
4 changes: 0 additions & 4 deletions docs/user_guides/debugging/debugging_environment.rst
@@ -18,10 +18,6 @@ Configure debugging environment
 conda create numba-dppy-dev numba-dppy
 conda activate numba-dppy-dev

-.. note::
-
-    Debugging features were tested with the following packages: ``numba-dppy=0.14``, ``dpctl=0.8``, ``numba=0.53``.
-
 3) Activate NEO drivers (optional).

 If you want to use the local NEO driver, activate the variables for it. See the :ref:`NEO-driver`.
2 changes: 1 addition & 1 deletion environment.yml
@@ -13,7 +13,7 @@ dependencies:
   - cython
   - numba 0.55*
   - dpctl 0.11*
-  - dpnp 0.8*
+  - dpnp 0.9*
   - spirv-tools
   # - llvm-spirv 11.*
   - packaging
1 change: 1 addition & 0 deletions numba_dppy/__init__.py
@@ -517,6 +517,7 @@ def main():

 import numba.testing

+from numba_dppy.interop import asarray
 from numba_dppy.retarget import offload_to_sycl_device

 from . import config
41 changes: 41 additions & 0 deletions numba_dppy/interop.py
@@ -0,0 +1,41 @@
# Copyright 2021 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Support for interoperability."""

import dpctl.tensor as dpt


def asarray(container):
    """Convert a container supported for interoperability into a numba-dppy container.

    Currently uses dpctl.tensor.asarray().
    """
    try:
        return dpt.asarray(container)
    except Exception:
        pass

    # Workaround for dpnp_array if dpctl asarray() does not support it.
    try:
        from dpnp.dpnp_array import dpnp_array

        if isinstance(container, dpnp_array) and hasattr(container, "_array_obj"):
            import warnings

            warnings.warn("asarray() uses internals from dpnp.")
            return container._array_obj
    except Exception:
        pass

    raise NotImplementedError(
        "dpctl asarray() does not support " + str(type(container))
    )
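A minimal usage sketch for the new helper, assuming dpnp is installed and a SYCL device is available (the array size and dtype are illustrative only):

import dpnp
import numpy as np

import numba_dppy as dppy

# asarray() is re-exported at the package level (see the __init__.py change above).
# For a dpnp array it returns dpctl.tensor.asarray(...) when supported, falling back
# to dpnp's underlying array object, so the result can be passed to numba-dppy kernels.
a = dpnp.arange(16, dtype=np.float32)
a_converted = dppy.asarray(a)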
6 changes: 3 additions & 3 deletions numba_dppy/tests/integration/test_dpnp_interop.py
@@ -89,8 +89,8 @@ def data_parallel_sum(a, b, c):
     global_size = 1021

     with dppy.offload_to_sycl_device(offload_device):
-        a = dpnp.arange(global_size, dtype=dtype)
-        b = dpnp.arange(global_size, dtype=dtype)
-        c = dpnp.ones_like(a)
+        a = dppy.asarray(dpnp.arange(global_size, dtype=dtype))
+        b = dppy.asarray(dpnp.arange(global_size, dtype=dtype))
+        c = dppy.asarray(dpnp.ones_like(a))

         data_parallel_sum[global_size, dppy.DEFAULT_LOCAL_SIZE](a, b, c)
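For context, the pattern this test now exercises looks roughly like the sketch below. The kernel body, dtype, and device filter string are assumptions, not copied from the test module:

import dpnp
import numpy as np

import numba_dppy as dppy


@dppy.kernel
def data_parallel_sum(a, b, c):
    i = dppy.get_global_id(0)
    c[i] = a[i] + b[i]


global_size = 1021

# "opencl:cpu:0" is a placeholder filter string; use any SYCL device available locally.
with dppy.offload_to_sycl_device("opencl:cpu:0"):
    # Wrap the dpnp arrays with dppy.asarray() before passing them to the kernel,
    # as the updated test does above.
    a = dppy.asarray(dpnp.arange(global_size, dtype=np.float32))
    b = dppy.asarray(dpnp.arange(global_size, dtype=np.float32))
    c = dppy.asarray(dpnp.ones_like(a))

    data_parallel_sum[global_size, dppy.DEFAULT_LOCAL_SIZE](a, b, c)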