b580 support on linux #764

Closed · SHoogstad opened this issue Jan 4, 2025 · 6 comments
SHoogstad commented Jan 4, 2025

I am running ComfyUI on an Unraid server and just got a B580 to put in there. I saw that there is already support for it on Windows, so my question is: what is the expected ETA for Linux support?

ZailiWang self-assigned this Jan 6, 2025

ZailiWang (Contributor) commented

Hi, Linux support for Battlemage requires some driver-level updates, so we don't have an ETA so far. Please stay tuned; once there is anything deliverable to share, we will let you know. Thanks.

Qubitium commented Jan 8, 2025

@ZailiWang Is the IPEX init_quant_linear slowdown issue related? We are able to use the B580 fine on Linux for XPU model inference, but the first pass is very slow due to kernel compilation.

#767

ZailiWang (Contributor) commented

Hi @Qubitium, yes, the slowness of the first pass should be due to the lack of AOT compilation. Please try building from source with the AOT parameter set to 'bmg'. But please note that since it is not officially supported, you might encounter functionality or performance issues.
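
For reference, a minimal sketch of such a source build, assuming the IPEX XPU build picks up the AOT device list from the USE_AOT_DEVLIST environment variable and accepts 'bmg' as a device name (both are assumptions; check the source-build documentation of the release you compile):

# Sketch only: build intel-extension-for-pytorch from source with AOT for Battlemage.
# USE_AOT_DEVLIST and the 'bmg' device name are assumptions; verify them against
# the build docs of the IPEX version you are building.
source /opt/intel/oneapi/setvars.sh                 # bring icpx and ocloc onto PATH
git clone --recursive https://github.com/intel/intel-extension-for-pytorch.git
cd intel-extension-for-pytorch
export USE_AOT_DEVLIST='bmg'                        # ahead-of-time target device list
python setup.py bdist_wheel && pip install dist/*.whl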

Qubitium commented Jan 10, 2025

@ZailiWang The Intel documentation is broken: the build doesn't compile because the AOT compiler (ocloc) is not found, and the pages that should cover installing it have broken links:

https://www.intel.com/content/www/us/en/docs/dpcpp-cpp-compiler/developer-guide-reference/2023-0/ahead-of-time-compilation.html <-- these instructions do not work; the build fails due to the missing ocloc AOT compiler.

https://www.intel.com/content/www/us/en/develop/documentation/installation-guide-for-intel-oneapi-toolkits-linux/top/installation/install-opencl-offline-compiler-ocloc.html <-- this link for the ocloc compiler no longer exists, and our build stage is failing.

[31/742] Building SYCL device link file test_sycl/CMakeFiles/test_..._build_standalone.dir/test_sycl_build_standalone_sycl_device_obj.
FAILED: test_sycl/CMakeFiles/test_sycl_build_standalone.dir/test_sycl_build_standalone_sycl_device_obj.o /workspace/pytorch/build/test_sycl/CMakeFiles/test_sycl_build_standalone.dir/test_sycl_build_standalone_sycl_device_obj.o 
cd /workspace/pytorch/build/test_sycl && /opt/intel/oneapi/compiler/2025.0/bin/icpx -fPIC -fsycl -fsycl-targets=spir64_gen,spir64 -fno-sycl-unnamed-lambda -sycl-std=2020 -fhonor-nans -fhonor-infinities -fno-associative-math -fno-approx-func -Wno-absolute-value -no-ftz -D__INTEL_PREVIEW_BREAKING_CHANGES -D_GLIBCXX_USE_CXX11_ABI=1 -fsycl-fp64-conv-emu -fsycl-max-parallel-link-jobs=16 -fsycl-targets=spir64_gen,spir64 -fsycl-link /workspace/pytorch/build/test_sycl/CMakeFiles/test_sycl_build_standalone.dir//./test_sycl_build_standalone_generated_simple_kernel.cpp.o -Xs "-device\ pvc,xe-lpg,ats-m150\ -options\ '\ -cl-poison-unsupported-fp64-kernels\ -cl-intel-enable-auto-large-GRF-mode\ -cl-fp32-correctly-rounded-divide-sqrt'" -o /workspace/pytorch/build/test_sycl/CMakeFiles/test_sycl_build_standalone.dir/./test_sycl_build_standalone_sycl_device_obj.o
icpx: warning: ocloc tool could not be found and is required for AOT compilation. See: https://www.intel.com/content/www/us/en/develop/documentation/oneapi-dpcpp-cpp-compiler-dev-guide-and-reference/top/compilation/ahead-of-time-compilation.html for more information [-Waot-tool-not-found]
llvm-foreach: No such file or directory
icpx: error: gen compiler command failed with exit code 1 (use -v to see invocation)
ninja: build stopped: subcommand failed.
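
For what it's worth, ocloc ships with the Intel GPU compute runtime rather than with the compiler itself, so a oneAPI install alone may not provide it. A hedged sketch of getting it onto PATH on Ubuntu, assuming the Intel graphics APT repository is configured and the package is named intel-ocloc (package and repo names are assumptions and vary by distro/release):

# Sketch: make ocloc available for AOT compilation (Ubuntu, Intel graphics repo assumed).
# 'intel-ocloc' is an assumed package name; on some setups ocloc is bundled with the
# compute-runtime packages instead.
sudo apt-get update
sudo apt-get install -y intel-ocloc
which ocloc && ocloc --help | head              # sanity check before re-running the build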

Qubitium commented

@ZailiWang We even tried the official oneAPI Docker image, which contains all the prerequisite packages, and the build still doesn't work:

docker run --name intel -it \
  --device /dev/dri \
  --ipc=host \
  -v /opt/docker2:/workspace \
  intel/oneapi-basekit:latest \
  /bin/bash -l

ocloc does appear to be inside the Intel Docker image, but the build doesn't find it:

(base) root@1d7e1514dd9e:/workspace# find /opt -name ocloc
/opt/intel/oneapi/vtune/2025.0/bin64/gma/GTPin/Profilers/ocloc
/opt/intel/oneapi/vtune/2025.0/bin64/gma/GTPin/Profilers/ocloc/Bin/intel64/ocloc
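
Both hits are under VTune's GTPin profiler tree, which is a profiler-internal copy and not on the PATH the compiler searches. A hedged sketch of one way to proceed inside the container, again assuming an intel-ocloc package is available from the Intel graphics APT repository (the package name is an assumption):

# Sketch: inside the oneapi-basekit container, install a standalone ocloc so icpx
# stops warning with -Waot-tool-not-found, then resume the failed build.
apt-get update && apt-get install -y intel-ocloc    # assumed package name
command -v ocloc                                    # should now resolve outside the GTPin tree
cd /workspace/pytorch/build && ninja                # resume the build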

SHoogstad (Author) commented

@ZailiWang Is there now support for Battlemage in IPEX?
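
As a quick, IPEX-independent way to check whether a given driver and oneAPI stack even exposes the B580, listing the SYCL devices is usually enough; sycl-ls ships with the oneAPI DPC++ runtime:

# Sketch: confirm the B580 is visible to the SYCL runtime before debugging IPEX itself.
source /opt/intel/oneapi/setvars.sh
sycl-ls                  # expect a [level_zero:gpu] entry naming the Intel Arc B580
clinfo -l                # optional cross-check via OpenCL, if clinfo is installed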
