
BFloat16 Mixed Precision Training Support For Arc Alchemist? If Not Could Float16 Training PLEASE Be Supported? #258

Closed

tedliosu opened this issue Nov 13, 2022 · 2 comments

@tedliosu
Basically this question, but I'd like to know the answer for IPEX as well 😄

Please let me know if there's any additional info I need to provide for my question to be answered. 👍
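For reference, here is a minimal sketch of the kind of bf16 mixed-precision training loop I'm asking about, assuming an intel-extension-for-pytorch build with XPU support (the toy model, shapes, and hyperparameters are placeholders, and exact API details may vary by IPEX version):

```python
import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device

# Toy model and optimizer for illustration only.
model = torch.nn.Linear(1024, 1024).to("xpu")
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# ipex.optimize applies dtype-specific optimizations; dtype=torch.bfloat16
# requests the bf16 path (torch.float16 would request fp16 instead).
model, optimizer = ipex.optimize(model, optimizer=optimizer, dtype=torch.bfloat16)

criterion = torch.nn.MSELoss()
data = torch.randn(64, 1024).to("xpu")
target = torch.randn(64, 1024).to("xpu")

for _ in range(10):
    optimizer.zero_grad()
    # Autocast runs eligible ops in bf16 while keeping others in fp32.
    with torch.xpu.amp.autocast(enabled=True, dtype=torch.bfloat16):
        loss = criterion(model(data), target)
    loss.backward()
    optimizer.step()
```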

@jingxu10 (Contributor) commented Nov 15, 2022

XMX is enabled via oneDNN. Since the Arc architecture is similar to that of the Flex Series, it should work on Arc as well. However, we haven't validated it on Arc and don't provide official support at this moment.
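Not part of the original reply, but given the "should work, not validated" answer above, a quick smoke test like the sketch below could probe whether bf16 ops run at all on a given Arc card; this assumes an IPEX XPU build, and the shapes are arbitrary:

```python
import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device

# A bf16 matmul like this should go through oneDNN on the XPU backend;
# if bf16 is functional on the device, this runs without error.
a = torch.randn(256, 256, dtype=torch.bfloat16, device="xpu")
b = torch.randn(256, 256, dtype=torch.bfloat16, device="xpu")
c = (a @ b).float().cpu()
print("bf16 matmul ran; sample output:", c[0, 0].item())
```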

@tedliosu (Author)

@jingxu10 OK, thank you for letting me know; I may try it out once I get my hands on an Arc GPU, hopefully sometime in the near future 👍
