
Problem after completing deployment and installation on macOS #3873

Closed · 1 task done
zhug-e opened this issue May 23, 2024 · 4 comments
Labels
solved This problem has been already solved

Comments


zhug-e commented May 23, 2024

Reminder

  • I have read the README and searched the existing issues.

Reproduction

1

Expected behavior

1

System Info

Apple M1 chip, macOS 14.5

Others

Starting up and loading the model both succeed, but the following error is raised during question answering:
Exception in thread Thread-8:
Traceback (most recent call last):
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "/Library/Python/3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/Users/zhugang/Library/Python/3.9/lib/python/site-packages/transformers/generation/utils.py", line 1569, in generate
    model_kwargs["attention_mask"] = self._prepare_attention_mask_for_generation(
  File "/Users/zhugang/Library/Python/3.9/lib/python/site-packages/transformers/generation/utils.py", line 468, in _prepare_attention_mask_for_generation
    raise ValueError(
ValueError: Can't infer missing attention mask on mps device. Please provide an attention_mask or use a different device.

@DarthPenguinz

@zhug-e Hi, any fixes for this yet? I'm facing the same issue.

@hiyouga added the "bug (Something isn't working)" and "pending (This problem is yet to be addressed)" labels on May 24, 2024

gklab commented May 26, 2024

  1. Find File "/Users/zhugang/Library/Python/3.9/lib/python/site-packages/transformers/generation/utils.py", line 468.
  2. Comment out that detection check.

With the latest MPS-enabled build of PyTorch, it then runs normally.
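
For reference, a less invasive alternative to editing the transformers source is to pass the tokenizer's attention_mask to generate() explicitly, which avoids the code path that raises this error. The sketch below is only an illustration under assumptions: the issue does not say which model was being served, so the model name here is a placeholder, and any recent Hugging Face causal LM should follow the same pattern.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: an arbitrary chat model for illustration; substitute your own.
model_name = "Qwen/Qwen1.5-1.8B-Chat"
device = "mps" if torch.backends.mps.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16
).to(device)

# The tokenizer returns both input_ids and attention_mask.
inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(device)

# Passing attention_mask explicitly means generate() never has to infer it,
# so _prepare_attention_mask_for_generation does not raise on the mps device.
outputs = model.generate(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    max_new_tokens=64,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```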

zhug-e (Author) commented May 27, 2024 via email

@hiyouga closed this as completed in 91611d6 on Jun 3, 2024
@hiyouga added the "solved (This problem has been already solved)" label and removed the "bug (Something isn't working)" and "pending (This problem is yet to be addressed)" labels on Jun 3, 2024
hiyouga (Owner) commented Jun 3, 2024

fixed
