
[tests] skip bnb-related tests instead of failing on xpu #2860

Merged · 9 commits · Jun 18, 2024

Conversation

@faaany (Contributor) commented on Jun 17, 2024

What does this PR do?

When running pytest tests/test_accelerator.py on XPU, 4 tests fail:

FAILED tests/test_accelerator.py::AcceleratorTester::test_accelerator_bnb - RuntimeError: No GPU found. A GPU is needed for quantization.
FAILED tests/test_accelerator.py::AcceleratorTester::test_accelerator_bnb_cpu_error - RuntimeError: No GPU found. A GPU is needed for quantization.
FAILED tests/test_accelerator.py::AcceleratorTester::test_accelerator_bnb_multi_device - ValueError: xpu is not supported in test_accelerator_bnb_multi_device.
FAILED tests/test_accelerator.py::AcceleratorTester::test_accelerator_bnb_multi_device_no_distributed - RuntimeError: No GPU found. A GPU is needed for quantization.

The cause of all 4 failures is the same: each test passes load_in_8bit=True when loading the model with AutoModelForCausalLM.from_pretrained, which triggers the torch.cuda.is_available() check in hf_quantizer.validate_environment, e.g.

___________________________________________ AcceleratorTester.test_accelerator_bnb ____________________________________________

self = <test_accelerator.AcceleratorTester testMethod=test_accelerator_bnb>

    @require_non_torch_xla
    @slow
    @require_bnb
    def test_accelerator_bnb(self):
        """Tests that the accelerator can be used with the BNB library."""
        from transformers import AutoModelForCausalLM
    
>       model = AutoModelForCausalLM.from_pretrained(
            "EleutherAI/gpt-neo-125m",
            load_in_8bit=True,
            device_map={"": 0},
        )

tests/test_accelerator.py:439: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../miniforge3/envs/acc-ut-ww23/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py:563: in from_pretrained
    return model_class.from_pretrained(
../../miniforge3/envs/acc-ut-ww23/lib/python3.9/site-packages/transformers/modeling_utils.py:3202: in from_pretrained
    hf_quantizer.validate_environment(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <transformers.quantizers.quantizer_bnb_8bit.Bnb8BitHfQuantizer object at 0x7f2cef87ceb0>, args = ()
kwargs = {'device_map': {'': 0}, 'from_flax': False, 'from_tf': False, 'torch_dtype': None}

    def validate_environment(self, *args, **kwargs):
        if not torch.cuda.is_available():
>           raise RuntimeError("No GPU found. A GPU is needed for quantization.")
E           RuntimeError: No GPU found. A GPU is needed for quantization.

../../miniforge3/envs/acc-ut-ww23/lib/python3.9/site-packages/transformers/quantizers/quantizer_bnb_8bit.py:62: RuntimeError

Since BNB doesn't support XPU or NPU yet, we should skip these tests rather than let them fail. I also checked the original PR #2343, which made these tests device-agnostic. These 4 tests were overlooked there because the author didn't have bitsandbytes installed; however, when users install the testing dependencies with pip install -e ".[testing]", bitsandbytes is installed. So we should change the test marker to require_cuda instead of keeping the tests device-agnostic, at least until XPU and NPU are supported in BNB.
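The proposed fix boils down to gating these tests on CUDA availability so that non-CUDA accelerators report them as skipped rather than failed. A minimal sketch of such a require_cuda marker, using a hypothetical cuda_is_available() stand-in for torch.cuda.is_available() so the sketch runs without torch (the real marker in accelerate's test utilities may differ):

```python
import unittest

# Hypothetical stand-in for torch.cuda.is_available(); on an XPU-only
# machine this would return False.
def cuda_is_available():
    return False

def require_cuda(test_case):
    """Skip the decorated test unless a CUDA GPU is available."""
    return unittest.skipUnless(cuda_is_available(), "test requires a CUDA GPU")(test_case)

class BnbTests(unittest.TestCase):
    @require_cuda
    def test_accelerator_bnb(self):
        pass  # would load the model with load_in_8bit=True here

# On a machine without CUDA the test is reported as skipped, not failed.
result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(BnbTests).run(result)
print(len(result.skipped), len(result.failures))  # → 1 0
```

With this marker in place, the bnb tests no longer hit the RuntimeError from validate_environment on XPU, since they are skipped before the model is ever loaded.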

@muellerzr @SunMarc

@faaany (Contributor, Author) commented on Jun 17, 2024

Added one more test: tests/test_big_modeling.py::BigModelingTester::test_dipatch_model_fp4_simple

@faaany faaany marked this pull request as draft June 17, 2024 05:58
@muellerzr (Collaborator) left a comment


Thanks! cc @SunMarc

@muellerzr muellerzr requested a review from SunMarc June 17, 2024 07:17
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@SunMarc (Member) left a comment


LGTM ! Left a few comments

Review comments on tests/test_accelerator.py (outdated, resolved)
@faaany faaany marked this pull request as ready for review June 18, 2024 01:39
@muellerzr (Collaborator) left a comment


Thanks!

@muellerzr muellerzr merged commit 4cc3530 into huggingface:main Jun 18, 2024
23 checks passed
@faaany faaany deleted the bnb_tests branch November 4, 2024 06:09
4 participants