Thank you for developing trt-llm. It's helping me a lot.
I'm trying to use Medusa with trt-llm, referencing this page.
It works fine with Vicuna 7B and its Medusa heads, with no errors at all.
However, with Vicuna 33B and its trained Medusa heads, the following error occurs when executing trtllm-build (converting the checkpoint with Medusa completed successfully):
```
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/queues.py", line 244, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib/python3.10/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
AttributeError: Can't pickle local object 'MedusaConfig.__init__.<locals>.GenericMedusaConfig'
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/tensorrt_llm/commands/build.py", line 437, in parallel_build
    future.result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/lib/python3.10/multiprocessing/queues.py", line 244, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib/python3.10/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
AttributeError: Can't pickle local object 'MedusaConfig.__init__.<locals>.GenericMedusaConfig'
```
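For context: the traceback shows `parallel_build` dispatching work through a process pool, and the feeder thread fails while pickling the arguments for the worker process. The qualified name `MedusaConfig.__init__.<locals>.GenericMedusaConfig` indicates that `GenericMedusaConfig` is a class defined locally inside `MedusaConfig.__init__`, and Python's pickle cannot serialize instances of locally defined classes. A minimal sketch (not TensorRT-LLM code, names are made up for illustration) reproducing this class of failure:

```python
# Instances of classes defined at module scope can be pickled; instances of
# classes defined inside a function/method body cannot, which is what
# multiprocessing relies on when sending objects to worker processes.
import pickle


class ModuleLevelConfig:
    """Defined at module scope, so pickle can import it by qualified name."""

    def __init__(self, num_medusa_heads):
        self.num_medusa_heads = num_medusa_heads


def make_local_config(num_medusa_heads):
    class LocalConfig:  # hypothetical stand-in for GenericMedusaConfig in the traceback
        def __init__(self, num_medusa_heads):
            self.num_medusa_heads = num_medusa_heads

    return LocalConfig(num_medusa_heads)


pickle.dumps(ModuleLevelConfig(4))  # works fine

try:
    pickle.dumps(make_local_config(4))
except (AttributeError, pickle.PicklingError) as err:
    # e.g. AttributeError: Can't pickle local object
    #      'make_local_config.<locals>.LocalConfig'
    print(err)
```

My guess is that the 7B build never needs to send the Medusa config to another process, while the 33B build goes through the process pool (e.g. because more than one build worker is used), which is when the pickling is triggered; I haven't verified that part.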