Commit
Merge branch 'vllm-project:main' into mypy
frreiss authored Jan 10, 2025
2 parents 2c6e60d + 8a57940 commit 6cc5a60
Showing 2 changed files with 2 additions and 4 deletions.
3 changes: 2 additions & 1 deletion benchmarks/benchmark_prefix_caching.py
@@ -10,7 +10,8 @@
     --model meta-llama/Llama-2-7b-chat-hf \
     --enable-prefix-caching \
     --num-prompts 1 \
-    --repeat-count 100
+    --repeat-count 100 \
+    --input-length-range 128:256

 ShareGPT example usage:
 # This command samples 20 prompts with input lengths
3 changes: 0 additions & 3 deletions vllm/model_executor/models/deepseek_v3.py
@@ -639,9 +639,6 @@ def load_weights(self, weights: Iterable[Tuple[str,
             if is_pp_missing_parameter(name, self):
                 continue

-            if name not in params_dict:
-                for key in params_dict:
-                    print(key)
             param = params_dict[name]
             weight_loader = getattr(param, "weight_loader",
                                     default_weight_loader)
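The deepseek_v3.py change removes a leftover debug block that printed every key in `params_dict` whenever a checkpoint weight name was not found. The surrounding `load_weights` pattern can be sketched as below; this is a minimal, simplified illustration (the helper names `missing_on_this_stage` and plain-float "tensors" are assumptions for the sketch, not vLLM's actual API):

```python
from typing import Dict, Iterable, List, Set, Tuple


def load_weights(params_dict: Dict[str, float],
                 weights: Iterable[Tuple[str, float]],
                 missing_on_this_stage: Set[str]) -> List[str]:
    """Load checkpoint weights into a model's parameter dict.

    Returns the names of parameters that were actually loaded.
    """
    loaded = []
    for name, tensor in weights:
        # Parameters that live on another pipeline-parallel stage are
        # skipped entirely (the is_pp_missing_parameter check in the diff).
        if name in missing_on_this_stage:
            continue
        # A name absent from params_dict is simply not loadable here;
        # the removed debug block dumped all keys at this point instead.
        if name not in params_dict:
            continue
        params_dict[name] = tensor
        loaded.append(name)
    return loaded
```

In the real code, a missing name that is not pipeline-parallel-missing raises a `KeyError` on the `params_dict[name]` lookup, which is why the exhaustive key dump was only useful for one-off debugging and was dropped.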
