Commit 241a48c

44670jimpang authored and committed
[FIX] Fix a bug in initializing Yarn RoPE (vllm-project#2983)
1 parent 18600ce

File tree

1 file changed: 2 additions, 4 deletions

vllm/model_executor/layers/rotary_embedding.py

Lines changed: 2 additions & 4 deletions
@@ -245,13 +245,11 @@ def _yarn_find_correction_range(low_rot: int,


 def _yarn_linear_ramp_mask(low: float, high: float, dim: int,
-                           dtype: torch.dtype,
-                           device: torch.device) -> torch.Tensor:
+                           dtype: torch.dtype) -> torch.Tensor:
     if low == high:
         high += 0.001  # Prevent singularity

-    linear_func = (torch.arange(dim, dtype=dtype, device=device) -
-                   low) / (high - low)
+    linear_func = (torch.arange(dim, dtype=dtype) - low) / (high - low)
     ramp_func = torch.clamp(linear_func, 0, 1)
     return ramp_func
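For context, the patched helper as it reads after this commit can be sketched as below. The function body is taken from the diff; the example call at the end is illustrative and not part of the commit. The change drops the `device` argument so the ramp mask is built without pinning it to a device at initialization time; presumably the caller moves the resulting tensor to the target device later (e.g. via `.to(device)`).

```python
import torch


def _yarn_linear_ramp_mask(low: float, high: float, dim: int,
                           dtype: torch.dtype) -> torch.Tensor:
    # Linear ramp over `dim` positions: 0 at or below `low`,
    # 1 at or above `high`, interpolated linearly in between.
    if low == high:
        high += 0.001  # Prevent singularity (avoid division by zero below)

    linear_func = (torch.arange(dim, dtype=dtype) - low) / (high - low)
    ramp_func = torch.clamp(linear_func, 0, 1)
    return ramp_func


# Illustrative usage (not from the commit): ramp from index 2 to index 6.
mask = _yarn_linear_ramp_mask(2.0, 6.0, 8, torch.float32)
# mask is monotonically non-decreasing, clamped to [0, 1]
```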

Comments (0)