Commit 3b4dc8c

text_generation: improve parameters check (#1527)

Parent: eb340ba

1 file changed: 2 additions, 1 deletion

optimum/habana/transformers/generation/utils.py

@@ -1062,9 +1062,10 @@ def generate(
             )
             if model_kwargs["reduce_recompile"]:
                 assert generation_config.bucket_size
-            # Below condition checked explicitly since llama supports bucket_internal even without reuse_cache
+            # Below condition checked explicitly since some models (like llama and gpt_bigcode) support bucket_internal even without reuse_cache
             if generation_config.bucket_internal:
                 assert generation_config.bucket_size >= 0, "please set bucket_size to use bucket_internal"
+                assert generation_config.use_cache, "please set use_cache flag to use bucket_internal"
             if generation_config.reuse_cache:
                 assert (
                     self.config.model_type
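The effect of the added check can be sketched in isolation: with `bucket_internal` enabled, generation now also requires `use_cache` to be set. The sketch below mirrors the asserted conditions from the diff; `check_generation_params` and the `SimpleNamespace` stand-in for the generation config are hypothetical names for illustration, not part of the optimum-habana API.

```python
# Minimal sketch of the parameter check this commit strengthens.
# The config object here is a stand-in; field names mirror the diff.
from types import SimpleNamespace


def check_generation_params(generation_config):
    """Validate bucket_internal-related generation parameters."""
    if generation_config.bucket_internal:
        assert generation_config.bucket_size >= 0, "please set bucket_size to use bucket_internal"
        # New check from this commit: bucket_internal requires the KV cache.
        assert generation_config.use_cache, "please set use_cache flag to use bucket_internal"


# A config with bucket_internal but use_cache disabled now fails validation.
cfg = SimpleNamespace(bucket_internal=True, bucket_size=128, use_cache=False)
try:
    check_generation_params(cfg)
    failed = False
except AssertionError:
    failed = True
```

Before this commit, the `use_cache=False` case passed validation and could fail later in generation; the new assertion surfaces the misconfiguration immediately with an actionable message.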
