2 files changed, +2 −2 lines changed

@@ -1200,7 +1200,7 @@ def generate(
         input_ids_seq_length = input_ids.shape[-1]
         if max_length is None and max_new_tokens is None:
             warnings.warn(
-                "Neither `max_length` nor `max_new_tokens` have been set, `max_length` will default to "
+                "Neither `max_length` nor `max_new_tokens` has been set, `max_length` will default to "
                 f"{self.config.max_length} (`self.config.max_length`). Controlling `max_length` via the config is "
                 "deprecated and `max_length` will be removed from the config in v5 of Transformers -- we recommend "
                 "using `max_new_tokens` to control the maximum length of the generation.",
@@ -220,7 +220,7 @@
         input_ids (`torch.LongTensor` of shape `(batch_size, sequence_length)`):
             Indices of input sequence tokens in the vocabulary.

-            IIndices can be obtained using [`FSTMTokenizer`]. See [`PreTrainedTokenizer.encode`] and
+            Indices can be obtained using [`FSTMTokenizer`]. See [`PreTrainedTokenizer.encode`] and
             [`PreTrainedTokenizer.__call__`] for details.

             [What are input IDs?](../glossary#input-ids)