
Tokens truncated to fewer than 512 #1574

@khaddeep

Description

Describe the bug
I specified the max sequence length as 512, but the input is still being truncated during prediction.

To Reproduce
Take a paragraph longer than 128 tokens and run prediction on it.

Expected behavior
The input should not be truncated when the max sequence length is set to 512 (see the sketch below).
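
The issue does not include code, so the following is only a minimal sketch of the setup being described, assuming a Hugging Face `transformers`-style tokenizer; the model name and the `max_length`/`truncation` parameter values are assumptions for illustration, not taken from the issue.

```python
# Minimal sketch, assuming a Hugging Face transformers tokenizer.
# The model name and parameter values are assumptions; the original issue
# does not show the actual code or library configuration.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

text = "..."  # a paragraph longer than 128 tokens

# Expectation from the issue: with max_length=512 the paragraph should
# only be truncated past 512 tokens, not at 128.
encoded = tokenizer(
    text,
    max_length=512,    # intended maximum sequence length
    truncation=True,   # truncate only beyond max_length
    return_tensors="pt",
)
print(encoded["input_ids"].shape)  # sequence length should be allowed up to 512
```

If the library in use exposes a separate model-level setting (for example a `max_seq_length`-style argument on the model wrapper), that value may also need to be raised to 512, since a lower model default could override the tokenizer setting; this is an assumption about the likely cause, not something stated in the issue.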

Screenshots
[screenshot attached in the original issue]

Desktop (please complete the following information):

  • OS

Additional context

Labels: stale (this issue has become stale)