
Conversation

@ChenWu98 commented Oct 8, 2024

Problem description: the current preprocessing function squeezes length-1 sequences down to a 0-dim tensor, which causes an error in PyTorch DataLoaders.
Solution: replace each squeeze() with squeeze(0) in seq2seq_utils.py.
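
For reference, a minimal sketch of the behavior (the shapes and variable names below are illustrative, not the actual code in seq2seq_utils.py): a length-1 sequence arrives as a (1, 1) tensor, and an unqualified squeeze() removes both size-1 dimensions, leaving a 0-dim scalar, while squeeze(0) only drops the leading batch dimension and keeps the sequence 1-D.

```python
import torch

# Illustrative example: a tokenized length-1 sequence with a batch
# dimension comes out as shape (1, 1).
input_ids = torch.tensor([[42]])

print(input_ids.squeeze().shape)   # torch.Size([])  -> 0-dim scalar tensor
print(input_ids.squeeze(0).shape)  # torch.Size([1]) -> 1-D sequence of length 1

# Downstream collation/padding in a torch.utils.data.DataLoader expects each
# sample to be a 1-D sequence tensor; handing it a 0-dim scalar is what
# triggers the error, whereas squeeze(0) preserves the sequence axis.
```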

stale bot commented Oct 19, 2025

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Oct 19, 2025