SpeechLM2 SALM: load ckpt faster, with less GPU memory#14113

Merged
pzelasko merged 4 commits intomainfrom
speechlm2-skip-pretrained-asr
Jul 3, 2025
Conversation


@pzelasko pzelasko commented Jul 2, 2025

What does this PR do ?

Loads speechlm2 checkpoints faster, requiring 4x less peak GPU memory during init. A SALM ckpt that previously required ~20GB during init now requires only ~5GB.

Collection: speechlm2

Changelog

  • For checkpoints converted to safetensors, skips loading the original weights of the pretrained ASR module
  • Safetensors checkpoints are read into CPU memory first, converted to bfloat16, and then moved to CUDA to minimize peak memory usage. After this change, only ~5GB of RAM is required for a 2.5B bfloat16 SALM model. Eventually we might want to optimize this further with low-RAM direct GPU loading.
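The CPU-first loading pattern described above can be sketched with plain PyTorch. This is an illustrative sketch only: the function name is hypothetical, and the actual implementation reads safetensors files rather than `torch.load`. The idea is that the full-precision weights only ever exist in host RAM, and only the bfloat16 copies touch the GPU:

```python
import torch

def load_checkpoint_low_mem(path, device="cuda"):
    """Hypothetical helper: load a checkpoint with minimal peak GPU memory."""
    # Read weights into host RAM first -- no GPU allocation happens here.
    state = torch.load(path, map_location="cpu")
    # Cast floating-point tensors to bfloat16 while still on CPU,
    # so the fp32 copies never occupy device memory.
    state = {
        k: v.to(torch.bfloat16) if v.is_floating_point() else v
        for k, v in state.items()
    }
    # Only the already-halved bfloat16 tensors are moved to the device.
    return {k: v.to(device) for k, v in state.items()}
```

Compared with loading fp32 weights directly onto the GPU and casting there, this avoids holding both the fp32 and bfloat16 copies in device memory at once, which is where the ~4x peak-memory reduction comes from.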

Usage

  • You can potentially add a usage example below
# Add a code snippet demonstrating how to use this 

GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI, remove and re-add the label.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex etc)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs to various areas.

Additional Information

  • Related to # (issue)

zhehuaichen
zhehuaichen previously approved these changes Jul 3, 2025

@zhehuaichen zhehuaichen left a comment


Very nice thanks!

Signed-off-by: Piotr Żelasko <[email protected]>

github-actions bot commented Jul 3, 2025

[🤖]: Hi @pzelasko 👋,

We wanted to let you know that a CICD pipeline for this PR just finished successfully.

So it might be time to merge this PR or get some approvals.

//cc @chtruong814 @ko3n1g @pablo-garay @thomasdhc

@pzelasko pzelasko enabled auto-merge (squash) July 3, 2025 17:12
@pzelasko pzelasko merged commit 5f921a0 into main Jul 3, 2025
128 checks passed
@pzelasko pzelasko deleted the speechlm2-skip-pretrained-asr branch July 3, 2025 17:15
AmirHussein96 pushed a commit to AmirHussein96/NeMo that referenced this pull request Jul 23, 2025
…4113)

* Skip loading pretrained ASR weights with released speechlm2 ckpts

Signed-off-by: Piotr Żelasko <[email protected]>

* Decrease max memory needed to load SALM model

Signed-off-by: Piotr Żelasko <[email protected]>

* Configurable dtype in SALM eval scripts

Signed-off-by: Piotr Żelasko <[email protected]>

* fix tests

Signed-off-by: Piotr Żelasko <[email protected]>

---------

Signed-off-by: Piotr Żelasko <[email protected]>
Signed-off-by: Amir Hussein <[email protected]>
AmirHussein96 pushed a commit to AmirHussein96/NeMo that referenced this pull request Aug 5, 2025
AmirHussein96 pushed a commit to AmirHussein96/NeMo that referenced this pull request Aug 5, 2025
nasretdinovr pushed a commit to nasretdinovr/NeMo that referenced this pull request Aug 8, 2025

3 participants