
Checkpoint ~/.boltz/boltz2_conf.ckpt missing pairformer_module.layers.*.attention.norm_s.* keys vs latest main #644

@Peter-obi

Description


Hi Boltz team,

I’m seeing a model/checkpoint mismatch when loading Boltz2 with the current main branch.

Environment

  • OS: Colab (Linux)
  • Python: 3.12
  • PyTorch: 2.9.0+cu128
  • PyTorch Lightning: 2.5.0
  • Boltz code: latest main (cloned from jwohlwend/boltz)
  • Checkpoint: ~/.boltz/boltz2_conf.ckpt (downloaded by default flow)
  • Size: ~2180.63 MB
  • state_dict keys: 5102

Main warning:

  • Found keys that are in the model state dict but not in the checkpoint: ['pairformer_module.layers.0.attention.norm_s.weight', ... 'pairformer_module.layers.63.attention.norm_s.bias']
  • Confirmed by inspecting the checkpoint directly:
    • keys containing norm_s in the ckpt state_dict: 0
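For reference, this is roughly how the key counts above were verified. The helper is generic; the `__main__` part assumes the default checkpoint path from the report and a standard PyTorch Lightning checkpoint layout (a top-level "state_dict" entry), so adjust as needed.

```python
import re


def count_keys(keys, pattern):
    """Count state-dict keys whose name matches a regex pattern."""
    rx = re.compile(pattern)
    return sum(1 for k in keys if rx.search(k))


if __name__ == "__main__":
    # torch is only needed for the actual checkpoint inspection.
    import os
    import torch

    # Default Boltz cache location from the report above.
    path = os.path.expanduser("~/.boltz/boltz2_conf.ckpt")
    # weights_only=False because Lightning checkpoints carry extra metadata.
    ckpt = torch.load(path, map_location="cpu", weights_only=False)
    keys = list(ckpt["state_dict"].keys())
    print("total keys:", len(keys))                              # 5102 here
    print("norm_s keys:", count_keys(keys, r"attention\.norm_s\."))  # 0 here
```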

Question

Which code revision + checkpoint pair should be used for fully matched Boltz2 inference?

  • Should users pin to a specific commit/tag?
  • Is there an updated checkpoint that includes attention.norm_s weights?
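Until a matched pair is confirmed, the warning itself comes from non-strict state-dict loading: parameters present in the model but absent from the checkpoint keep their initialized values, which changes numerics versus a fully matched pair. A minimal sketch of that mechanism (toy modules, not the real Boltz2 architecture):

```python
import torch
from torch import nn


class OldModel(nn.Module):
    """Stands in for the architecture the checkpoint was trained with."""
    def __init__(self):
        super().__init__()
        self.attn = nn.Linear(4, 4)


class NewModel(nn.Module):
    """Stands in for current main, which adds a norm layer."""
    def __init__(self):
        super().__init__()
        self.attn = nn.Linear(4, 4)
        self.norm_s = nn.LayerNorm(4)  # absent from the old checkpoint


old_sd = OldModel().state_dict()
model = NewModel()
# strict=False loads the overlapping keys and reports the mismatch
# instead of raising; norm_s stays at its default initialization.
result = model.load_state_dict(old_sd, strict=False)
print(result.missing_keys)     # ['norm_s.weight', 'norm_s.bias']
print(result.unexpected_keys)  # []
```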
