Hi Boltz team,
I’m seeing a model/checkpoint mismatch when loading Boltz2 with the current main code.
Environment
- OS: Colab (Linux)
- Python: 3.12
- PyTorch: 2.9.0+cu128
- PyTorch Lightning: 2.5.0
- Boltz code: latest main (cloned from jwohlwend/boltz)
- Checkpoint: ~/.boltz/boltz2_conf.ckpt (downloaded by the default flow)
  - Size: ~2180.63 MB
  - state_dict keys: 5102
Main warning:
- Found keys that are in the model state dict but not in the checkpoint: ['pairformer_module.layers.0.attention.norm_s.weight', ... 'pairformer_module.layers.63.attention.norm_s.bias']
- Confirmed: the `attention.norm_s.{weight,bias}` keys exist in the model's state dict but are absent from the checkpoint's state_dict.
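For reference, a minimal sketch of the check I ran to confirm the mismatch. `diff_keys` is a helper written for this report, not part of Boltz, and the toy key sets below are illustrative; in the real run the sets came from `torch.load(...)["state_dict"]` and `model.state_dict()`.

```python
# Compare the model's state-dict keys against the checkpoint's to see
# which keys are missing on each side.
#
# Real usage (assumed paths, not runnable here without the checkpoint):
#   import torch
#   ckpt = torch.load("~/.boltz/boltz2_conf.ckpt", map_location="cpu")
#   ckpt_keys = set(ckpt["state_dict"])
#   model_keys = set(model.state_dict())

def diff_keys(model_keys, ckpt_keys):
    """Return (keys only in the model, keys only in the checkpoint)."""
    model_keys, ckpt_keys = set(model_keys), set(ckpt_keys)
    return sorted(model_keys - ckpt_keys), sorted(ckpt_keys - model_keys)

# Toy illustration mirroring the warning above:
model_keys = {
    "pairformer_module.layers.0.attention.norm_s.weight",
    "pairformer_module.layers.0.attention.norm_s.bias",
    "shared.weight",
}
ckpt_keys = {"shared.weight"}

missing, unexpected = diff_keys(model_keys, ckpt_keys)
print(missing)     # the norm_s keys the warning complains about
print(unexpected)  # []
```

In my case `missing` contained exactly the `attention.norm_s` keys from the warning and `unexpected` was empty, which points at a code/checkpoint version skew rather than a corrupted download.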
Question
Which code revision and checkpoint pair should be used so the model and checkpoint state dicts fully match for Boltz2 inference?
- Should users pin to a specific commit/tag?
- Is there an updated checkpoint that includes attention.norm_s weights?