Error in converting huggingface models #314

@calvintwr

Description
I tried to use export_meta_llama_hf_bin.py. The first error I encountered was:

FileNotFoundError: [Errno 2] No such file or directory: '../Llama-2-13b-chat-hf/params.json'

But that goes away after renaming Llama-2-13b-chat-hf/config.json to Llama-2-13b-chat-hf/params.json.

However, I then hit the next error:

Traceback (most recent call last):
  File "/root/llama2.c/export_meta_llama_hf_bin.py", line 117, in <module>
    load_and_export(model_path, output_path)
  File "/root/llama2.c/export_meta_llama_hf_bin.py", line 105, in load_and_export
    state_dict = concat_weights(models)
  File "/root/llama2.c/export_meta_llama_hf_bin.py", line 78, in concat_weights
    for name in list(models[0]):
IndexError: list index out of range

On closer inspection, models originates from a glob for .pth checkpoint shards:

model_paths = sorted(list(Path(model_path).glob('consolidated.*.pth')))

But there are no .pth files in the Hugging Face models, so the glob matches nothing, models ends up empty, and models[0] raises IndexError.
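A minimal sketch of the failure mode, assuming a typical Hugging Face checkpoint layout (sharded pytorch_model-*.bin files plus config.json; the shard names below are illustrative):

```python
from pathlib import Path
import tempfile

# Simulate a Hugging Face model directory: weights ship as
# pytorch_model-*.bin (or *.safetensors), never consolidated.*.pth.
with tempfile.TemporaryDirectory() as model_path:
    for name in ["pytorch_model-00001-of-00002.bin",
                 "pytorch_model-00002-of-00002.bin",
                 "config.json"]:
        Path(model_path, name).touch()

    # The script's glob finds nothing in an HF directory...
    model_paths = sorted(Path(model_path).glob("consolidated.*.pth"))
    print(len(model_paths))  # 0 -> models == [], so models[0] raises IndexError

    # ...whereas the actual HF weight shards are here:
    hf_paths = sorted(Path(model_path).glob("pytorch_model*.bin"))
    print(len(hf_paths))  # 2
```

So renaming config.json to params.json only papers over the first check; the script still expects Meta's consolidated.*.pth format, not the Hugging Face one.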
