Description
I tried to use export_meta_llama_hf_bin.py. The first error I encountered was:
FileNotFoundError: [Errno 2] No such file or directory: '../Llama-2-13b-chat-hf/params.json'
That error goes away after I rename Llama-2-13b-chat-hf/config.json to Llama-2-13b-chat-hf/params.json.
However, the next error hits:
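For the record, the rename only satisfies the script's filename check: a HF config.json uses different key names (e.g. hidden_size) than Meta's params.json (e.g. dim), so downstream key lookups may still fail. A minimal sketch of the workaround, using a temporary directory as a stand-in for the checkpoint folder:

```python
import shutil
from pathlib import Path
from tempfile import TemporaryDirectory

with TemporaryDirectory() as d:
    ckpt = Path(d)  # stand-in for ../Llama-2-13b-chat-hf
    (ckpt / "config.json").write_text('{"hidden_size": 5120}')  # HF-style key names
    # Give the script the params.json filename it expects.
    # Copy rather than rename so the original HF config stays intact.
    shutil.copy(ckpt / "config.json", ckpt / "params.json")
    print((ckpt / "params.json").exists())  # -> True
```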
Traceback (most recent call last):
File "/root/llama2.c/export_meta_llama_hf_bin.py", line 117, in <module>
load_and_export(model_path, output_path)
File "/root/llama2.c/export_meta_llama_hf_bin.py", line 105, in load_and_export
state_dict = concat_weights(models)
File "/root/llama2.c/export_meta_llama_hf_bin.py", line 78, in concat_weights
for name in list(models[0]):
IndexError: list index out of range
On closer inspection, models originates from this line in llama2.c/export_meta_llama_bin.py (line 98 in bd18228), which calls .glob to collect the .pth shards:
model_paths = sorted(list(Path(model_path).glob('consolidated.*.pth')))
But there are no .pth files in the huggingface models.
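The failure mode is easy to reproduce in isolation: Path.glob matches nothing when no consolidated.*.pth files exist (a HF checkpoint directory typically ships pytorch_model-*.bin or *.safetensors shards instead), so the models list stays empty and models[0] raises. A minimal sketch, stdlib only:

```python
from pathlib import Path
from tempfile import TemporaryDirectory

with TemporaryDirectory() as d:
    # Simulate a HF checkpoint directory: shard files, but no consolidated.*.pth
    (Path(d) / "pytorch_model-00001-of-00003.bin").touch()
    (Path(d) / "config.json").touch()

    # The glob from line 98 of the export script
    model_paths = sorted(Path(d).glob('consolidated.*.pth'))
    print(model_paths)  # -> [] : nothing matches

    models = list(model_paths)  # stands in for the per-shard torch.load loop
    try:
        models[0]  # concat_weights does list(models[0])
    except IndexError as e:
        print("IndexError:", e)  # -> IndexError: list index out of range
```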