Merged. Changes from 2 commits.
docs/source/en/installation.mdx (10 changes: 5 additions & 5 deletions)

@@ -139,15 +139,15 @@ conda install -c huggingface transformers

## Cache setup

-Pretrained models are downloaded and locally cached at: `~/.cache/huggingface/transformers/`. This is the default directory given by the shell environment variable `TRANSFORMERS_CACHE`. On Windows, the default directory is given by `C:\Users\username\.cache\huggingface\transformers`. You can change the shell environment variables shown below - in order of priority - to specify a different cache directory:
+Pretrained models are downloaded and locally cached at: `~/.cache/huggingface/`. This is the default directory given by the shell environment variable `HUGGINGFACE_CACHE`. On Windows, the default directory is given by `C:\Users\username\.cache\huggingface`. You can change the shell environment variables shown below - in order of priority - to specify a different cache directory:

-1. Shell environment variable (default): `TRANSFORMERS_CACHE`.
-2. Shell environment variable: `HF_HOME` + `transformers/`.
-3. Shell environment variable: `XDG_CACHE_HOME` + `/huggingface/transformers`.
+1. Shell environment variable (default): `HUGGINGFACE_CACHE` or `TRANSFORMERS_CACHE`.
+2. Shell environment variable: `HF_HOME`.
+3. Shell environment variable: `XDG_CACHE_HOME` + `/huggingface`.

<Tip>

-🤗 Transformers will use the shell environment variables `PYTORCH_TRANSFORMERS_CACHE` or `PYTORCH_PRETRAINED_BERT_CACHE` if you are coming from an earlier iteration of this library and have set those environment variables, unless you specify the shell environment variable `TRANSFORMERS_CACHE`.
+🤗 Transformers will use the shell environment variables `PYTORCH_TRANSFORMERS_CACHE` or `PYTORCH_PRETRAINED_BERT_CACHE` if you are coming from an earlier iteration of this library and have set those environment variables, unless you specify the shell environment variable `HUGGINGFACE_CACHE`.

</Tip>
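The three-step priority order above can be sketched as a small resolver. Note that `resolve_transformers_cache` is a hypothetical helper written only for illustration, not a function in the library:

```python
import os


def resolve_transformers_cache():
    """Illustration of the documented cache-directory resolution order,
    checked from highest priority to lowest."""
    # 1. Explicit cache override (new name first, then the legacy one)
    for var in ("HUGGINGFACE_CACHE", "TRANSFORMERS_CACHE"):
        value = os.getenv(var)
        if value:
            return value
    # 2. Hugging Face home directory
    hf_home = os.getenv("HF_HOME")
    if hf_home:
        return hf_home
    # 3. XDG cache base (falling back to ~/.cache), plus /huggingface
    xdg = os.getenv("XDG_CACHE_HOME", os.path.expanduser("~/.cache"))
    return os.path.join(xdg, "huggingface")
```

For example, with only `HF_HOME` set, the resolver returns `HF_HOME` directly; with none of the variables set, it falls back to `~/.cache/huggingface`.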

src/transformers/utils/hub.py (13 changes: 10 additions & 3 deletions)

@@ -81,7 +81,8 @@ def is_offline_mode():
 hf_cache_home = os.path.expanduser(
     os.getenv("HF_HOME", os.path.join(os.getenv("XDG_CACHE_HOME", "~/.cache"), "huggingface"))
 )
-default_cache_path = os.path.join(hf_cache_home, "transformers")
+hf_cache_home = os.getenv("HUGGINGFACE_CACHE", hf_cache_home)
+default_cache_path = hf_cache_home

 # Onetime move from the old location to the new one if no ENV variable has been set.
 if (
@@ -1475,9 +1476,15 @@ def move_to_new_cache(file, repo, filename, revision, etag, commit_hash):
     clean_files_for(file)
-def move_cache(cache_dir=None, token=None):
+def move_cache(cache_dir=None, new_cache_dir=None, token=None):
+    if new_cache_dir is None:
+        new_cache_dir = TRANSFORMERS_CACHE
     if cache_dir is None:
-        cache_dir = TRANSFORMERS_CACHE
+        # Migrate from old cache in .cache/huggingface/transformers
+        if os.path.isdir(os.path.join(TRANSFORMERS_CACHE, "transformers")):
+            cache_dir = os.path.join(TRANSFORMERS_CACHE, "transformers")
+        else:
+            cache_dir = new_cache_dir
     if token is None:
         token = HfFolder.get_token()
     cached_files = get_all_cached_files(cache_dir=cache_dir)
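The migration branch added to `move_cache` above can be isolated as a small decision helper. Here, `pick_source_cache` is a hypothetical name used only for this sketch, not part of the library:

```python
import os


def pick_source_cache(new_cache_dir):
    """Mirror of the migration choice in ``move_cache``: if the legacy
    ``<cache>/transformers`` subdirectory still exists, migrate files from
    it; otherwise read from the new flat cache layout."""
    legacy = os.path.join(new_cache_dir, "transformers")
    return legacy if os.path.isdir(legacy) else new_cache_dir
```

A fresh installation has no `transformers/` subdirectory, so the helper returns the new location unchanged; an upgraded installation is pointed at the old nested directory so its contents can be moved.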