
Commit 9ee9014

replace download preparation with openmmlab urls
1 parent 43e39b3 commit 9ee9014

7 files changed (+21 / -12 lines)

projects/uniformer/README.md

Lines changed: 4 additions & 6 deletions
@@ -27,9 +27,7 @@ export PYTHONPATH=`pwd`:$PYTHONPATH

2. Download Pretrained Weights

-In order to either running inferences or training on the `uniformer pose estimation` project, you have to download the original Uniformer pretrained weights on ImageNet1k dataset and the weights trained for the downstream pose estimation task. The original ImageNet1k weights are hosted on SenseTime's [huggingface repository](https://huggingface.co/Sense-X/uniformer_image) and the pose estimation weights can be downloaded according to the [official README](https://github.com/Sense-X/UniFormer/tree/main/pose_estimation)
-
-Once you have the weights downloaded, you can move them to the ideal places and update the corresponding paths in the [config files](./configs/) and the main [uniformer.py](./models/uniformer.py)
+To either run inferences or train on the `uniformer pose estimation` project, you have to download the original Uniformer pretrained weights on the ImageNet1k dataset and the weights trained for the downstream pose estimation task. The original ImageNet1k weights are hosted on SenseTime's [huggingface repository](https://huggingface.co/Sense-X/uniformer_image), and the downstream pose estimation weights are hosted on either Google Drive or Baiduyun. We have uploaded both to the OpenMMLab download URLs so that users can fetch them without extra effort. For example, in [`td-hm_uniformer-b-8xb128-210e_coco-256x192.py`](./configs/td-hm_uniformer-b-8xb128-210e_coco-256x192.py#62) the pretrained weight URL is already filled in, and the weight is downloaded to your device automatically once training or testing starts. For the downstream task weights, you can get their URLs from the [benchmark result table](#results).

### Inference
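
The paragraph added above relies on the checkpoint field accepting a plain HTTP(S) URL that is fetched on first use. As a quick sanity check (not part of this commit; it only assumes `torch` is installed), the ImageNet-1k weight URL can be downloaded and inspected directly:

```python
# Sketch: fetch one of the OpenMMLab-hosted checkpoints and look at its
# top-level structure. torch.hub caches the file under
# ~/.cache/torch/hub/checkpoints/, so repeated runs do not re-download it.
import torch

url = ('https://download.openmmlab.com/mmpose/v1/projects/'
       'uniformer/uniformer_base_in1k.pth')

ckpt = torch.hub.load_state_dict_from_url(url, map_location='cpu')
print(type(ckpt), list(ckpt)[:5])  # inspect the top-level keys
```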

@@ -46,9 +44,9 @@ For more information on using the inferencer, please see [this document](https:/
Here's an example code:

```shell
-python demo/inferencer_demo.py ../../tests/data/coco/000000000785.jpg \
-    --pose2d ./projects/uniformer/configs/td-hm_uniformer-s-8xb128-210e_coco-256x192.py \
-    --pose2d-weights $PATH_TO_YOUR_UNIFORMER_top_down_256x192_global_base.pth \
+python demo/inferencer_demo.py tests/data/coco/000000000785.jpg \
+    --pose2d projects/uniformer/configs/td-hm_uniformer-s-8xb128-210e_coco-256x192.py \
+    --pose2d-weights https://download.openmmlab.com/mmpose/v1/projects/uniformer/top_down_256x192_global_small-d4a7fdac_20230724.pth \
    --vis-out-dir vis_results
```
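
For reference, the same inference can be run from Python through `MMPoseInferencer`. This is a minimal sketch, assuming an mmpose 1.x installation and the repository root as the working directory; argument names follow the inferencer documentation linked above:

```python
# Sketch: Python equivalent of the inferencer_demo.py call shown above.
from mmpose.apis import MMPoseInferencer

inferencer = MMPoseInferencer(
    pose2d='projects/uniformer/configs/'
           'td-hm_uniformer-s-8xb128-210e_coco-256x192.py',
    pose2d_weights='https://download.openmmlab.com/mmpose/v1/projects/'
                   'uniformer/top_down_256x192_global_small-d4a7fdac_20230724.pth')

# The inferencer returns a generator; consuming it runs inference and
# writes visualizations to vis_results/.
result_generator = inferencer('tests/data/coco/000000000785.jpg',
                              vis_out_dir='vis_results')
result = next(result_generator)
print(result.keys())
```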

projects/uniformer/configs/td-hm_uniformer-b-8xb128-210e_coco-256x192.py

Lines changed: 3 additions & 1 deletion
@@ -60,7 +60,9 @@
        init_cfg=dict(
            # Set the path to pretrained backbone here
            type='Pretrained',
-            checkpoint='${PATH_TO_YOUR_uniformer_small_in1k.pth}')),
+            checkpoint='https://download.openmmlab.com/mmpose/v1/projects/'
+            'uniformer/uniformer_base_in1k.pth'  # noqa
+        )),
    head=dict(
        type='HeatmapHead',
        in_channels=512,
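
With this change, the backbone weights need no manual download: MMEngine resolves the HTTP URL when the model is initialized. If a local copy is preferred, the field can still be overridden after loading the config. A minimal sketch, assuming mmengine is installed and the config's imports resolve from the repository root (see the PYTHONPATH step in the README); the local path below is a placeholder:

```python
# Sketch: inspect and optionally override the pretrained-backbone checkpoint.
from mmengine.config import Config

cfg = Config.fromfile(
    'projects/uniformer/configs/td-hm_uniformer-b-8xb128-210e_coco-256x192.py')

# Now points at the OpenMMLab-hosted uniformer_base_in1k.pth URL.
print(cfg.model.backbone.init_cfg.checkpoint)

# Hypothetical local path; swap in your own file to avoid downloading
# at training time.
cfg.model.backbone.init_cfg.checkpoint = '/path/to/uniformer_base_in1k.pth'
```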

projects/uniformer/configs/td-hm_uniformer-b-8xb32-210e_coco-384x288.py

Lines changed: 3 additions & 1 deletion
@@ -59,7 +59,9 @@
        init_cfg=dict(
            # Set the path to pretrained backbone here
            type='Pretrained',
-            checkpoint='${PATH_TO_YOUR_uniformer_small_in1k.pth}')),
+            checkpoint='https://download.openmmlab.com/mmpose/v1/projects/'
+            'uniformer/uniformer_base_in1k.pth'  # noqa
+        )),
    head=dict(
        type='HeatmapHead',
        in_channels=512,

projects/uniformer/configs/td-hm_uniformer-b-8xb32-210e_coco-448x320.py

Lines changed: 3 additions & 1 deletion
@@ -59,7 +59,9 @@
        init_cfg=dict(
            # Set the path to pretrained backbone here
            type='Pretrained',
-            checkpoint='${PATH_TO_YOUR_uniformer_small_in1k.pth}')),
+            checkpoint='https://download.openmmlab.com/mmpose/v1/projects/'
+            'uniformer/uniformer_base_in1k.pth'  # noqa
+        )),
    head=dict(
        type='HeatmapHead',
        in_channels=512,

projects/uniformer/configs/td-hm_uniformer-s-8xb128-210e_coco-256x192.py

Lines changed: 3 additions & 1 deletion
@@ -9,7 +9,9 @@
        drop_path_rate=0.2,
        init_cfg=dict(
            type='Pretrained',
-            checkpoint='${PATH_TO_YOUR_uniformer_small_in1k.pth}')))
+            checkpoint='https://download.openmmlab.com/mmpose/v1/projects/'
+            'uniformer/uniformer_small_in1k.pth'  # noqa
+        )))

train_dataloader = dict(batch_size=32)
val_dataloader = dict(batch_size=256)

projects/uniformer/configs/td-hm_uniformer-s-8xb128-210e_coco-384x288.py

Lines changed: 3 additions & 1 deletion
@@ -15,7 +15,9 @@
        drop_path_rate=0.2,
        init_cfg=dict(
            type='Pretrained',
-            checkpoint='${PATH_TO_YOUR_uniformer_small_in1k.pth}')))
+            checkpoint='https://download.openmmlab.com/mmpose/v1/projects/'
+            'uniformer/uniformer_small_in1k.pth'  # noqa
+        )))

train_dataloader = dict(batch_size=128)
val_dataloader = dict(batch_size=256)

projects/uniformer/configs/td-hm_uniformer-s-8xb64-210e_coco-448x320.py

Lines changed: 2 additions & 1 deletion
@@ -15,7 +15,8 @@
        drop_path_rate=0.2,
        init_cfg=dict(
            type='Pretrained',
-            checkpoint='${PATH_TO_YOUR_uniformer_small_in1k.pth}')))
+            checkpoint='https://download.openmmlab.com/mmpose/v1/projects/'
+            'uniformer/uniformer_small_in1k.pth')))

train_dataloader = dict(batch_size=64)
val_dataloader = dict(batch_size=256)
