Commit 52ee52b

Merge branch 'dev-0.x' into ViTPose

2 parents: 4bb5213 + fd98b11
File tree: 20 files changed (+282 -180 lines)

.pre-commit-config.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -50,7 +50,7 @@ repos:
         name: update-model-index
         description: Collect model information and update model-index.yml
         entry: .dev_scripts/github/update_model_index.py
-        additional_dependencies: [mmcv]
+        additional_dependencies: ['mmcv==1.7.1']
        language: python
         files: ^configs/.*\.md$
         require_serial: true
```
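For context, pre-commit installs each hook's `additional_dependencies` into an isolated environment, so pinning `mmcv==1.7.1` here keeps that environment reproducible. A sketch of what the full hook entry looks like after this change (the `repo: local` and `id` values are assumptions inferred from the fields shown, not taken from the diff):

```yaml
repos:
  - repo: local                       # assumed: script hooks are usually local
    hooks:
      - id: update-model-index        # assumed hook id
        name: update-model-index
        description: Collect model information and update model-index.yml
        entry: .dev_scripts/github/update_model_index.py
        additional_dependencies: ['mmcv==1.7.1']  # pinned by this commit
        language: python
        files: ^configs/.*\.md$
        require_serial: true
```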

README.md

Lines changed: 64 additions & 64 deletions
Large diffs are not rendered by default.

README_CN.md

Lines changed: 84 additions & 84 deletions
Large diffs are not rendered by default.

configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_udp_coco.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -60,4 +60,4 @@ Results on COCO val2017 with detector having human AP of 56.4 on COCO val2017 da
 | [pose_hrnet_w48_udp](/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_384x288_udp.py) | 384x288 | 0.772 | 0.910 | 0.835 | 0.820 | 0.945 | [ckpt](https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w48_coco_384x288_udp-0f89c63e_20210223.pth) | [log](https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w48_coco_384x288_udp_20210223.log.json) |
 | [pose_hrnet_w32_udp_regress](/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_udp_regress.py) | 256x192 | 0.758 | 0.908 | 0.823 | 0.812 | 0.943 | [ckpt](https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w32_coco_256x192_udp_regress-be2dbba4_20210222.pth) | [log](https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w32_coco_256x192_udp_regress_20210222.log.json) |
 
-Note that, UDP also adopts the unbiased encoding/decoding algorithm of [DARK](https://mmpose.readthedocs.io/en/latest/papers/techniques.html#div-align-center-darkpose-cvpr-2020-div).
+Note that, UDP also adopts the unbiased encoding/decoding algorithm of [DARK](https://mmpose.readthedocs.io/en/0.x/papers/techniques.html#div-align-center-darkpose-cvpr-2020-div).
```

demo/MMPose_Tutorial.ipynb

Lines changed: 2 additions & 2 deletions

```diff
@@ -463,7 +463,7 @@
 "\n",
 "### Add a new dataset\n",
 "\n",
-"There are two methods to support a customized dataset in MMPose. The first one is to convert the data to a supported format (e.g. COCO) and use the corresponding dataset class (e.g. TopdownCOCODataset), as described in the [document](https://mmpose.readthedocs.io/en/latest/tutorials/2_new_dataset.html#reorganize-dataset-to-existing-format). The second one is to add a new dataset class. In this tutorial, we give an example of the second method.\n",
+"There are two methods to support a customized dataset in MMPose. The first one is to convert the data to a supported format (e.g. COCO) and use the corresponding dataset class (e.g. TopdownCOCODataset), as described in the [document](https://mmpose.readthedocs.io/en/0.x/tutorials/2_new_dataset.html#reorganize-dataset-to-existing-format). The second one is to add a new dataset class. In this tutorial, we give an example of the second method.\n",
 "\n",
 "We first download the demo dataset, which contains 100 samples (75 for training and 25 for validation) selected from COCO train2017 dataset. The annotations are stored in a different format from the original COCO format.\n",
 "\n"
@@ -925,7 +925,7 @@
 "source": [
 "### Create a config file\n",
 "\n",
-"In the next step, we create a config file which configures the model, dataset and runtime settings. More information can be found at [Learn about Configs](https://mmpose.readthedocs.io/en/latest/tutorials/0_config.html). A common practice to create a config file is deriving from a existing one. In this tutorial, we load a config file that trains a HRNet on COCO dataset, and modify it to adapt to the COCOTiny dataset."
+"In the next step, we create a config file which configures the model, dataset and runtime settings. More information can be found at [Learn about Configs](https://mmpose.readthedocs.io/en/0.x/tutorials/0_config.html). A common practice to create a config file is deriving from a existing one. In this tutorial, we load a config file that trains a HRNet on COCO dataset, and modify it to adapt to the COCOTiny dataset."
 ]
 },
 {
```
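The "add a new dataset class" path mentioned in the tutorial relies on the registry pattern used throughout OpenMMLab: a dataset class is decorated with `@DATASETS.register_module()` and later instantiated by name from a config dict. A minimal, self-contained sketch of that mechanism (the `Registry` class below is a simplified stand-in for mmcv's implementation, and `COCOTinyDataset`'s constructor arguments are illustrative):

```python
# Simplified sketch of the OpenMMLab registry pattern used for datasets.
# `Registry` is a toy stand-in for mmcv's Registry class.

class Registry:
    def __init__(self, name):
        self.name = name
        self._modules = {}

    def register_module(self):
        # Returns a class decorator that records the class under its name.
        def decorator(cls):
            self._modules[cls.__name__] = cls
            return cls
        return decorator

    def build(self, cfg):
        # cfg is a dict whose 'type' key names a registered class;
        # the remaining keys become constructor arguments.
        cfg = dict(cfg)
        cls = self._modules[cfg.pop('type')]
        return cls(**cfg)


DATASETS = Registry('dataset')


@DATASETS.register_module()
class COCOTinyDataset:
    """Hypothetical dataset class for the tutorial's 100-sample subset."""

    def __init__(self, ann_file, img_prefix):
        self.ann_file = ann_file
        self.img_prefix = img_prefix


# A config file then refers to the class purely by name:
dataset = DATASETS.build(
    dict(type='COCOTinyDataset', ann_file='train.json', img_prefix='images/'))
print(type(dataset).__name__)  # COCOTinyDataset
```

This is why registering the new class is enough: the config's `type` string is all the framework needs to construct it.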

demo/docs/2d_animal_demo.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -7,7 +7,7 @@
 We provide a demo script to test a single image, given gt json file.
 
 *Pose Model Preparation:*
-The pre-trained pose estimation model can be downloaded from [model zoo](https://mmpose.readthedocs.io/en/latest/topics/animal.html).
+The pre-trained pose estimation model can be downloaded from [model zoo](https://mmpose.readthedocs.io/en/0.x/topics/animal.html).
 Take [macaque model](https://download.openmmlab.com/mmpose/animal/resnet/res50_macaque_256x192-98f1dd3a_20210407.pth) as an example:
 
 ```shell
@@ -113,7 +113,7 @@ python demo/top_down_video_demo_with_mmdet.py \
 **Other Animals**
 
 For other animals, we have also provided some pre-trained animal detection models (1-class models). Supported models can be found in [det model zoo](/demo/docs/mmdet_modelzoo.md).
-The pre-trained animal pose estimation model can be found in [pose model zoo](https://mmpose.readthedocs.io/en/latest/topics/animal.html).
+The pre-trained animal pose estimation model can be found in [pose model zoo](https://mmpose.readthedocs.io/en/0.x/topics/animal.html).
 
 ```shell
 python demo/top_down_video_demo_with_mmdet.py \
````

demo/docs/2d_face_demo.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -9,7 +9,7 @@
 We provide a demo script to test a single image, given gt json file.
 
 *Face Keypoint Model Preparation:*
-The pre-trained face keypoint estimation model can be found from [model zoo](https://mmpose.readthedocs.io/en/latest/topics/face.html).
+The pre-trained face keypoint estimation model can be found from [model zoo](https://mmpose.readthedocs.io/en/0.x/topics/face.html).
 Take [aflw model](https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_aflw_256x256-f2bbc62b_20210125.pth) as an example:
 
 ```shell
````

demo/docs/2d_hand_demo.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -9,7 +9,7 @@
 We provide a demo script to test a single image, given gt json file.
 
 *Hand Pose Model Preparation:*
-The pre-trained hand pose estimation model can be downloaded from [model zoo](https://mmpose.readthedocs.io/en/latest/topics/hand%282d%2Ckpt%2Crgb%2Cimg%29.html).
+The pre-trained hand pose estimation model can be downloaded from [model zoo](https://mmpose.readthedocs.io/en/0.x/topics/hand%282d%2Ckpt%2Crgb%2Cimg%29.html).
 Take [onehand10k model](https://download.openmmlab.com/mmpose/top_down/resnet/res50_onehand10k_256x256-e67998f6_20200813.pth) as an example:
 
 ```shell
@@ -50,7 +50,7 @@ Assume that you have already installed [mmdet](https://github.com/open-mmlab/mmd
 
 *Hand Box Model Preparation:* The pre-trained hand box estimation model can be found in [det model zoo](/demo/docs/mmdet_modelzoo.md).
 
-*Hand Pose Model Preparation:* The pre-trained hand pose estimation model can be downloaded from [pose model zoo](https://mmpose.readthedocs.io/en/latest/topics/hand%282d%2Ckpt%2Crgb%2Cimg%29.html).
+*Hand Pose Model Preparation:* The pre-trained hand pose estimation model can be downloaded from [pose model zoo](https://mmpose.readthedocs.io/en/0.x/topics/hand%282d%2Ckpt%2Crgb%2Cimg%29.html).
 
 ```shell
 python demo/top_down_img_demo_with_mmdet.py \
@@ -80,7 +80,7 @@ Assume that you have already installed [mmdet](https://github.com/open-mmlab/mmd
 
 *Hand Box Model Preparation:* The pre-trained hand box estimation model can be found in [det model zoo](/demo/docs/mmdet_modelzoo.md).
 
-*Hand Pose Model Preparation:* The pre-trained hand pose estimation model can be found in [pose model zoo](https://mmpose.readthedocs.io/en/latest/topics/hand%282d%2Ckpt%2Crgb%2Cimg%29.html).
+*Hand Pose Model Preparation:* The pre-trained hand pose estimation model can be found in [pose model zoo](https://mmpose.readthedocs.io/en/0.x/topics/hand%282d%2Ckpt%2Crgb%2Cimg%29.html).
 
 ```shell
 python demo/top_down_video_demo_with_mmdet.py \
````

demo/docs/webcam_demo.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -63,7 +63,7 @@ Detailed configurations can be found in the config file.
 ```
 
 - **Configure pose estimation models**
-In this demo we use two [top-down](https://github.com/open-mmlab/mmpose/tree/master/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap) pose estimation models for humans and animals respectively. Users can choose models from the [MMPose Model Zoo](https://mmpose.readthedocs.io/en/latest/modelzoo.html). To apply different pose models on different instance types, you can add multiple pose estimator nodes with `cls_names` set accordingly.
+In this demo we use two [top-down](https://github.com/open-mmlab/mmpose/tree/master/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap) pose estimation models for humans and animals respectively. Users can choose models from the [MMPose Model Zoo](https://mmpose.readthedocs.io/en/0.x/modelzoo.html). To apply different pose models on different instance types, you can add multiple pose estimator nodes with `cls_names` set accordingly.
 
 ```python
 # 'TopDownPoseEstimatorNode':
````
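The `cls_names` mechanism described in that hunk can be sketched as simple routing: each pose estimator node declares which detection classes it handles, and each detected instance is dispatched to a matching node. The node dicts and `route` helper below are hypothetical illustrations, not the actual webcam-API code; only the `cls_names` key mirrors the real config:

```python
# Illustrative routing of detections to pose estimator nodes by class name.
# Only `cls_names` mirrors the webcam demo config; everything else is a sketch.

pose_nodes = [
    dict(name='human pose estimator', cls_names=['person']),
    dict(name='animal pose estimator', cls_names=['cat', 'dog', 'horse']),
]


def route(detections, nodes):
    """Assign each detection to the first node whose cls_names contains its label."""
    assignment = {node['name']: [] for node in nodes}
    for det in detections:
        for node in nodes:
            if det['label'] in node['cls_names']:
                assignment[node['name']].append(det)
                break  # each instance goes to exactly one node
    return assignment


dets = [{'label': 'person'}, {'label': 'dog'}, {'label': 'person'}]
routed = route(dets, pose_nodes)
print(len(routed['human pose estimator']))  # 2
```

Adding another estimator for a new instance type is then just another node dict with its own `cls_names`.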

docs/en/get_started.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -132,7 +132,7 @@ python demo/top_down_img_demo.py \
     --out-img-root vis_results
 ```
 
-More examples and details can be found in the [demo folder](/demo) and the [demo docs](https://mmpose.readthedocs.io/en/latest/demo.html).
+More examples and details can be found in the [demo folder](/demo) and the [demo docs](https://mmpose.readthedocs.io/en/0.x/demo.html).
 
 ## Train a model
````
