Commit fc570d3

Merge branch 'main' into theme_play

* main: (71 commits)
  Skip TestConstrainedLoad if data missing (SciTools#4319)
  Add 'Good First Issue' label to reasons an issue doesn't go stale (SciTools#4317)
  Gallery: simplify quiver example (SciTools#4120)
  Improve styling in a minor way in docs (SciTools#4314)
  bump version (SciTools#4310)
  Made clear we only test on Linux. (SciTools#4309)
  Updated environment lockfiles (SciTools#4308)
  Include Discussions in Getting Involved. (SciTools#4307)
  Fixed text to show as link. (SciTools#4305)
  [pre-commit.ci] pre-commit autoupdate (SciTools#4299)
  Updated environment lockfiles (SciTools#4298)
  cartopy feature download (SciTools#4304)
  Mesh Loading (AVD-1813) (SciTools#4262)
  reset whatsnew latest (SciTools#4288)
  Updated environment lockfiles (SciTools#4289)
  Update cube.py (SciTools#4287)
  Integrated whatsnew for v3.1 release (rc0) (SciTools#4285)
  Version changes and final whatsnew tweaks for 3v1rc0. (SciTools#4284)
  Missing whatsnew entries for 3.1 release. (SciTools#4283)
  Update CF standard name table to v77 (SciTools#4282)
  ...

2 parents 70e5901 + 06de638

176 files changed

Lines changed: 20421 additions & 4511 deletions


.cirrus.yml

Lines changed: 14 additions & 6 deletions
@@ -26,7 +26,7 @@ env:
   # Maximum cache period (in weeks) before forcing a new cache upload.
   CACHE_PERIOD: "2"
   # Increment the build number to force new cartopy cache upload.
-  CARTOPY_CACHE_BUILD: "0"
+  CARTOPY_CACHE_BUILD: "3"
   # Increment the build number to force new conda cache upload.
   CONDA_CACHE_BUILD: "0"
   # Increment the build number to force new nox cache upload.
@@ -38,7 +38,7 @@ env:
   # Conda packages to be installed.
   CONDA_CACHE_PACKAGES: "nox pip"
   # Git commit hash for iris test data.
-  IRIS_TEST_DATA_VERSION: "2.0.0"
+  IRIS_TEST_DATA_VERSION: "2.4"
   # Base directory for the iris-test-data.
   IRIS_TEST_DATA_DIR: ${HOME}/iris-test-data
@@ -72,6 +72,14 @@ linux_task_template: &LINUX_TASK_TEMPLATE
     fingerprint_script:
       - echo "${CIRRUS_OS}"
       - echo "$(date +%Y).$(expr $(date +%U) / ${CACHE_PERIOD}):${CARTOPY_CACHE_BUILD}"
+    populate_script:
+      - conda create --quiet --name cartopy-cache cartopy
+      - source ${HOME}/miniconda/etc/profile.d/conda.sh >/dev/null 2>&1
+      - conda activate cartopy-cache >/dev/null 2>&1
+      - cd $(mktemp -d)
+      - wget --quiet https://raw.githubusercontent.com/SciTools/cartopy/master/tools/cartopy_feature_download.py
+      - python cartopy_feature_download.py physical --output ${HOME}/.local/share/cartopy --no-warn
+      - conda deactivate >/dev/null 2>&1
   nox_cache:
     folder: ${CIRRUS_WORKING_DIR}/.nox
     reupload_on_changes: true
@@ -105,7 +113,7 @@ iris_test_data_template: &IRIS_TEST_DATA_TEMPLATE
 #
 # Linting
 #
-precommit_task:
+task:
   only_if: ${SKIP_LINT_TASK} == ""
   << : *CREDITS_TEMPLATE
   auto_cancellation: true
@@ -129,7 +137,7 @@ precommit_task:
 #
 # Testing (Linux)
 #
-test_task:
+task:
   only_if: ${SKIP_TEST_TASK} == ""
   << : *CREDITS_TEMPLATE
   matrix:
@@ -155,7 +163,7 @@ test_task:
 #
 # Documentation Testing and Gallery (Linux)
 #
-doctest_task:
+task:
   only_if: ${SKIP_DOCTEST_TASK} == "" && ${SKIP_ALL_DOC_TASKS} == ""
   << : *CREDITS_TEMPLATE
   env:
@@ -179,7 +187,7 @@ doctest_task:
 #
 # Documentation Link Check (Linux)
 #
-linkcheck_task:
+task:
   only_if: ${SKIP_LINKCHECK_TASK} == "" && ${SKIP_ALL_DOC_TASKS} == ""
   << : *CREDITS_TEMPLATE
   env:
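For reference, the `fingerprint_script` above keys the cartopy cache on a coarse time bucket plus a build number. A minimal Python sketch of the same arithmetic (the function name is ours, not part of the repo):

```python
from datetime import date


def cache_fingerprint(today: date, cache_period: int, cache_build: int) -> str:
    # Mimics: echo "$(date +%Y).$(expr $(date +%U) / ${CACHE_PERIOD}):${CARTOPY_CACHE_BUILD}"
    # The week-of-year is integer-divided by CACHE_PERIOD, so the key only
    # rolls over every `cache_period` weeks -- or when the build number is bumped.
    week = int(today.strftime("%U"))  # week of year, Sunday-first, like `date +%U`
    return f"{today.year}.{week // cache_period}:{cache_build}"


# Consecutive days in the same two-week bucket share a key; bumping
# CARTOPY_CACHE_BUILD (0 -> 3 in this commit) forces a fresh cache upload.
assert cache_fingerprint(date(2021, 9, 6), 2, 3) == cache_fingerprint(date(2021, 9, 7), 2, 3)
assert cache_fingerprint(date(2021, 9, 6), 2, 3) != cache_fingerprint(date(2021, 9, 6), 2, 0)
```

This is why the commit bumps `CARTOPY_CACHE_BUILD` rather than deleting the cache by hand: changing the build number changes the fingerprint, which invalidates the old cache entry.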

.github/workflows/benchmark.yml

Lines changed: 48 additions & 0 deletions
@@ -0,0 +1,48 @@
+# This is a basic workflow to help you get started with Actions
+
+name: benchmark-check
+
+on:
+  # Triggers the workflow on push or pull request events but only for the master branch
+  pull_request:
+
+jobs:
+  benchmark:
+    runs-on: ubuntu-latest
+
+    steps:
+      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
+      - uses: actions/checkout@v2
+
+      - name: Checkout the branches
+        run: |
+          git fetch --depth=1 origin ${{ github.event.pull_request.base.ref }}
+          git branch _base FETCH_HEAD
+          git fetch --depth=1 origin ${{ github.ref }}
+          git branch _head FETCH_HEAD
+
+      - name: Setup asv
+        run: |
+          pip install asv
+          cd benchmarks
+          asv machine --yes
+
+      - name: Run benchmarks on source and target
+        run: |
+          cd benchmarks
+          asv continuous --factor 1.2 _base _head
+
+      - name: Write a compare file to the output folder
+        if: ${{ always() }}
+        run: |
+          cd benchmarks
+          asv compare -s _base _head > .asv/compare.txt
+
+      - name: Archive asv results
+        if: ${{ always() }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: asv-report
+          path: |
+            benchmarks/.asv/results
+            benchmarks/.asv/compare.txt
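The `asv continuous --factor 1.2` step fails the job when a benchmark on the PR head runs more than 1.2x slower than on the merge base. A rough model of that threshold rule (simplified; not asv's actual implementation):

```python
def flag_regressions(base_times, head_times, factor=1.2):
    """Return {benchmark: ratio} for benchmarks whose head/base time ratio
    exceeds `factor` -- a simplified model of the pass/fail rule behind
    `asv continuous --factor 1.2 _base _head`."""
    regressions = {}
    for name, base in base_times.items():
        head = head_times.get(name)
        if head is not None and head > base * factor:
            regressions[name] = head / base
    return regressions


base = {"time_create": 1.00, "time_return": 0.50}  # seconds on _base
head = {"time_create": 1.10, "time_return": 0.75}  # seconds on _head
# 1.10/1.00 = 1.1x is within tolerance; 0.75/0.50 = 1.5x is flagged.
assert flag_regressions(base, head) == {"time_return": 1.5}
```

The later `asv compare` step then writes the same base-vs-head comparison as text so it can be uploaded even when the threshold check fails (hence `if: ${{ always() }}`).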

.github/workflows/stale.yml

Lines changed: 78 additions & 0 deletions
@@ -0,0 +1,78 @@
+# See https://github.com/actions/stale
+
+name: Stale issues and pull-requests
+on:
+  schedule:
+    - cron: 0 0 * * *
+
+jobs:
+  stale:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/stale@v4
+        with:
+          repo-token: ${{ secrets.GITHUB_TOKEN }}
+
+          # Idle number of days before marking issues/prs stale.
+          days-before-stale: 500
+
+          # Idle number of days before closing stale issues/prs.
+          days-before-close: 28
+
+          # Comment on the staled issues.
+          stale-issue-message: |
+            In order to maintain a backlog of relevant issues, we automatically label them as stale after 500 days of inactivity.
+
+            If this issue is still important to you, then please comment on this issue and the stale label will be removed.
+
+            Otherwise this issue will be automatically closed in 28 days time.
+
+          # Comment on the staled prs.
+          stale-pr-message: |
+            In order to maintain a backlog of relevant PRs, we automatically label them as stale after 500 days of inactivity.
+
+            If this PR is still important to you, then please comment on this PR and the stale label will be removed.
+
+            Otherwise this PR will be automatically closed in 28 days time.
+
+          # Comment on the staled issues while closed.
+          close-issue-message: |
+            This stale issue has been automatically closed due to a lack of community activity.
+
+            If you still care about this issue, then please either:
+              * Re-open this issue, if you have sufficient permissions, or
+              * Add a comment pinging `@SciTools/iris-devs` who will re-open on your behalf.
+
+          # Comment on the staled prs while closed.
+          close-pr-message: |
+            This stale PR has been automatically closed due to a lack of community activity.
+
+            If you still care about this PR, then please either:
+              * Re-open this PR, if you have sufficient permissions, or
+              * Add a comment pinging `@SciTools/iris-devs` who will re-open on your behalf.
+
+          # Label to apply on staled issues.
+          stale-issue-label: Stale
+
+          # Label to apply on staled prs.
+          stale-pr-label: Stale
+
+          # Labels on issues exempted from stale.
+          exempt-issue-labels: |
+            "Status: Blocked,Status: Decision Required,Peloton 🚴‍♂️,Good First Issue"
+
+          # Labels on prs exempted from stale.
+          exempt-pr-labels: |
+            "Status: Blocked,Status: Decision Required,Peloton 🚴‍♂️,Good First Issue"
+
+          # Max number of operations per run.
+          operations-per-run: 300
+
+          # Remove stale label from issues/prs on updates/comments.
+          remove-stale-when-updated: true
+
+          # Order to get issues/PRs.
+          ascending: true
+
+          # Exempt all issues/prs with milestones from stale.
+          exempt-all-milestones: true
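With `days-before-stale: 500` and `days-before-close: 28`, an untouched issue is labelled Stale after roughly 500 idle days and closed about 28 days after that. A small sketch of that timeline (simplified: it ignores exempt labels, milestones, and label removal on new activity, all of which the action handles):

```python
from datetime import date, timedelta

DAYS_BEFORE_STALE = 500  # days-before-stale above
DAYS_BEFORE_CLOSE = 28   # days-before-close above


def issue_state(last_activity: date, today: date) -> str:
    """Classify an issue under the stale policy above (simplified model)."""
    idle_days = (today - last_activity).days
    if idle_days < DAYS_BEFORE_STALE:
        return "active"
    if idle_days < DAYS_BEFORE_STALE + DAYS_BEFORE_CLOSE:
        return "stale"
    return "closed"


last = date(2020, 1, 1)
assert issue_state(last, last + timedelta(days=499)) == "active"
assert issue_state(last, last + timedelta(days=500)) == "stale"
assert issue_state(last, last + timedelta(days=528)) == "closed"
```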

.pre-commit-config.yaml

Lines changed: 3 additions & 3 deletions
@@ -28,7 +28,7 @@ repos:
   - id: no-commit-to-branch

 - repo: https://github.com/psf/black
-  rev: 21.7b0
+  rev: 21.8b0
   hooks:
   - id: black
     pass_filenames: false
@@ -42,14 +42,14 @@ repos:
     args: [--config=./setup.cfg]

 - repo: https://github.com/pycqa/isort
-  rev: 5.9.2
+  rev: 5.9.3
   hooks:
   - id: isort
     types: [file, python]
    args: [--filter-files]

 - repo: https://github.com/asottile/blacken-docs
-  rev: v1.10.0
+  rev: v1.11.0
   hooks:
   - id: blacken-docs
     types: [file, rst]

MANIFEST.in

Lines changed: 1 addition & 0 deletions
@@ -4,6 +4,7 @@ include CHANGES COPYING COPYING.LESSER
 # Files from setup.py package_data that are not automatically added to source distributions
 recursive-include lib/iris/tests/results *.cml *.cdl *.txt *.xml *.json
 recursive-include lib/iris/etc *
+include lib/iris/tests/stock/file_headers/*

 recursive-include requirements *

asv.conf.json

Lines changed: 0 additions & 36 deletions
This file was deleted.

benchmarks/asv.conf.json

Lines changed: 17 additions & 0 deletions
@@ -0,0 +1,17 @@
+{
+    "version": 1,
+    "project": "scitools-iris",
+    "project_url": "https://github.com/SciTools/iris",
+    "repo": "..",
+    "environment_type": "conda-lock",
+    "show_commit_url": "http://github.com/scitools/iris/commit/",
+
+    "benchmark_dir": "./benchmarks",
+    "env_dir": ".asv/env",
+    "results_dir": ".asv/results",
+    "html_dir": ".asv/html",
+    "plugins": [".conda_lock_plugin"],
+    // this is not an asv standard config entry, just for our plugin
+    // path to lockfile, relative to project base
+    "conda_lockfile": "requirements/ci/nox.lock/py38-linux-64.lock"
+}

benchmarks/benchmarks/__init__.py

Lines changed: 49 additions & 0 deletions
@@ -0,0 +1,49 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""Common code for benchmarks."""
+
+import os
+from pathlib import Path
+
+# Environment variable names
+_ASVDIR_VARNAME = "ASV_DIR"  # As set in nightly script "asv_nightly/asv.sh"
+_DATADIR_VARNAME = "BENCHMARK_DATA"  # For local runs
+
+ARTIFICIAL_DIM_SIZE = int(10e3)  # For all artificial cubes, coords etc.
+
+# Work out where the benchmark data dir is.
+asv_dir = os.environ.get("ASV_DIR", None)
+if asv_dir:
+    # For an overnight run, this comes from the 'ASV_DIR' setting.
+    benchmark_data_dir = Path(asv_dir) / "data"
+else:
+    # For a local run, you set 'BENCHMARK_DATA'.
+    benchmark_data_dir = os.environ.get(_DATADIR_VARNAME, None)
+    if benchmark_data_dir is not None:
+        benchmark_data_dir = Path(benchmark_data_dir)
+
+
+def testdata_path(*path_names):
+    """
+    Return the path of a benchmark test data file.
+
+    These are based from a test-data location dir, which is either
+    ${}/data (for overnight tests), or ${} for local testing.
+
+    If neither of these were set, an error is raised.
+
+    """.format(
+        _ASVDIR_VARNAME, _DATADIR_VARNAME
+    )
+    if benchmark_data_dir is None:
+        msg = (
+            "Benchmark data dir is not defined : "
+            'Either "${}" or "${}" must be set.'
+        )
+        raise (ValueError(msg.format(_ASVDIR_VARNAME, _DATADIR_VARNAME)))
+    path = benchmark_data_dir.joinpath(*path_names)
+    path = str(path)  # Because Iris doesn't understand Path objects yet.
+    return path
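The module above resolves the benchmark data directory from two environment variables, preferring the nightly `ASV_DIR` (with data under its `data` subdirectory) over the local `BENCHMARK_DATA`. The lookup in isolation, taking the environment as a plain dict for testability:

```python
from pathlib import Path


def resolve_data_dir(env):
    """Mirror the lookup in benchmarks/__init__.py: ASV_DIR (nightly runs)
    wins, with data expected under its `data` subdirectory; otherwise
    BENCHMARK_DATA (local runs) is used directly; otherwise None."""
    asv_dir = env.get("ASV_DIR")
    if asv_dir:
        return Path(asv_dir) / "data"
    data_dir = env.get("BENCHMARK_DATA")
    return Path(data_dir) if data_dir else None


assert resolve_data_dir({"ASV_DIR": "/nightly"}) == Path("/nightly/data")
assert resolve_data_dir({"BENCHMARK_DATA": "/local/bench"}) == Path("/local/bench")
assert resolve_data_dir({}) is None  # testdata_path() would raise ValueError
```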
benchmarks/benchmarks/aux_factory.py

Lines changed: 59 additions & 0 deletions
@@ -0,0 +1,59 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""
+AuxFactory benchmark tests.
+
+"""
+
+import numpy as np
+
+from benchmarks import ARTIFICIAL_DIM_SIZE
+from iris import aux_factory, coords
+
+
+class FactoryCommon:
+    # TODO: once https://github.com/airspeed-velocity/asv/pull/828 is released:
+    #  * make class an ABC
+    #  * remove NotImplementedError
+    #  * combine setup_common into setup
+
+    """
+    A base class running a generalised suite of benchmarks for any factory.
+    Factory to be specified in a subclass.
+
+    ASV will run the benchmarks within this class for any subclasses.
+
+    Should only be instantiated within subclasses, but cannot enforce this
+    since ASV cannot handle classes that include abstract methods.
+    """
+
+    def setup(self):
+        """Prevent ASV instantiating (must therefore override setup() in any subclasses.)"""
+        raise NotImplementedError
+
+    def setup_common(self):
+        """Shared setup code that can be called by subclasses."""
+        self.factory = self.create()
+
+    def time_create(self):
+        """Create an instance of the benchmarked factory. create method is
+        specified in the subclass."""
+        self.create()
+
+    def time_return(self):
+        """Return an instance of the benchmarked factory."""
+        self.factory
+
+
+class HybridHeightFactory(FactoryCommon):
+    def setup(self):
+        data_1d = np.zeros(ARTIFICIAL_DIM_SIZE)
+        self.coord = coords.AuxCoord(points=data_1d, units="m")
+
+        self.setup_common()
+
+    def create(self):
+        return aux_factory.HybridHeightFactory(delta=self.coord)
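ASV runs any `time_*` method it finds on a benchmark class, including inherited ones, which is why `HybridHeightFactory` gets `time_create` and `time_return` for free and why the base class's `setup()` raises `NotImplementedError` to keep ASV from benchmarking `FactoryCommon` directly. A sketch of that discovery rule (simplified; not asv's real collector):

```python
import inspect


def collect_benchmarks(cls):
    """Roughly how ASV discovers timing benchmarks: every callable
    `time_*` attribute reachable on the class, inherited ones included."""
    return sorted(
        name
        for name, member in inspect.getmembers(cls, callable)
        if name.startswith("time_")
    )


class Common:
    def setup(self):
        raise NotImplementedError  # keeps the base class from being run directly

    def time_create(self):
        pass


class Concrete(Common):
    def setup(self):
        pass  # overriding setup() makes this subclass runnable


# The subclass inherits the benchmark method from its base.
assert collect_benchmarks(Concrete) == ["time_create"]
```

Adding another factory benchmark then only requires a subclass with `setup()` and `create()`, exactly as `HybridHeightFactory` does above.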
