This repository was archived by the owner on Jul 14, 2025. It is now read-only.
* deprecated in favor of MLM extension
The *Machine Learning Model (MLM)* extension (https://github.com/crim-ca/mlm-extension) combines the fields that were previously defined in the *Deep Learning Model (DLM)* extension as well as most (if not all) fields proposed by ML-Model. Some fields are renamed to avoid redundant details between the two references, while others are adjusted to allow more flexibility (e.g., not just a docker-compose runtime, but virtually anything). More best practices and examples are provided to demonstrate the use of MLM alongside other STAC extensions to take advantage of the full STAC ecosystem.
Schema for MLM: https://crim-ca.github.io/mlm-extension/v1.0.0/schema.json
* draft migration path doc from ml-model to MLM
* Update README with deprecation and MLM redirect
See https://github.com/stac-extensions/mlm and https://github.com/orgs/stac-utils/discussions/4#discussioncomment-10767685
* address feedback
* address remaining feedback
* correct links
* update migration references
* update & fix markdown
---------
Co-authored-by: Ryan Avery <[email protected]>
# Migration Guide: ML Model Extension to MLM Extension

<!-- lint disable no-undefined-references -->

> [!IMPORTANT]
> For specific field migration details from [ML-Model](README.md) to [Machine Learning Model (MLM)][mlm] please refer
> to the [MLM Migration Document](https://github.com/stac-extensions/mlm/blob/main/docs/legacy/ml-model.md).

<!-- lint enable no-undefined-references -->

## Context

The ML Model Extension was started at Radiant Earth on October 4th, 2021.
It was possibly the first STAC extension dedicated to describing machine learning models.
The extension incorporated inputs from 9 different organizations and was used to describe models
in Radiant Earth's MLHub API. The announcement of this extension and its use in Radiant Earth's MLHub
is described [here](https://medium.com/radiant-earth-insights/geospatial-models-now-available-in-radiant-mlhub-a41eb795d7d7).
Radiant Earth's MLHub API and Python SDK are now [deprecated](https://mlhub.earth/).
In order to support other current users of the ML Model extension, this document lays out a migration path to convert
metadata to the [Machine Learning Model Extension (MLM)][mlm].

## Shared Goals

Both the ML Model Extension and the [Machine Learning Model (MLM)][mlm] extension aim to provide a standard way to
catalog machine learning (ML) models that work with, but are not limited to, Earth observation (EO) data.

Their main goals are:

1. **Search and Discovery**: Helping users find and use ML models.
2. **Describing Inference and Training Requirements**: Making it easier to run these models by describing input requirements and outputs.
3. **Reproducibility**: Providing runtime information and links to assets so that model inference is reproducible.

## Schema Changes

### ML Model Extension

- **Scope**: Item, Collection
- **Field Name Prefix**: `ml-model`
- **Key Sections**:
  - Item Properties
  - Asset Objects
  - Inference/Training Runtimes
  - Relation Types
  - Interpretation of STAC Fields

### MLM Extension

- **Scope**: Collection, Item, Asset, Links
- **Field Name Prefix**: `mlm`
- **Key Sections**:
  - Item Properties and Collection Fields
  - Asset Objects
  - Relation Types
  - Model Input/Output Objects
  - Best Practices

### Notable Differences

- The MLM Extension covers more details at both the Item and Asset levels, making it easier to describe and use model metadata.
- The MLM Extension covers more runtime requirements using distinct asset roles.
- The MLM extension has better integration with the STAC Extensions and Python ecosystem.

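As a concrete illustration of the renaming, a migration script might remap Item properties roughly as follows. This is a hypothetical sketch: the field correspondences shown (e.g., `ml-model:prediction_type` to `mlm:tasks`) are assumptions for demonstration only; the MLM Migration Document is the authoritative reference for the actual mapping.

```python
def migrate_properties(props: dict) -> dict:
    """Translate a subset of `ml-model:*` Item properties to `mlm:*` ones.

    Field correspondences are illustrative assumptions, not the official
    mapping; see the MLM Migration Document for authoritative guidance.
    """
    # Pass through everything that is not an ml-model field.
    out = {k: v for k, v in props.items() if not k.startswith("ml-model:")}
    if "ml-model:architecture" in props:
        out["mlm:architecture"] = props["ml-model:architecture"]
    if "ml-model:prediction_type" in props:
        # MLM describes what a model does as a list of tasks.
        out["mlm:tasks"] = [props["ml-model:prediction_type"]]
    # Other ml-model:* fields (e.g. learning_approach) are dropped here
    # for simplicity; a real migration would handle each field explicitly.
    return out


migrated = migrate_properties({
    "ml-model:learning_approach": "supervised",
    "ml-model:prediction_type": "classification",
    "ml-model:architecture": "unet",
})
print(migrated)
```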
## Getting Help

If you have any questions about a migration, feel free to contact the maintainers by opening a discussion or issue
on the [MLM repository][mlm].

If you see a feature missing in the MLM, feel free to open an issue describing your feature request.
- [Item example](examples/dummy/item.json): Shows the basic usage of the extension in a STAC Item

[…] institutions are making an effort to publish code and examples along with academic publications to enable this kind of reproducibility. However,
the quality and usability of this code and related documentation can vary widely, and there are currently no standards that ensure that a new
researcher could reproduce a given set of published results from the documentation. The STAC ML Model Extension aims to address this issue by
providing a detailed description of the training data and environment used in an ML model experiment.

## Item Properties
#### ml-model:learning_approach

Describes the learning approach used to train the model. It is STRONGLY RECOMMENDED that you use one of the
following values, but other values are allowed.

- `"supervised"`
#### ml-model:prediction_type

Describes the type of predictions made by the model. It is STRONGLY RECOMMENDED that you use one of the
following values, but other values are allowed. Note that not all Prediction Type values are valid
for a given [Learning Approach](#ml-modellearning_approach).
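Taken together, the two fields above might appear in an Item's `properties` as follows (values drawn from the recommended lists; all other Item keys omitted for brevity):

```json
{
  "properties": {
    "ml-model:learning_approach": "supervised",
    "ml-model:prediction_type": "classification"
  }
}
```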
While the Compose file defines nearly all of the parameters required to run the […]
directory containing input data should be mounted to the container and to which host directory the output predictions should be written. The Compose
file MUST define volume mounts for input and output data using the Compose
[Interpolation syntax](https://github.com/compose-spec/compose-spec/blob/master/spec.md#interpolation). The input data volume MUST be defined by an
`INPUT_DATA` variable and the output data volume MUST be defined by an `OUTPUT_DATA` variable.

For example, the following Compose file snippet would mount the host input directory to `/var/data/input` in the container and would mount the host
output data directory to `/var/data/output` in the container. In this contrived example, the script to run the model takes 2 arguments: the […]
### Running tests

The same checks that run on PRs are part of the repository and can be run locally to verify that changes are valid.
To run tests locally, you'll need `npm`, which is a standard part of any [node.js installation](https://nodejs.org/en/download/).

First you'll need to install everything with npm once. Just navigate to the root of this repository and […]