Commit 3dec017 ("2.0.0-rc.3")
Parent: 0a43482

9 files changed, +14 -14 lines

Cargo.toml (3 additions, 3 deletions)

```diff
@@ -23,8 +23,8 @@ exclude = [ 'examples/cudarc' ]
 
 [package]
 name = "ort"
-description = "A safe Rust wrapper for ONNX Runtime 1.17 - Optimize and Accelerate Machine Learning Inferencing"
-version = "2.0.0-rc.2"
+description = "A safe Rust wrapper for ONNX Runtime 1.18 - Optimize and accelerate machine learning inference & training"
+version = "2.0.0-rc.3"
 edition = "2021"
 rust-version = "1.70"
 license = "MIT OR Apache-2.0"
@@ -83,7 +83,7 @@ qnn = [ "ort-sys/qnn" ]
 [dependencies]
 ndarray = { version = "0.15", optional = true }
 thiserror = "1.0"
-ort-sys = { version = "2.0.0-rc.2", path = "ort-sys" }
+ort-sys = { version = "2.0.0-rc.3", path = "ort-sys" }
 libloading = { version = "0.8", optional = true }
 
 ureq = { version = "2.1", optional = true, default-features = false, features = [ "tls" ] }
```

README.md (1 addition, 1 deletion)

```diff
@@ -6,7 +6,7 @@
 <a href="https://crates.io/crates/ort" target="_blank"><img alt="Crates.io" src="https://img.shields.io/crates/v/ort?style=for-the-badge&label=ort&logo=rust"></a> <img alt="ONNX Runtime" src="https://img.shields.io/badge/onnxruntime-v1.18.1-blue?style=for-the-badge&logo=cplusplus">
 </div>
 
-`ort` is an (unofficial) [ONNX Runtime](https://onnxruntime.ai/) 1.18 wrapper for Rust based on the now inactive [`onnxruntime-rs`](https://github.com/nbigaouette/onnxruntime-rs). ONNX Runtime accelerates ML inference on both CPU & GPU.
+`ort` is an (unofficial) [ONNX Runtime](https://onnxruntime.ai/) 1.18 wrapper for Rust based on the now inactive [`onnxruntime-rs`](https://github.com/nbigaouette/onnxruntime-rs). ONNX Runtime accelerates ML inference and training on both CPU & GPU.
 
 ## 📖 Documentation
 - [Guide](https://ort.pyke.io/)
```

docs/pages/_meta.json (1 addition, 1 deletion)

```diff
@@ -10,7 +10,7 @@
 },
 "link-api": {
 "title": "API Reference ↗",
-"href": "https://docs.rs/ort/2.0.0-rc.2/ort"
+"href": "https://docs.rs/ort/2.0.0-rc.3/ort"
 },
 "link-crates": {
 "title": "Crates.io ↗",
```

docs/pages/index.mdx (2 additions, 2 deletions)

````diff
@@ -11,7 +11,7 @@ import { Callout, Card, Cards, Steps } from 'nextra/components';
 </div>
 
 <Callout type='warning'>
-These docs are for the latest alpha version of `ort`, `2.0.0-rc.2`. This version is production-ready (just not API stable) and we recommend new & existing projects use it.
+These docs are for the latest alpha version of `ort`, `2.0.0-rc.3`. This version is production-ready (just not API stable) and we recommend new & existing projects use it.
 </Callout>
 
 `ort` makes it easy to deploy your machine learning models to production via [ONNX Runtime](https://onnxruntime.ai/), a hardware-accelerated inference engine. With `ort` + ONNX Runtime, you can run almost any ML model (including ResNet, YOLOv8, BERT, LLaMA) on almost any hardware, often far faster than PyTorch, and with the added bonus of Rust's efficiency.
@@ -37,7 +37,7 @@ Converting a neural network to a graph representation like ONNX opens the door t
 If you have a [supported platform](/setup/platforms) (and you probably do), installing `ort` couldn't be any simpler! Just add it to your Cargo dependencies:
 ```toml
 [dependencies]
-ort = "2.0.0-rc.2"
+ort = "2.0.0-rc.3"
 ```
 
 ### Convert your model
````

docs/pages/migrating/v2.mdx (1 addition, 1 deletion)

````diff
@@ -173,7 +173,7 @@ let l = outputs["latents"].try_extract_tensor::<f32>()?;
 ```
 
 ## Execution providers
-Execution provider structs with public fields have been replaced with builder pattern structs. See the [API reference](https://docs.rs/ort/2.0.0-rc.2/ort/index.html?search=ExecutionProvider) and the [execution providers reference](/perf/execution-providers) for more information.
+Execution provider structs with public fields have been replaced with builder pattern structs. See the [API reference](https://docs.rs/ort/2.0.0-rc.3/ort/index.html?search=ExecutionProvider) and the [execution providers reference](/perf/execution-providers) for more information.
 
 ```diff
 -// v1.x
````

docs/pages/perf/execution-providers.mdx (1 addition, 1 deletion)

````diff
@@ -83,7 +83,7 @@ fn main() -> anyhow::Result<()> {
 ```
 
 ## Configuring EPs
-EPs have configuration options to control behavior or increase performance. Each `XXXExecutionProvider` struct returns a builder with configuration methods. See the [API reference](https://docs.rs/ort/2.0.0-rc.2/ort/index.html?search=ExecutionProvider) for the EP structs for more information on which options are supported and what they do.
+EPs have configuration options to control behavior or increase performance. Each `XXXExecutionProvider` struct returns a builder with configuration methods. See the [API reference](https://docs.rs/ort/2.0.0-rc.3/ort/index.html?search=ExecutionProvider) for the EP structs for more information on which options are supported and what they do.
 
 ```rust
 use ort::{CoreMLExecutionProvider, Session};
````

docs/pages/setup/cargo-features.mdx (2 additions, 2 deletions)

```diff
@@ -9,8 +9,8 @@ title: Cargo features
 - **`half`**: Enables support for float16 & bfloat16 tensors via the [`half`](https://crates.io/crates/half) crate. ONNX models that are converted to 16-bit precision will typically convert to/from 32-bit floats at the input/output, so you will likely never actually need to interact with a 16-bit tensor on the Rust side. Though, `half` isn't a heavy enough crate to worry about it affecting compile times.
 - **`copy-dylibs`**: In case dynamic libraries are used (like with the CUDA execution provider), creates a symlink to them in the relevant places in the `target` folder to make [compile-time dynamic linking](/setup/linking#compile-time-dynamic-linking) work.
 - ⚒️ **`load-dynamic`**: Enables [runtime dynamic linking](/setup/linking#runtime-loading-with-load-dynamic), which alleviates many of the troubles with compile-time dynamic linking and offers greater flexibility.
-- ⚒️ **`fetch-models`**: Enables the [`SessionBuilder::commit_from_url`](https://docs.rs/ort/2.0.0-rc.2/ort/struct.SessionBuilder.html#method.commit_from_url) method, allowing you to quickly download & run a model from a URL. This should only be used for quick testing.
-- ⚒️ **`operator-libraries`**: Allows for sessions to load custom operators from dynamic C++ libraries via [`SessionBuilder::with_operator_library`](https://docs.rs/ort/2.0.0-rc.2/ort/struct.SessionBuilder.html#method.with_operator_library). If possible, we recommend [writing your custom ops in Rust instead](/perf/custom-operators).
+- ⚒️ **`fetch-models`**: Enables the [`SessionBuilder::commit_from_url`](https://docs.rs/ort/2.0.0-rc.3/ort/struct.SessionBuilder.html#method.commit_from_url) method, allowing you to quickly download & run a model from a URL. This should only be used for quick testing.
+- ⚒️ **`operator-libraries`**: Allows for sessions to load custom operators from dynamic C++ libraries via [`SessionBuilder::with_operator_library`](https://docs.rs/ort/2.0.0-rc.3/ort/struct.SessionBuilder.html#method.with_operator_library). If possible, we recommend [writing your custom ops in Rust instead](/perf/custom-operators).
 
 ## Execution providers
 Each [execution provider](/perf/execution-providers) is also gated behind a Cargo feature.
```
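The doc section above describes `ort`'s optional Cargo features. As an illustrative sketch (not part of this commit), a downstream manifest pinning the release this commit publishes might enable some of those features like so; the particular feature selection here is an assumption for demonstration:

```toml
[dependencies]
# Pin the 2.0.0-rc.3 release candidate; "fetch-models" and "load-dynamic"
# are the optional features documented in cargo-features.mdx above.
ort = { version = "2.0.0-rc.3", features = ["fetch-models", "load-dynamic"] }
```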

docs/pages/setup/platforms.mdx (1 addition, 1 deletion)

```diff
@@ -5,7 +5,7 @@ description: ONNX Runtime, and by extension `ort`, supports a wide variety of pl
 
 import { Callout } from 'nextra/components';
 
-Here are the supported platforms and binary availability status, as of v2.0.0-rc.2.
+Here are the supported platforms and binary availability status, as of v2.0.0-rc.3.
 
 * 🟢 - Supported. Dynamic & static binaries provided by pyke.
 * 🔷 - Supported. Static binaries provided by pyke.
```

ort-sys/Cargo.toml (2 additions, 2 deletions)

```diff
@@ -1,7 +1,7 @@
 [package]
 name = "ort-sys"
-description = "Unsafe Rust bindings for ONNX Runtime 1.17 - Optimize and Accelerate Machine Learning Inferencing"
-version = "2.0.0-rc.2"
+description = "Unsafe Rust bindings for ONNX Runtime 1.18 - Optimize and Accelerate Machine Learning Inferencing"
+version = "2.0.0-rc.3"
 edition = "2021"
 rust-version = "1.70"
 license = "MIT OR Apache-2.0"
```
