Merge pull request #74 from nbigaouette/apple-m1
Add Apple M1 support
nbigaouette authored Apr 11, 2021
2 parents 3b5fcd3 + 312952b commit 01bba59
Showing 12 changed files with 9,729 additions and 22 deletions.
4 changes: 4 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,4 @@
{
"rust-analyzer.checkOnSave.command": "clippy",
"editor.formatOnSave": true
}
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -10,6 +10,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added

- Add `String` datatype ([#58](https://github.com/nbigaouette/onnxruntime-rs/pull/58))
- Initial support for Apple M1 ([#74](https://github.com/nbigaouette/onnxruntime-rs/pull/74))

## [0.0.11] - 2021-02-22

6 changes: 4 additions & 2 deletions ONNX_Compilation_Notes.md
@@ -8,13 +8,15 @@ brew install llvm cmake

# bindgen needs this to find llvm/clang
export LLVM_CONFIG_PATH=/usr/local/opt/llvm/bin/llvm-config
# Or on macOS Big Sur:
export LLVM_CONFIG_PATH=/opt/homebrew/opt/llvm/bin/llvm-config
```

-The `build.rs` script uses the `ONNXRUNTIME_INSTALL_DIR` environment variable to
+The `build.rs` script uses the `ORT_LIB_LOCATION` environment variable to
find the built library and its headers. Make sure to point to the proper location:

```sh
-export ONNXRUNTIME_INSTALL_DIR=/full/path/to/onnxruntime
+export ORT_LIB_LOCATION=/full/path/to/onnxruntime
```

**NOTE**: The [`.cargo/config`](.cargo/config) file assumes the library is installed
26 changes: 25 additions & 1 deletion README.md
@@ -23,6 +23,7 @@ CPU:

* Linux x86_64
* macOS x86_64
* macOS aarch64 (no pre-built binaries, no CI testing, see [#74](https://github.com/nbigaouette/onnxruntime-rs/pull/74))
* Windows i686
* Windows x86_64

@@ -53,7 +54,7 @@ Three different strategies to obtain the ONNX Runtime are supported by the `build.
To select which strategy to use, set the `ORT_STRATEGY` environment variable to:

1. `download`: This is the default if `ORT_STRATEGY` is not set;
-2. `system`: To use a locally installed version
+2. `system`: To use a locally installed version (use `ORT_LIB_LOCATION` environment variable to point to the install path)
3. `compile`: To compile the library

The `download` strategy supports downloading a version of ONNX that supports CUDA. To use this, set the
@@ -62,6 +63,29 @@ environment variable `ORT_USE_CUDA=1` (only supports Linux or Windows).
Until the build script allows compilation of the runtime, see the [compilation notes](ONNX_Compilation_Notes.md)
for some details on the process.

### Note on 'ORT_STRATEGY=system'

When using `ORT_STRATEGY=system`, executing a built crate binary (for example the tests) might fail, at least on macOS,
if the library is not installed in a system path. An error similar to the following happens:

```text
dyld: Library not loaded: @rpath/libonnxruntime.1.7.1.dylib
Referenced from: onnxruntime-rs.git/target/debug/deps/onnxruntime_sys-22eb0e3e89a0278c
Reason: image not found
```

To fix, one can either:

* Set the `LD_LIBRARY_PATH` environment variable to point to the path where the library can be found.
* Adapt the `.cargo/config` file to contain a linker flag to provide the **full** path:

```toml
[target.aarch64-apple-darwin]
rustflags = ["-C", "link-args=-Wl,-rpath,/full/path/to/onnxruntime/lib"]
```

See [rust-lang/cargo #5077](https://github.com/rust-lang/cargo/issues/5077) for more information.
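As a sketch of the first option, reusing the `/full/path/to/onnxruntime` placeholder from above (substitute the real directory that contains `libonnxruntime.*.dylib`):

```sh
# Placeholder prefix -- point this at the directory that actually
# contains libonnxruntime.*.dylib.
export LD_LIBRARY_PATH="/full/path/to/onnxruntime/lib:${LD_LIBRARY_PATH}"
# Then re-run the failing binary, e.g.: cargo test
```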

## Example

The C++ example that uses the C API
2 changes: 1 addition & 1 deletion onnxruntime-sys/Cargo.toml
@@ -17,7 +17,7 @@ keywords = ["neuralnetworks", "onnx", "bindings"]
[dependencies]

[build-dependencies]
-bindgen = {version = "0.55", optional = true}
+bindgen = { version = "0.55", optional = true }
ureq = "1.5.1"

# Used on Windows
16 changes: 13 additions & 3 deletions onnxruntime-sys/build.rs
@@ -61,12 +61,22 @@ fn main() {
#[cfg(not(feature = "generate-bindings"))]
fn generate_bindings(_include_dir: &Path) {
println!("Bindings not generated automatically, using committed files instead.");
-println!("Enable with the 'bindgen' cargo feature.");
+println!("Enable with the 'generate-bindings' cargo feature.");
}

#[cfg(feature = "generate-bindings")]
fn generate_bindings(include_dir: &Path) {
-let clang_arg = format!("-I{}", include_dir.display());
+let clang_args = &[
+    format!("-I{}", include_dir.display()),
+    format!(
+        "-I{}",
+        include_dir
+            .join("onnxruntime")
+            .join("core")
+            .join("session")
+            .display()
+    ),
+];

// Tell cargo to invalidate the built crate whenever the wrapper changes
println!("cargo:rerun-if-changed=wrapper.h");
@@ -80,7 +90,7 @@ fn generate_bindings(include_dir: &Path) {
// bindings for.
.header("wrapper.h")
// The current working directory is 'onnxruntime-sys'
-.clang_arg(clang_arg)
+.clang_args(clang_args)
// Tell cargo to invalidate the built crate whenever any of the
// included header files changed.
.parse_callbacks(Box::new(bindgen::CargoCallbacks))
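The hunk above replaces bindgen's single `-I` flag with two: one for the include root and one for the nested `onnxruntime/core/session` directory, presumably because some install layouts (such as the locally compiled aarch64 build) keep the C API headers there. A minimal, self-contained sketch of how those flags are assembled (the directory path is illustrative; the real build derives it from the install location):

```rust
use std::path::Path;

/// Build the `-I` flags passed to bindgen's clang invocation, mirroring the
/// diff above: the include root plus the nested session headers directory.
fn clang_include_args(include_dir: &Path) -> Vec<String> {
    vec![
        format!("-I{}", include_dir.display()),
        format!(
            "-I{}",
            include_dir
                .join("onnxruntime")
                .join("core")
                .join("session")
                .display()
        ),
    ]
}

fn main() {
    // Illustrative path; the real build script resolves it per ORT_STRATEGY.
    let args = clang_include_args(Path::new("/opt/onnxruntime/include"));
    assert_eq!(args[0], "-I/opt/onnxruntime/include");
    assert_eq!(args[1], "-I/opt/onnxruntime/include/onnxruntime/core/session");
    println!("ok");
}
```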
6 changes: 6 additions & 0 deletions onnxruntime-sys/src/generated/bindings.rs
@@ -10,6 +10,12 @@ include!(concat!(
"/src/generated/macos/x86_64/bindings.rs"
));

#[cfg(all(target_os = "macos", target_arch = "aarch64"))]
include!(concat!(
env!("CARGO_MANIFEST_DIR"),
"/src/generated/macos/aarch64/bindings.rs"
));

#[cfg(all(target_os = "windows", target_arch = "x86"))]
include!(concat!(
env!("CARGO_MANIFEST_DIR"),
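The pattern above selects a pre-generated bindings file at compile time with `cfg`-gated `include!` calls, one per supported target; this PR adds the `macos`/`aarch64` pair. The directory convention those paths follow can be sketched as a plain function (a hypothetical helper for illustration, not part of the crate):

```rust
/// Hypothetical helper mirroring the layout used by the cfg-gated include!
/// calls above: src/generated/<os>/<arch>/bindings.rs.
fn bindings_path(target_os: &str, target_arch: &str) -> String {
    format!("src/generated/{}/{}/bindings.rs", target_os, target_arch)
}

fn main() {
    // The new Apple M1 target added in this PR:
    assert_eq!(
        bindings_path("macos", "aarch64"),
        "src/generated/macos/aarch64/bindings.rs"
    );
    println!("ok");
}
```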
