Port ONNX op test suite. #8
Merged
`.github/workflows/test_onnx_ops.yml` (new file):

```yaml
# Copyright 2024 The IREE Authors
#
# Licensed under the Apache License v2.0 with LLVM Exceptions.
# See https://llvm.org/LICENSE.txt for license information.
# SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception

name: Test ONNX Ops
on:
  pull_request:
    paths:
      # This file itself.
      - ".github/workflows/test_onnx_ops.yml"
      - "onnx-ops/**"
  workflow_dispatch:
  schedule:
    # Runs at 3:00 PM UTC, which is 8:00 AM PST
    - cron: "0 15 * * *"

concurrency:
  # A PR number if a pull request and otherwise the commit hash. This cancels
  # queued and in-progress runs for the same PR (presubmit) or commit
  # (postsubmit). The workflow name is prepended to avoid conflicts between
  # different workflows.
  group: ${{ github.workflow }}-${{ github.event.number || github.sha }}
  cancel-in-progress: true

jobs:
  test-onnx-ops:
    runs-on: ubuntu-24.04
    env:
      VENV_DIR: ${{ github.workspace }}/.venv
      CONFIG_FILE_PATH: onnx-ops/configs/onnx_ops_cpu_llvm_sync.json
    steps:
      - name: "Checking out repository"
        uses: actions/checkout@v4

      # Install Python packages.
      - name: "Setting up Python"
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: "Setup Python venv"
        run: python3 -m venv ${VENV_DIR}
      - name: "Installing IREE nightly release Python packages"
        run: |
          source ${VENV_DIR}/bin/activate
          python3 -m pip install -r onnx-ops/requirements-iree.txt

      # Run tests and output new config files as needed.
      - name: "Running the ONNX ops test suite"
        run: |
          source ${VENV_DIR}/bin/activate
          pytest onnx-ops/ \
            -n auto \
            -rA \
            --timeout=30 \
            --durations=10 \
            --report-log=/tmp/onnx_ops_cpu_logs.json \
            --config-files=${CONFIG_FILE_PATH}
      - name: "Updating config file with latest XFAIL lists"
        if: failure()
        run: |
          source ${VENV_DIR}/bin/activate
          python onnx-ops/update_config_xfails.py \
            --log-file=/tmp/onnx_ops_cpu_logs.json \
            --config-file=${CONFIG_FILE_PATH}
          cat ${CONFIG_FILE_PATH}
      - name: "Uploading new config file"
        if: failure()
        uses: actions/upload-artifact@v4
        with:
          name: onnx_ops_cpu_llvm_sync.json
          path: ${{ env.CONFIG_FILE_PATH }}
```
`.gitignore`:

```diff
@@ -40,4 +40,3 @@ Testing/
 *.pt
 *.safetensors
 *.gguf
-*.vmfb
```
New gitattributes rules:

```text
@@ -0,0 +1,3 @@
# Ignore diffs and treat as binary in Git and on GitHub.
*.bin -diff
*.bin binary linguist-generated
```
New gitignore rules:

```text
@@ -0,0 +1,4 @@
# Model artifacts
# TODO(scotttodd): convert into a build/temp dir instead of the source dir
*.onnx
*.vmfb
```
# ONNX Operator Tests

This test suite exercises ONNX (Open Neural Network Exchange: https://onnx.ai/)
operators (https://onnx.ai/onnx/operators/).

Testing follows several stages:

```mermaid
graph LR
  Import -. "\n(offline)" .-> Compile
  Compile --> Run
```

Importing is run "offline" and the outputs are checked in to the repository for
ease of use in downstream projects and by developers who prefer to work directly
with `.mlir` files and native (C/C++) tools.
## Quickstart

1. Set up your virtual environment and install requirements:

   ```bash
   python -m venv .venv
   source .venv/bin/activate
   python -m pip install -r requirements.txt
   ```

   * To use `iree-compile` and `iree-run-module` from Python packages:

     ```bash
     python -m pip install -r requirements-iree.txt
     ```

   * To use local versions of `iree-compile` and `iree-run-module`, put them on
     your `$PATH` ahead of your `.venv/Scripts` directory:

     ```bash
     export PATH=path/to/iree-build:$PATH
     ```

2. Run pytest using typical flags:

   ```bash
   pytest \
     -n auto \
     -rA \
     --timeout=30 \
     --durations=20 \
     --config-files=configs/onnx_ops_cpu_llvm_sync.json \
     --report-log=/tmp/onnx_ops_cpu_logs.json
   ```

See https://docs.pytest.org/en/stable/how-to/usage.html for other options.
## Test case structure

Each test case is a folder containing a few files:

```text
[test case name]/
  model.mlir
  input_0.bin
  input_1.bin
  ...
  output_0.bin
  output_1.bin
  ...
  run_module_io_flags.txt
```

Where:

* `model.mlir` is in a format that is ready for use with `iree-compile`:

  ```mlir
  module {
    func.func @test_add(%arg0: !torch.vtensor<[3,4,5],f32>, %arg1: !torch.vtensor<[3,4,5],f32>) -> !torch.vtensor<[3,4,5],f32> attributes {torch.onnx_meta.ir_version = 7 : si64, torch.onnx_meta.opset_version = 17 : si64, torch.onnx_meta.producer_name = "backend-test", torch.onnx_meta.producer_version = ""} {
      %none = torch.constant.none
      %0 = torch.operator "onnx.Add"(%arg0, %arg1) : (!torch.vtensor<[3,4,5],f32>, !torch.vtensor<[3,4,5],f32>) -> !torch.vtensor<[3,4,5],f32>
      return %0 : !torch.vtensor<[3,4,5],f32>
    }
  }
  ```

* `input_0.bin` and `output_0.bin` files correspond to any number of program
  inputs and outputs for one test case
* `run_module_io_flags.txt` is a flagfile for use with
  `iree-run-module --flagfile=run_module_io_flags.txt` of the format:

  ```text
  --input=2x3xf32=@input_0.bin
  --expected_output=2x3xf32=@output_0.bin
  ```
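The `.bin` files referenced by those flags are raw tensor bytes with no header. As an illustrative sketch (not part of the test suite; the helper name is hypothetical), a `2x3xf32` input like the one above could be produced with Python's stdlib `struct` module, assuming little-endian IEEE-754 floats:

```python
import struct

# Hypothetical helper: serialize a flat list of float32 values to the raw
# little-endian byte layout that iree-run-module reads via `@input_0.bin`.
def write_f32_bin(path, values):
    with open(path, "wb") as f:
        f.write(struct.pack(f"<{len(values)}f", *values))

# A 2x3xf32 tensor has 6 elements, so the file is 24 bytes on disk.
write_f32_bin("input_0.bin", [0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
```

Expected outputs (`output_0.bin` etc.) use the same raw layout, with the shape and element type carried entirely by the flagfile.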
## Running tests

Tests are run using the [pytest](https://docs.pytest.org/en/stable/) framework.

A [`conftest.py`](conftest.py) file collects test cases from subdirectories,
wrapping each directory matching the format described above into one test case
per test configuration. Test configurations are defined in JSON config files
like [`configs/onnx_ops_cpu_llvm_sync.json`](./configs/onnx_ops_cpu_llvm_sync.json).
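The collection step can be pictured roughly as follows. This is a simplified sketch under the directory format described above, not the actual `conftest.py` (the function name is hypothetical):

```python
from pathlib import Path

def collect_test_case_dirs(root):
    # A directory counts as a test case when it contains a model.mlir and the
    # iree-run-module flagfile described in "Test case structure".
    root = Path(root)
    return sorted(
        d for d in root.rglob("*")
        if d.is_dir()
        and (d / "model.mlir").exists()
        and (d / "run_module_io_flags.txt").exists()
    )
```

Each collected directory is then paired with every configuration passed via `--config-files`, yielding one pytest test case per (directory, config) pair.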
### Updating expected failure lists

Each config file used with pytest includes lists of expected compile and run
failures like this:

```json
"expected_compile_failures": [
  "test_acos",
],
"expected_run_failures": [
  "test_add_uint8",
],
```
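In other words, each test name is looked up in these lists to decide how it is expected to behave. A minimal sketch of that lookup (the function name and return values are hypothetical, not from the suite):

```python
def expected_outcome(config, test_name):
    # Tests in expected_compile_failures are expected to fail at compile time;
    # tests in expected_run_failures compile but are expected to fail at
    # runtime; everything else is expected to pass both stages.
    if test_name in config.get("expected_compile_failures", []):
        return "xfail-compile"
    if test_name in config.get("expected_run_failures", []):
        return "xfail-run"
    return "pass"

config = {
    "expected_compile_failures": ["test_acos"],
    "expected_run_failures": ["test_add_uint8"],
}
```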
To update these lists using the results of a test run:

1. Run pytest with the `--report-log` option:

   ```bash
   pytest \
     --report-log=/tmp/onnx_ops_cpu_logs.json \
     --config-files=onnx_ops_cpu_llvm_sync.json \
     ...
   ```

2. Run the `update_config_xfails.py` script:

   ```bash
   python update_config_xfails.py \
     --log-file=/tmp/onnx_ops_cpu_logs.json \
     --config-file=onnx_ops_cpu_llvm_sync.json
   ```

You can also update the config JSON files manually. The log output on its own
should give enough information for each test case (e.g.
"remove from 'expected_run_failures'" for newly passing tests), but there can be
1000+ test cases, so the automation can save time.
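The `--report-log` file comes from the pytest-reportlog plugin and is JSON Lines: one JSON object per line, including a `TestReport` entry per test phase. A rough sketch of extracting failed test ids from such a log, much simplified relative to what `update_config_xfails.py` actually does (the function name is hypothetical):

```python
import json

def failed_nodeids(log_path):
    # Collect ids of tests whose "call" phase reported a failure.
    failed = []
    with open(log_path) as f:
        for line in f:
            entry = json.loads(line)
            if (entry.get("$report_type") == "TestReport"
                    and entry.get("when") == "call"
                    and entry.get("outcome") == "failed"):
                failed.append(entry["nodeid"])
    return failed
```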
### Advanced pytest usage

* The `--ignore-xfails` option will ignore any expected compile or runtime
  failures.
* The `--skip-all-runs` option will only compile tests with `iree-compile`,
  skipping the `iree-run-module` step.
## Generating tests

Test cases are imported from upstream ONNX tests:

| Directory in [onnx/onnx](https://github.com/onnx/onnx/) | Description |
| -- | -- |
| [`onnx/backend/test/case/`](https://github.com/onnx/onnx/tree/main/onnx/backend/test/case) | Python source files |
| [`onnx/backend/test/data/`](https://github.com/onnx/onnx/tree/main/onnx/backend/test/data) | Generated `.onnx` and `[input,output]_[0-9]+.pb` files |

The [`import_onnx_tests.py`](./onnx/import_onnx_tests.py) script walks the
`data/` folder and generates test cases into our local
[`generated/` folder](./generated/).

To regenerate the test cases:

```bash
# Virtual environment setup.
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements-dev.txt

# Import all test cases (may take a few minutes).
python import_onnx_tests.py
```
@renxida I took some ideas from nod-ai/SHARK-TestSuite#306 here when updating this README