
iree-torch's Introduction

Inactivity Notice

Note: We haven't been able to invest as much time into this project as we'd like lately. For a more actively supported, packaged solution combining torch-mlir and IREE, see SHARK.

Torch Frontend for IREE

This project provides end-to-end flows for PyTorch users who want to target IREE as a compiler backend, which offers benefits such as ahead-of-time compilation, a small runtime footprint, and deployment to resource-constrained devices. We use the Torch-MLIR project as our PyTorch frontend.

This project is under active development and is subject to frequent changes.

Example Usage

Training & Inference (functorch-based)

An end-to-end example of training a PyTorch basic regression model on IREE can be found in this script. This script uses functorch to define the model's forward and backward pass.
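For orientation, here is a minimal sketch of the functorch style that script relies on; the toy model and names below are illustrative and not taken from the script itself.

# Minimal sketch (illustrative only): functorch turns an nn.Module into a pure
# function of its parameters, so the forward and backward passes can be traced
# and compiled.
import torch
from functorch import make_functional, grad

model = torch.nn.Linear(1, 1)
func_model, params = make_functional(model)

def loss_fn(params, x, y):
    pred = func_model(params, x)
    return torch.nn.functional.mse_loss(pred, y)

x, y = torch.randn(4, 1), torch.randn(4, 1)
grads = grad(loss_fn)(params, x, y)  # per-parameter gradients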

Inference (nn.Module-based)

An end-to-end example of compiling an nn.Module-based PyTorch BERT model to IREE can be found in this notebook. The notebook also demonstrates the significantly smaller runtime size of the compiled model when compared to PyTorch (~4MB versus ~700MB).
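As a rough sketch of that flow (assuming the torch_mlir.compile entry point and the iree_torch helpers compile_to_vmfb / load_vmfb used in the notebook; the toy model here is illustrative):

import torch
import torch_mlir
import iree_torch

class TinyModel(torch.nn.Module):
    def forward(self, x):
        return torch.tanh(x)

model = TinyModel().eval()
example_input = torch.randn(2, 3)

# Lower the model to the linalg-on-tensors form that IREE accepts.
linalg_module = torch_mlir.compile(model, example_input,
                                   output_type="linalg-on-tensors")

# Compile to an IREE VM FlatBuffer, load it, and run it on the CPU backend.
vmfb = iree_torch.compile_to_vmfb(linalg_module, "llvm-cpu")
invoker = iree_torch.load_vmfb(vmfb, "llvm-cpu")
print(invoker.forward(example_input))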

Native, On-device Training

A small (~100-250KB), self-contained binary can be built for deployment to resource-constrained environments; see this example. The binary runs a model without a Python interpreter.

Planned features

  • Python (or, if absolutely necessary, C++) code that pulls in the bindings from both projects into an end-to-end flow for users.
  • Docker images so that users can get started quickly
  • CI of the Torch-MLIR end-to-end tests, with IREE plugged in as a backend
  • User examples:
    • Jupyter notebooks using the above to demonstrate interactive use of the tools
    • Standalone user-level Python code demonstrating various deployment flows (mobile, embedded).

Running end-to-end correctness tests

Set up the venv for running:

# Create a Python virtual environment.
$ python -m venv iree-torch.venv
$ source iree-torch.venv/bin/activate

# Option 1: Install Torch-MLIR and IREE from nightly packages:
(iree-torch.venv) $ python -m pip install -r "${IREE_TORCH_SRC_ROOT}/requirements.txt"

# Option 2: For development, build from source and set `PYTHONPATH`:
ninja -C "${TORCH_MLIR_BUILD_ROOT}" TorchMLIRPythonModules
ninja -C "${IREE_BUILD_ROOT}" IREECompilerPythonModules bindings_python_iree_runtime_runtime
export PYTHONPATH="${IREE_BUILD_ROOT}/runtime/bindings/python:${IREE_BUILD_ROOT}/compiler/bindings/python:${TORCH_MLIR_BUILD_ROOT}/tools/torch-mlir/python_packages/torch_mlir:${PYTHONPATH}"
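
# Optional sanity check (not part of the official setup): confirm that the
# Python environment can see all three packages before running the tests.
(iree-torch.venv) $ python -c "import torch_mlir, iree.compiler, iree.runtime"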

Run the Torch-MLIR TorchScript e2e test suite on IREE:

# Run all the tests on the default backend (`llvm-cpu`).
(iree-torch.venv) $ tools/e2e_test.sh
# Run all tests on the `vmvx` backend.
(iree-torch.venv) $ tools/e2e_test.sh --config vmvx
# Filter the tests (with a regex) and report failures with verbose error messages.
# This is good for drilling down on a single test as well.
(iree-torch.venv) $ tools/e2e_test.sh --filter Elementwise --verbose
# Shorter option names.
(iree-torch.venv) $ tools/e2e_test.sh -f Elementwise -v

iree-torch's People

Contributors

dellis23, ramiro050, silvasean

iree-torch's Issues

Running examples/bert.py reports: expected mlir::RankedTensorType, but got: 'i32'

When I run the demo in 'examples/bert.py' and invoke iree_torch.compile_to_vmfb(linalg_on_tensors_mlir, args.iree_backend), it reports: expected mlir::RankedTensorType, but got: 'i32'

The complete output is below.

Compiling with IREE
Traceback (most recent call last):
  File "../iree-torch/examples/bert.py", line 115, in <module>
    main()
  File "../iree-torch/examples/bert.py", line 102, in main
    iree_vmfb = iree_torch.compile_to_vmfb(linalg_on_tensors_mlir, args.iree_backend)
  File "~/anaconda3/envs/python38/lib/python3.8/site-packages/iree_torch/__init__.py", line 103, in compile_to_vmfb
    return ireec.compile_str(bytecode,
  File "~/anaconda3/envs/python38/lib/python3.8/site-packages/iree/compiler/tools/core.py", line 277, in compile_str
    result = invoke_immediate(cl, immediate_input=input_bytes)
  File "~/anaconda3/envs/python38/lib/python3.8/site-packages/iree/compiler/tools/binaries.py", line 196, in invoke_immediate
    raise CompilerToolError(process)
iree.compiler.tools.binaries.CompilerToolError: Error invoking IREE compiler tool iree-compile
Diagnostics:
<stdin>:0:0: error: expected mlir::RankedTensorType, but got: 'i32'
<stdin>:0:0: note: in bytecode version 0 produced by: MLIR16.0.0git

error: bytecode version mismatch when I try the bert example

I get a bytecode version error (last lines pasted below) when I try bert.py. I used the instructions from here: https://openxla.github.io/iree/getting-started/pytorch/

Do you have any hints to solve the bytecode problem?

Diagnostics:
<stdin>:0:0: error: bytecode version 1 is newer than the current version 0


Invoked with:
 iree-compile /usr/lib/python3.10/site-packages/iree/compiler/tools/../_mlir_libs/iree-compile - --iree-input-type=tm_tensor --iree-vm-bytecode-module-output-format=flatbuffer-binary --iree-hal-target-backends=llvm-cpu --iree-llvm-embedded-linker-path=/usr/lib/python3.10/site-packages/iree/compiler/tools/../_mlir_libs/iree-lld --mlir-print-debuginfo --mlir-print-op-on-diagnostic=false

Need more information? Set IREE_SAVE_TEMPS=/some/dir in your environment to save all artifacts and reproducers.

Example Colab notebook fails to install torch-mlir (using Python 3.8)

This notebook file: https://github.com/iree-org/iree-torch/blob/main/examples/bert.ipynb, when opened in Colab (https://colab.research.google.com/github/iree-org/iree-torch/blob/main/examples/bert.ipynb), fails to install torch-mlir:

!pip install -f https://llvm.github.io/torch-mlir/package-index/ torch-mlir
!pip install -f https://iree-org.github.io/iree/pip-release-links.html iree-compiler iree-runtime
!pip install git+https://github.com/iree-org/iree-torch.git
!pip install transformers

Looking in links: https://llvm.github.io/torch-mlir/package-index/
ERROR: Could not find a version that satisfies the requirement torch-mlir (from versions: none)
ERROR: No matching distribution found for torch-mlir

pip install git+https://github.com/iree-org/iree-torch.git fails. How can I install iree-torch with an existing torch_mlir?

When I tried to install iree-torch with 'pip install git+https://github.com/iree-org/iree-torch.git', it reported the following.

Collecting git+https://github.com/iree-org/iree-torch.git
Cloning https://github.com/iree-org/iree-torch.git to /tmp/pip-req-build-20px4dwa
Running command git clone --filter=blob:none --quiet https://github.com/iree-org/iree-torch.git /tmp/pip-req-build-20px4dwa
Resolved https://github.com/iree-org/iree-torch.git to commit 5fbaf50
Running command git submodule update --init --recursive -q
Preparing metadata (setup.py) ... done
Requirement already satisfied: iree-compiler in /anaconda3/envs/python38/lib/python3.8/site-packages (from iree-torch==0.0.1) (20220930.282)
Requirement already satisfied: iree-runtime in /anaconda3/envs/python38/lib/python3.8/site-packages (from iree-torch==0.0.1) (20220930.282)
ERROR: Could not find a version that satisfies the requirement torch-mlir (from iree-torch) (from versions: none)
ERROR: No matching distribution found for torch-mlir

I know from another issue that I can install torch_mlir individually; however, I have built torch_mlir from source and set PYTHONPATH. I can import torch_mlir successfully in other Python code, as shown below.

Python 3.8.15 (default, Nov 24 2022, 15:19:38)
[GCC 11.2.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.

>>> import torch
>>> import torch_mlir
>>> import torch_mlir_e2e_test

How can I install iree-torch with existing torch_mlir?
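
One possible workaround (untested here, and only an assumption about pip's dependency resolution) is to skip dependency resolution so pip does not try to fetch a torch-mlir wheel and instead relies on the torch_mlir already on PYTHONPATH:

# --no-deps prevents pip from trying to resolve torch-mlir from an index.
(venv) $ pip install --no-deps git+https://github.com/iree-org/iree-torch.git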

Installation error occurs during pip install

When I executed pip install -r requirements.txt on Sep. 17, the following error occurred. Did I make a mistake somewhere?

$ python -m pip -V
pip 22.2.2 from ...
$ python --version
Python 3.9.7
$ python -m pip install -r  iree-torch/requirements.txt 
Looking in links: https://github.com/google/iree/releases, https://github.com/llvm/torch-mlir/releases, https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html
Collecting iree-compiler
  Using cached iree_compiler-20220902.254-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (48.1 MB)
Collecting iree-runtime
  Using cached iree_runtime-20220902.254-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.3 MB)
ERROR: Could not find a version that satisfies the requirement torch-mlir (from versions: none)
ERROR: No matching distribution found for torch-mlir

error: dialect 'arith' provides no attribute parsing hook

When trying to run this notebook:

---------------------------------------------------------------------------
CompilerToolError                         Traceback (most recent call last)
<ipython-input-3-2bf71e5586d0> in <module>
     26 # cuda - GPU for NVIDIA devices
     27 iree_backend = "llvm-cpu"
---> 28 iree_vmfb = iree_torch.compile_to_vmfb(linalg_on_tensors_mlir, iree_backend)
     29 
     30 print("Loading in IREE")

2 frames
/usr/local/lib/python3.7/dist-packages/iree/compiler/tools/binaries.py in invoke_immediate(command_line, input_file, immediate_input)
    194     process = subprocess.run(command_line, capture_output=True, **run_args)
    195     if process.returncode != 0:
--> 196       raise CompilerToolError(process)
    197     # Emit stderr contents.
    198     _write_binary_stderr(stderr_handle, process.stderr)

CompilerToolError: Error invoking IREE compiler tool iree-compile
Diagnostics:
<mlir_parser_buffer>:1:8: error: dialect 'arith' provides no attribute parsing hook
#arith.fastmath<none>
       ^


Invoked with:
 iree-compile /usr/local/lib/python3.7/dist-packages/iree/compiler/tools/../_mlir_libs/iree-compile - --iree-input-type=tm_tensor --iree-vm-bytecode-module-output-format=flatbuffer-binary --iree-hal-target-backends=llvm-cpu --iree-llvm-embedded-linker-path=/usr/local/lib/python3.7/dist-packages/iree/compiler/tools/../_mlir_libs/iree-lld --mlir-print-debuginfo --mlir-print-op-on-diagnostic=false

Need more information? Set IREE_SAVE_TEMPS=/some/dir in your environment to save all artifacts and reproducers.

This also happens on a clean install in this script.

May need to remove pillow<7 from requirements.txt

Following steps listed in https://github.com/iree-org/iree-torch#running-end-to-end-correctness-tests, I ran

python -m pip install -r requirements.txt

It gave me the following errors.

  note: This error originates from a subprocess, and is likely not a problem with pip.
  Rolling back uninstall of Pillow
  Moving to /Users/y/miniforge3/envs/torch-mlir/lib/python3.10/site-packages/PIL/
   from /Users/y/miniforge3/envs/torch-mlir/lib/python3.10/site-packages/~IL
  Moving to /Users/y/miniforge3/envs/torch-mlir/lib/python3.10/site-packages/Pillow-9.4.0.dist-info/
   from /Users/y/miniforge3/envs/torch-mlir/lib/python3.10/site-packages/~illow-9.4.0.dist-info
error: legacy-install-failure

It turns out that the current release of Pillow is 9.4.0, while requirements.txt pins it to a version below 7:

pillow<7

import functorch fails (cannot import name 'set_autograd_function_allowed' from 'torch._C._functorch')

Hi!

I'm new to IREE and tried to follow instructions at https://iree-org.github.io/iree/getting-started/pytorch/ to run a simple example provided there, https://github.com/iree-org/iree-torch/blob/main/examples/regression.py.

However, even when I run regression.py, or something as simple as import functorch, I get an error:

>>> import functorch
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/nfs_home/kvoronin/anaconda/envs/iree_try_venv/lib/python3.10/site-packages/functorch/__init__.py", line 16, in <module>
    from ._src.vmap import vmap
  File "/nfs_home/kvoronin/anaconda/envs/iree_try_venv/lib/python3.10/site-packages/functorch/_src/vmap/__init__.py", line 4, in <module>
    from torch._functorch.vmap import (
  File "/nfs_home/kvoronin/anaconda/envs/iree_try_venv/lib/python3.10/site-packages/torch/_functorch/vmap.py", line 24, in <module>
    from torch._functorch.utils import exposed_in
  File "/nfs_home/kvoronin/anaconda/envs/iree_try_venv/lib/python3.10/site-packages/torch/_functorch/utils.py", line 3, in <module>
    from torch._C._functorch import (
ImportError: cannot import name 'set_autograd_function_allowed' from 'torch._C._functorch' (unknown location)

Python 3.10.8
torch 2.0.0.dev20230106+cpu
print(torch.__version__) prints 1.13.1

I guess it is some version mismatch, but set_autograd_function_allowed is hard to find information about. Any advice would be welcome!
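
The log above shows a nightly torch 2.0.0.dev wheel while torch reports 1.13.1, which suggests two torch installs may be mixed in the same environment. A quick way to check (illustrative diagnostic commands, not a confirmed fix):

(venv) $ python -c "import torch; print(torch.__version__, torch.__file__)"
(venv) $ pip list | grep -i torch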
