neuromorphs / nirtorch


PyTorch helper module to translate to and from NIR

Home Page: https://nnir.readthedocs.io

License: BSD 3-Clause "New" or "Revised" License

Languages: Python 98.15%, Nix 1.85%
Topics: intermediate-representation, machine-learning, neuromorphic

nirtorch's People

Contributors

bauerfe, biphasic, jegp, sheiksadique, stevenabreu7

nirtorch's Issues

Allow stateful modules in the graph executor

The graph executor currently doesn't allow modules to have state. If we have a tensor where the outer dimension is time, then we are forced to run all the frames through each node, like so:

tensor = ... # Time x Batch x ...
module(tensor)

We might want to support iterative application, which is for instance necessary for real-time modules:

tensor = ... # Time x Batch x ...
state = None
for frame in tensor:
  out, state = module(frame, state)

This is the (functional) model Norse is following, so we would have to make sure this also works for libraries with mutable state (where the state variable is not explicitly passed).
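
One way to support this, sketched under the assumption of a Norse-style functional API (the module accepts (frame, state) and returns (output, state)), is a small wrapper that keeps the state between calls so the executor can apply the module one frame at a time; the wrapper name is hypothetical:

import torch


class StatefulWrapper(torch.nn.Module):
    # Hypothetical wrapper around a functional, stateful module
    # (one that accepts (frame, state) and returns (output, state)),
    # letting the graph executor call it frame by frame.
    def __init__(self, module: torch.nn.Module):
        super().__init__()
        self.module = module
        self.state = None

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        out, self.state = self.module(frame, self.state)
        return out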

GraphExecutor submodules: indexing and missing input node

Since the indexing was changed to strings, code like this

    import torch
    import torch.nn as nn
    import sinabs.layers as sl
    from sinabs import from_nir, to_nir  # assuming sinabs' NIR helpers

    batch_size = 4

    orig_model = nn.Sequential(
        torch.nn.Linear(10, 2),
        sl.ExpLeakSqueeze(tau_mem=10.0, batch_size=batch_size),
        sl.LIFSqueeze(tau_mem=10.0, batch_size=batch_size),
        torch.nn.Linear(2, 1),
    )
    nir_graph = to_nir(orig_model, torch.randn(batch_size, 10))

    converted_model = from_nir(nir_graph, batch_size=batch_size)

returns the following object as converted_model:

GraphExecutor(
  (output): None
  (0): Linear(in_features=10, out_features=2, bias=True)
  (1): ExpLeakSqueeze(tau_mem=10.0, norm_input=False, batch_size=4, num_timesteps=-1)
  (2): LIFSqueeze(tau_mem=10.0, spike_threshold=Parameter containing:
  tensor(1.), norm_input=False, batch_size=4, num_timesteps=-1)
  (3): Linear(in_features=2, out_features=1, bias=True)
)

  1. I can access the output node as converted_model.output, but not the first layer as converted_model.0, because the key is the string '0' and numeric attribute access is not valid syntax. converted_model['0'] doesn't work either. converted_model.__getattr__('0') works, but that is cumbersome.
  2. Is there supposed to be an 'input' module?
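
On point 1: since the submodules are registered under string keys, torch.nn.Module's get_submodule can retrieve them without calling __getattr__ directly, which may serve as a workaround (not a fix for the indexing design):

first_layer = converted_model.get_submodule("0")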

LICENSE is missing

Hiya,

Thanks a lot for your cool work! I'd like to package NIRTorch on conda-forge. For this, I need to know what the LICENSE is, and ideally I need a LICENSE file.

xref: neuromorphs/NIR#74

Best, Tobi

Support for nested NIRGraph

Currently, there is no explicit support for nested NIR graphs.

  • There should be a way to load graphs that are nested.
  • There should be a way to extract/generate graphs that are nested.
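
For reference, a nested graph would look roughly like the sketch below; the constructor and field names follow the NIR reference implementation and should be read as assumptions:

import numpy as np
import nir

# Inner graph: input -> affine -> output.
inner = nir.NIRGraph(
    nodes={
        "input": nir.Input(input_type=np.array([2])),
        "affine": nir.Affine(weight=np.zeros((2, 2)), bias=np.zeros(2)),
        "output": nir.Output(output_type=np.array([2])),
    },
    edges=[("input", "affine"), ("affine", "output")],
)

# Outer graph: the inner NIRGraph appears as a single node.
outer = nir.NIRGraph(
    nodes={
        "input": nir.Input(input_type=np.array([2])),
        "block": inner,
        "output": nir.Output(output_type=np.array([2])),
    },
    edges=[("input", "block"), ("block", "output")],
)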

Input node retains batch dimension

Since data in PyTorch is always expected to have a batch dimension, this dimension is also retained in the shape of the input node. This is not consistent with the NIR specification, which does not include a batch dimension.
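
A minimal sketch of the mismatch, assuming nirtorch's extract_nir_graph entry point and a trivial model_map (both the mapping and the shapes in the comments are illustrative):

import torch
import torch.nn as nn
import nir
from nirtorch import extract_nir_graph


def model_map(module):
    # Map nn.Linear to a NIR Affine node; ignore everything else.
    if isinstance(module, nn.Linear):
        return nir.Affine(
            weight=module.weight.detach().numpy(),
            bias=module.bias.detach().numpy(),
        )
    return None


model = nn.Sequential(nn.Linear(10, 2))
# Sample data carries a batch dimension of 4, as PyTorch expects.
graph = extract_nir_graph(model, model_map, torch.randn(4, 10))
# The generated input node currently records the full shape (4, 10),
# whereas the NIR specification expects only the per-sample shape (10,).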

cannot import and then export

When importing a NIR graph through NIRTorch and then exporting it back to NIR, the edges of the NIR graph get scrambled:

original NIR edges [('input', '0'), ('0', '1'), ('1', 'output')]
converted NIR edges [('input', '0'), ('0', 'output'), ('1', '1')]

In a simple sequential network with two nodes, the two nodes never get connected:

Tensor_0(1, 1) (Tensor)
	-> 0 (Linear) [torch.Size([1, 1])]
0 (Linear)
	-> Tensor_1(1, 1) (Tensor) [torch.Size([1, 1])]
Tensor_1(1, 1) (Tensor)
Tensor_2(1, 1) (Tensor)
	-> 1 (Leaky) [torch.Size([1, 1])]
1 (Leaky)
	-> Tensor_3(1, 1) (Tensor) [torch.Size([1, 1])]
Tensor_3(1, 1) (Tensor)

In the new forward pass inside the GraphExecutor, we create a new input based on the graph executor state (potentially summing together multiple inputs). PyTorch then sees this input as a new tensor, so the identity comparison with the old output tensor (which may hold the same values) fails.
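
A minimal illustration of that failure mode in plain PyTorch (not nirtorch code):

import torch

x = torch.ones(1, 1)
summed = x + 0                  # summing the (single) input creates a new tensor object
print(summed is x)              # False: an identity comparison no longer matches
print(torch.equal(summed, x))   # True: the values are identical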

Provide default mapping function for trivial pytorch functions from `torch.nn`

Description

Currently, the conversion to/from NIR of the following PyTorch modules is handled individually in each PyTorch-based framework:

  • torch.nn.Conv2d
  • torch.nn.Linear

In the future this might apply to other modules such as:

  • torch.nn.Conv1d and other convolution operations
  • torch.nn.AvgPool2d and other average pooling operations
  • torch.nn.MaxPool2d and other max pooling operations

This is related to this issue in snntorch: jeshraghian/snntorch#304

Suggestion

To reduce the amount of redundant code, we could implement default mapper functions for these PyTorch-native operations. Such a default mapper function could be applied to an nn.Module or NIR node whenever the framework-specific mapper function does not handle the operation.

Changes for conversion from pytorch to NIR

The suggestion is to implement a default mapper function like _extract_default_model and call it if no NIRNode is returned by the framework-dependent model_map function provided: https://github.com/neuromorphs/NIRTorch/blob/main/nirtorch/to_nir.py#L70
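
A rough sketch of what such a default mapper could look like; the NIR node fields follow the reference implementation, and everything beyond nn.Linear is left out for brevity:

import nir
import torch.nn as nn


def _extract_default_model(module: nn.Module):
    # Hypothetical fallback for trivial torch.nn modules, called only
    # when the framework-specific model_map returns no NIRNode.
    if isinstance(module, nn.Linear):
        weight = module.weight.detach().numpy()
        if module.bias is None:
            return nir.Linear(weight=weight)
        return nir.Affine(weight=weight, bias=module.bias.detach().numpy())
    # Conv2d, AvgPool2d, MaxPool2d, etc. could be handled analogously.
    return None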

Changes for conversion from NIR to pytorch

Extend function _switch_default_models in https://github.com/neuromorphs/NIRTorch/blob/main/nirtorch/from_nir.py#L188
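
For the reverse direction, the extension might delegate to a helper along these lines; the helper name is hypothetical, the existing function may already cover the linear cases, and convolution/pooling nodes would follow the same pattern:

import nir
import torch


def _map_nir_node_to_torch(node):
    # Hypothetical default NIR-to-torch mapping for trivial nodes.
    if isinstance(node, nir.Affine):
        out_features, in_features = map(int, node.weight.shape)
        linear = torch.nn.Linear(in_features, out_features, bias=True)
        linear.weight.data = torch.as_tensor(node.weight, dtype=torch.float32)
        linear.bias.data = torch.as_tensor(node.bias, dtype=torch.float32)
        return linear
    if isinstance(node, nir.Linear):
        out_features, in_features = map(int, node.weight.shape)
        linear = torch.nn.Linear(in_features, out_features, bias=False)
        linear.weight.data = torch.as_tensor(node.weight, dtype=torch.float32)
        return linear
    return None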
