Comments (3)
So I have been thinking about indexes for a while. At a minimum, I believe we should support both integer and range-based indexes:
iex> t = Nx.tensor([0.0, 1.0, 2.0])
iex> t[0]
#Nx.Tensor<
f64
0.0
>
iex> t = Nx.tensor([0.0, 1.0, 2.0])
iex> t[1..2]
#Nx.Tensor<
f64[2]
[1.0, 2.0]
>
For multi-dimensional tensors:
iex> t = Nx.tensor([[0.0, 1.0], [2.0, 3.0]])
iex> t[0]
#Nx.Tensor<
f64[2]
[0.0, 1.0]
>
iex> t[0][1]
#Nx.Tensor<
f64
1.0
>
However, I think the above is too limited on its own, because it does not leverage named tensors.
Named slices
In order to support named slices, we can do:
iex> t = Nx.tensor([[0.0, 1.0], [2.0, 3.0]], names: [:x, :y])
iex> t[x: 0]
#Nx.Tensor<
f64[2]
[0.0, 1.0]
>
iex> t[x: 0, y: 1]
#Nx.Tensor<
f64
1.0
>
More importantly, if you index by a named axis other than the leading one, the axes before it are implicitly kept whole:
iex> t = Nx.tensor([[0.0, 1.0], [2.0, 3.0]], names: [:x, :y])
iex> t[x: 0]
#Nx.Tensor<
f64[2]
[0.0, 1.0]
>
iex> t[y: 1]
#Nx.Tensor<
f64[2]
[1.0, 3.0]
>
You can also pass ranges:
iex> t = Nx.tensor([[0.0, 1.0], [2.0, 3.0]], names: [:x, :y])
iex> t[x: 0..1]
#Nx.Tensor<
f64[2][2]
[[0.0, 1.0], [2.0, 3.0]]
>
iex> t[y: 1..1]
#Nx.Tensor<
f64[2][1]
[[1.0], [3.0]]
>
I like this approach a lot because the most idiomatic option (named tensors) is also the cleanest and most efficient. Finally, note that the syntax above is a generalization of:
opts = [axis: integer_or_range]
t[opts]
Where opts can also be written as [{:axis, integer_or_range}]. Therefore, for completeness, we will also allow the axis to be given by its integer index.
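As a sketch of how this could hang off Elixir's Access behaviour (the module and helper names below are hypothetical, not the actual Nx implementation): t[x: 0] is sugar for Access.get(t, [x: 0]), so a tensor struct only needs one fetch/2 clause per index shape:

```elixir
# Sketch only - not the actual Nx implementation.
defmodule TensorAccessSketch do
  @behaviour Access

  @impl true
  def fetch(tensor, index) when is_integer(index) do
    # t[0]: an integer index slices the leading axis and drops it
    {:ok, slice(tensor, 0, index, keep_axis: false)}
  end

  def fetch(tensor, _first.._last = range) do
    # t[1..2]: a range slices the leading axis and keeps it
    {:ok, slice(tensor, 0, range, keep_axis: true)}
  end

  def fetch(tensor, [{_key, _value} | _] = opts) do
    # t[x: 0, y: 1..1]: each key is an axis name (or integer
    # position); axes that are not mentioned are kept whole
    {:ok, Enum.reduce(opts, tensor, &slice_named/2)}
  end

  @impl true
  def get_and_update(_tensor, _key, _fun), do: raise("sketch only")

  @impl true
  def pop(_tensor, _key), do: raise("sketch only")

  # Hypothetical helpers standing in for the real slicing logic.
  defp slice(tensor, _axis, _index, _opts), do: tensor
  defp slice_named({_axis, _index}, tensor), do: tensor
end
```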
Further work
The proposal so far gives us a starting point - but it is literally just a starting point. Other things we need to consider are:
- The syntax above should also be supported by put_in and most likely update_in, depending on support in EXLA
- Besides integers and ranges, we should also support passing tensors. For the general t[other_tensor] usage, can other_tensor be multi-dimensional? And if so, should we match the names? For the named usage, such as t[axis: other_tensor], other_tensor must be one-dimensional
- NumPy supports np.newaxis to add new dimensions, something we can achieve with reshape. Should we also provide syntax sugar for it via access? For example:
iex> t = Nx.tensor([0, 1, 2])
iex> Nx.add(t[Nx.newaxis()], t[0..-1][Nx.newaxis()])
#Nx.Tensor<
s64[3][3]
[
[0, 1, 2],
[1, 2, 3],
[2, 3, 4]
]
>
Personally speaking, I am not convinced. If this is necessary, I would rather add an API such as:
iex> t = Nx.tensor([0, 1, 2])
iex> Nx.add(Nx.newaxis(t, 0), Nx.newaxis(t, -1))
#Nx.Tensor<
s64[3][3]
[
[0, 1, 2],
[1, 2, 3],
[2, 3, 4]
]
>
from nx.
From EXLA, we will be able to support update_in via DynamicUpdateSlice. I believe we can support put_in with DynamicUpdateSlice as well. There are also Gather and Scatter, which both might prove useful here, although they are a bit complex.
For slicing, XLA enforces that index tensors must have a scalar shape (DynamicSlice), but we might be able to work around that, although I don't yet have an idea how.
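To make the DynamicUpdateSlice idea concrete, here is a rough sketch of how put_in could lower, assuming a put_slice-style primitive that mirrors DynamicUpdateSlice (the name and exact API here are assumptions, not something this proposal settles):

```elixir
t = Nx.tensor([[0.0, 1.0], [2.0, 3.0]], names: [:x, :y])

# put_in(t[x: 1], Nx.tensor([9.0, 9.0])) could desugar into a
# DynamicUpdateSlice-style call: overwrite the sub-tensor starting
# at coordinates [1, 0] and leave everything else untouched.
update = Nx.reshape(Nx.tensor([9.0, 9.0]), {1, 2})
t = Nx.put_slice(t, [1, 0], update)
# expected result: [[0.0, 1.0], [9.0, 9.0]]
```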
NumPy's advanced indexing looks interesting. It seems they support passing multi-dimensional arrays, and use broadcasting to line everything up with the dimensions of the ndarray. I am almost positive we would have to use something like XLA's gather to make all of that work.
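As an illustration of what a gather-based lowering could look like (take and gather here are stand-ins modeled on XLA's Gather; the exact Nx surface for this is an open question in this thread):

```elixir
t = Nx.tensor([[0.0, 1.0], [2.0, 3.0]])

# t[Nx.tensor([1, 0])], indexing along the leading axis, could
# lower to a take: pick whole rows by index.
Nx.take(t, Nx.tensor([1, 0]), axis: 0)
# expected: [[2.0, 3.0], [0.0, 1.0]]

# Fully general advanced indexing, where the innermost dimension of
# the index tensor holds full coordinates, maps onto a gather.
Nx.gather(t, Nx.tensor([[0, 1], [1, 0]]))
# expected: [1.0, 2.0]
```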
I prefer:
Nx.newaxis(t, 0)
over:
t[Nx.newaxis()]
But would something like this be possible:
t = Nx.iota({3, 4, 3}, names: [:x, :y, :z]) # shape is {3, 4, 3}
t[batch: 32] # implicitly expand dimensions, so shape is now {32, 3, 4, 3}, this is a reshape+broadcast
t[x: 0..3][new_dim: 4][y: 1..2][z: 1..2] # specify ranges, this is slice+reshape+broadcast
I'm not sure I'm sold on that syntax, but something similar might be interesting to introduce.
Building off of this, perhaps we could build a lot of our operations on top of this syntax. For example, transpose can be represented by just switching the order of names around. In this way, we could achieve something similar to einsum, without the strings.
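For instance, with axis names a transpose becomes just a reordering of the name list (a sketch, assuming transpose accepts names wherever positional axes are expected):

```elixir
t = Nx.iota({2, 3}, names: [:x, :y])

# "Transpose" by naming the new axis order instead of using
# positional axis numbers.
Nx.transpose(t, axes: [:y, :x])
# shape {3, 2} with names [:y, :x]
```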