I would like to add a higher-level looping construct to `defn`. In Elixir, we can use `for` with `:reduce`, but I am afraid it will be too verbose and foreign for new users. Let's imagine we want to translate this Python code:
```python
def _smooth(x):
    out = np.empty_like(x)
    for i in range(1, x.shape[0] - 1):
        for j in range(1, x.shape[1] - 1):
            out[i, j] = (x[i + -1, j + -1] + x[i + -1, j + 0] + x[i + -1, j + 1] +
                         x[i + 0, j + -1] + x[i + 0, j + 0] + x[i + 0, j + 1] +
                         x[i + 1, j + -1] + x[i + 1, j + 0] + x[i + 1, j + 1]) // 9
    return out
```
With `for` + `:reduce`, we would write it as:
```elixir
def smooth(x) do
  for i <- 1..elem(x.shape, 0)-2, j <- 1..elem(x.shape, 1)-2, reduce: x do
    x ->
      put_in x[i, j], (x[i + -1, j + -1] + x[i + -1, j + 0] + x[i + -1, j + 1] +
                       x[i + 0, j + -1] + x[i + 0, j + 0] + x[i + 0, j + 1] +
                       x[i + 1, j + -1] + x[i + 1, j + 0] + x[i + 1, j + 1]) / 9
  end
end
```
I propose we introduce a `loop` construct, inspired by Futhark, that looks like this:

```elixir
loop tuple_or_var [= expr], [pattern <- expr]+ do
end
```
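For reference, the intended semantics can be sketched as a fold over the cartesian product of the generators: the accumulator is initialized from `tuple_or_var`, threaded through every index combination, and the final value is the result. The Python helper below is a hypothetical model of those semantics, not the real implementation:

```python
import itertools

def loop(acc, ranges, body):
    # Hypothetical reference semantics for the proposed `loop` construct:
    # thread the accumulator through the cartesian product of the
    # generators, applying `body` once per index combination.
    for idxs in itertools.product(*ranges):
        acc = body(acc, *idxs)
    return acc

# Usage: accumulate i * j over a small 2x2 grid of indices.
result = loop(0, [range(1, 3), range(1, 3)], lambda acc, i, j: acc + i * j)
# result == 1*1 + 1*2 + 2*1 + 2*2 == 9
```

Note the accumulator is rebound on every iteration, which is exactly what the `reduce: x` clause does in the `for` version above.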
Rewriting the above to this `loop` construct, we have:

```elixir
def smooth(x) do
  loop x,
       i <- 1..elem(x.shape, 0)-2,
       j <- 1..elem(x.shape, 1)-2 do
    put_in x[i, j], (x[i + -1, j + -1] + x[i + -1, j + 0] + x[i + -1, j + 1] +
                     x[i + 0, j + -1] + x[i + 0, j + 0] + x[i + 0, j + 1] +
                     x[i + 1, j + -1] + x[i + 1, j + 0] + x[i + 1, j + 1]) / 9
  end
end
```
There is one downside with this approach: the only form of loop we have in XLA is the while loop, which is sequential. Other languages, such as Taichi, can optimize such loops to run in parallel. For this reason, we may want to introduce higher-level constructs for manipulating tensors. In particular, I believe we should introduce functions such as `map`, `map_with_index`, `reduce`, and `reduce_with_index`. I have some thoughts on how we can implement said functions so they also work with batching out of the box, but I am waiting for some feedback on this issue before moving forward.
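To illustrate why such constructs are more parallel-friendly, here is a Python sketch of the smoothing example written against a hypothetical `map_with_index` (the name and signature are assumptions about the eventual API, not an implementation). Each output element is computed purely from its index, so there is no loop-carried dependency for a compiler to serialize:

```python
import numpy as np

def map_with_index(x, fun):
    # Hypothetical map_with_index: build each output element independently
    # from its index. Written here as a sequential loop, but every iteration
    # is independent, so a backend could run them all in parallel.
    out = np.empty_like(x)
    for idx in np.ndindex(*x.shape):
        out[idx] = fun(x, *idx)
    return out

def smooth(x):
    def kernel(x, i, j):
        # Interior points: mean of the 3x3 neighborhood; borders pass through.
        if 1 <= i < x.shape[0] - 1 and 1 <= j < x.shape[1] - 1:
            return x[i-1:i+2, j-1:j+2].sum() // 9
        return x[i, j]
    return map_with_index(x, kernel)

x = np.arange(9).reshape(3, 3)
y = smooth(x)  # y[1, 1] == 4, the mean of all nine elements 0..8
```

Contrast this with the `loop` version, where each iteration rebinds the accumulator and therefore forces sequential execution in XLA's while loop.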