Comments (4)
related topic:
https://pytorch.org/docs/stable/generated/torch.nn.ModuleList.html
from pytorch.
OK, I looked at this, and here is what's happening:
The module we are tracing is:
class ModuleList(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = torch.nn.ModuleList(
            [
                torch.nn.Linear(10, 10),
            ]
        )

    def forward(self, x):
        for idx, layer in enumerate(self.layers[::-1]):
            x = layer(x) * idx
        return x
- When inlining is disabled and we trace through
for idx, layer in enumerate(self.layers[::-1]):
we need to call __getitem__, but this is special-cased in
pytorch/torch/_dynamo/variables/nn_module.py
Line 600 in 796dff7
so we do not trace through __getitem__ for ModuleList.
- When we enable inlining, we trace through __getitem__, which ends up calling
def __len__(self) -> int:
pytorch/torch/nn/modules/container.py
Line 140 in 796dff7
and we fail to trace through the __len__ call.
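For intuition, here is a minimal pure-Python stand-in (an illustrative mock-up, not the real container.py implementation) for the two container behaviors that matter here: __len__ delegates to the _modules dict, and slicing rebuilds a new list from the dict's values, so tracing through the container pulls in operations on _modules:

```python
# Simplified stand-in for torch.nn.ModuleList's container behavior.
# As in nn.Module, submodules live in a dict keyed by string indices.
class MiniModuleList:
    def __init__(self, mods=()):
        self._modules = {str(i): m for i, m in enumerate(mods)}

    def __len__(self):
        # Mirrors ModuleList.__len__: delegates to the _modules dict.
        # This is the call Dynamo fails to trace through.
        return len(self._modules)

    def __getitem__(self, idx):
        if isinstance(idx, slice):
            # Slicing builds a new container from the dict's values.
            return MiniModuleList(list(self._modules.values())[idx])
        return self._modules[str(idx)]

    def __iter__(self):
        return iter(self._modules.values())


ml = MiniModuleList(["linear0", "linear1"])
rev = ml[::-1]
print(len(rev))    # 2
print(list(rev))   # ['linear1', 'linear0']
```

Iterating `enumerate(self.layers[::-1])` in the repro above therefore routes through this container machinery rather than through plain Python lists.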
Basically, we fail exactly here:
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/functions.py", line 341, in call_function
return super().call_function(tx, args, kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/functions.py", line 293, in call_function
return super().call_function(tx, args, kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/functions.py", line 90, in call_function
return tx.inline_user_function_return(self, [*self.self_args(), *args], kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 743, in inline_user_function_return
return InliningInstructionTranslator.inline_call(self, fn, args, kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 2447, in inline_call
return cls.inline_call_(parent, func, args, kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 2563, in inline_call_
tracer.run()
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 884, in run
while self.step():
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 799, in step
self.dispatch_table[inst.opcode](self, inst)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 494, in wrapper
return inner_fn(self, inst)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 1253, in CALL_FUNCTION
self.call_function(fn, args, {})
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/symbolic_convert.py", line 737, in call_function
self.push(fn.call_function(self, args, kwargs))
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/builtin.py", line 948, in call_function
return handler(tx, args, kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/builtin.py", line 832, in builtin_dipatch
rv = fn(tx, args, kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/builtin.py", line 750, in call_self_handler
result = self_handler(tx, *args, **kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/builtin.py", line 1347, in call_len
return args[0].call_method(tx, "__len__", args[1:], kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/misc.py", line 694, in call_method
return super().call_method(tx, name, args, kwargs)
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/variables/base.py", line 320, in call_method
unimplemented(f"call_method {self} {name} {args} {kwargs}")
File "/data/users/lsakka/pytorch/pytorch/torch/_dynamo/exc.py", line 216, in unimplemented
raise Unsupported(msg)
torch._dynamo.exc.Unsupported: call_method GetAttrVariable(UnspecializedNNModuleVariable(ModuleList), _modules) __len__ () {}
V0516 16:12:16.275000 140569933899584 torch/_dynamo/symbolic_convert.py:2535] [0/0] INLINING <code object __len__ at 0x7fd8f216a4a0, file "/data/users/lsakka/pytorch/pytorch/torch/nn/modules/container.py", line 311>, inlined according trace_rules.lookup MOD_INLINELIST
V0516 16:12:16.275000 140569933899584 torch/_dynamo/symbolic_convert.py:769] [0/0] [__trace_source] TRACE starts_line /data/users/lsakka/pytorch/pytorch/torch/nn/modules/container.py:313 in __len__ (ModuleList.__len__) (inline depth: 5)
V0516 16:12:16.275000 140569933899584 torch/_dynamo/symbolic_convert.py:769] [0/0] [__trace_source] return len(self._modules)
V0516 16:12:16.276000 140569933899584 torch/_dynamo/symbolic_convert.py:792] [0/0] [__trace_bytecode] TRACE LOAD_GLOBAL len []
V0516 16:12:16.276000 140569933899584 torch/_dynamo/symbolic_convert.py:792] [0/0] [__trace_bytecode] TRACE LOAD_FAST self [BuiltinVariable()]
V0516 16:12:16.276000 140569933899584 torch/_dynamo/symbolic_convert.py:792] [0/0] [__trace_bytecode] TRACE LOAD_ATTR _modules [BuiltinVariable(), UnspecializedNNModuleVariable()]
V0516 16:12:16.277000 140569933899584 torch/_dynamo/symbolic_convert.py:792] [0/0] [__trace_bytecode] TRACE CALL_FUNCTION 1 [BuiltinVariable(), GetAttrVariable()]
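The bytecode Dynamo is stepping through in the TRACE lines above is just the compiled body of __len__, and it can be reproduced with the standard dis module (a stand-in class with the same method body; opcode spellings vary by Python version, CALL_FUNCTION in the log is the Python <= 3.10 form):

```python
import dis

# Stand-in whose __len__ has the same body as ModuleList.__len__
# in container.py: `return len(self._modules)`.
class FakeModuleList:
    def __init__(self):
        self._modules = {"0": object()}

    def __len__(self) -> int:
        return len(self._modules)

# The instruction stream mirrors the TRACE log:
# LOAD_GLOBAL len / LOAD_FAST self / LOAD_ATTR _modules / CALL(_FUNCTION) 1
for inst in dis.get_instructions(FakeModuleList.__len__):
    print(inst.opname, inst.argval)
```

The failing step is the final call: `len` is a BuiltinVariable, but its argument `self._modules` is only a GetAttrVariable, which cannot answer __len__.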
This is the fallback call_method (variables/base.py in the traceback above) that ends in unimplemented:

def call_method(
    self,
    tx,
    name,
    args: "List[VariableTracker]",
    kwargs: "Dict[str, VariableTracker]",
) -> "VariableTracker":
    if name == "__len__" and self.has_unpack_var_sequence(tx):
        assert not (args or kwargs)
        return variables.ConstantVariable.create(len(self.unpack_var_sequence(tx)))
    elif (
        name == "__getattr__"
        and len(args) == 1
        and args[0].is_python_constant()
        and not kwargs
    ):
        return self.var_getattr(tx, args[0].as_python_constant())
    unimplemented(f"call_method {self} {name} {args} {kwargs}")
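To illustrate the dispatch gap in plain Python (these classes are illustrative mock-ups, not Dynamo's real VariableTracker API): an opaque attribute access has nowhere to send a __len__ request, whereas a variable that tracks the dict's items could fold len() to a constant at trace time:

```python
class Unsupported(Exception):
    """Mock of torch._dynamo.exc.Unsupported: tracing must graph-break."""


class MockGetAttrVariable:
    """Opaque `obj.attr` access: contents unknown, so __len__ can't be answered."""
    def call_method(self, name, args, kwargs):
        raise Unsupported(f"call_method {self} {name} {args} {kwargs}")


class MockConstDictVariable:
    """Tracks the dict's items, so __len__ folds to a constant during tracing."""
    def __init__(self, items):
        self.items = items

    def call_method(self, name, args, kwargs):
        if name == "__len__":
            return len(self.items)
        raise Unsupported(f"call_method {self} {name} {args} {kwargs}")


# The ConstDict-style representation can answer len(); the opaque one cannot.
print(MockConstDictVariable({"0": "linear0"}).call_method("__len__", (), {}))  # 1
```

This is only a sketch of the idea in the comment below; whether Dynamo can actually represent _modules this way depends on whether the dict is constant during tracing.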
I think the issue is that self._modules is a dictionary, but we represent it as a GetAttrVariable instead of a ConstDictVariable. I do not know, however, whether it is constant and whether we can use ConstDictVariable. The declared type is:
    _modules: Dict[str, Module]  # type: ignore[assignment]