Comments (2)
thanks @StatMixedML, trying to reproduce... can you try to train with device="cpu"? I could not reproduce it on Linux or Mac... so it might be a Windows thing... Can you pull again and try with this commit I just pushed?
from pytorch-ts.
@kashif Thanks for the quick reply.
I have re-created my environment using your latest commits, also setting device="cpu", but I get the same error:
0it [00:02, ?it/s]
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-7-99cbb6cd87b6> in <module>
      6                          trainer=Trainer(epochs=10,
      7                                          device="cpu"))
----> 8 predictor = estimator.train(training_data=training_data)

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\model\estimator.py in train(self, training_data)
    133
    134     def train(self, training_data: Dataset) -> Predictor:
--> 135         return self.train_model(training_data).predictor

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\model\estimator.py in train_model(self, training_data)
    118         trained_net = self.create_training_network(self.trainer.device)
    119
--> 120         self.trainer(
    121             net=trained_net,
    122             input_names=get_module_forward_input_names(trained_net),

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\trainer.py in __call__(self, net, input_names, data_loader)
     50                 inputs = [data_entry[k].to(self.device) for k in input_names]
     51
---> 52                 output = net(*inputs)
     53                 if isinstance(output, (list, tuple)):
     54                     loss = output[0]

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\torch\nn\modules\module.py in __call__(self, *input, **kwargs)
    530             result = self._slow_forward(*input, **kwargs)
    531         else:
--> 532             result = self.forward(*input, **kwargs)
    533         for hook in self._forward_hooks.values():
    534             hook_result = hook(self, input, result)

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\model\deepar\deepar_network.py in forward(self, feat_static_cat, feat_static_real, past_time_feat, past_target, past_observed_values, future_time_feat, future_target, future_observed_values)
    244         future_observed_values: torch.Tensor,
    245     ) -> torch.Tensor:
--> 246         distr = self.distribution(
    247             feat_static_cat=feat_static_cat,
    248             feat_static_real=feat_static_real,

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\model\deepar\deepar_network.py in distribution(self, feat_static_cat, feat_static_real, past_time_feat, past_target, past_observed_values, future_time_feat, future_target, future_observed_values)
    219         future_observed_values: torch.Tensor,
    220     ) -> Distribution:
--> 221         rnn_outputs, _, scale, _ = self.unroll_encoder(
    222             feat_static_cat=feat_static_cat,
    223             feat_static_real=feat_static_real,

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\model\deepar\deepar_network.py in unroll_encoder(self, feat_static_cat, feat_static_real, past_time_feat, past_target, past_observed_values, future_time_feat, future_target)
    166
    167         # (batch_size, num_features)
--> 168         embedded_cat = self.embedder(feat_static_cat)
    169
    170         # in addition to embedding features, use the log scale as it can help

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\torch\nn\modules\module.py in __call__(self, *input, **kwargs)
    530             result = self._slow_forward(*input, **kwargs)
    531         else:
--> 532             result = self.forward(*input, **kwargs)
    533         for hook in self._forward_hooks.values():
    534             hook_result = hook(self, input, result)

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\modules\feature.py in forward(self, features)
     28
     29         return torch.cat(
---> 30             [
     31                 embed(cat_feature_slice.squeeze(-1))
     32                 for embed, cat_feature_slice in zip(

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\pts\modules\feature.py in <listcomp>(.0)
     29         return torch.cat(
     30             [
---> 31                 embed(cat_feature_slice.squeeze(-1))
     32                 for embed, cat_feature_slice in zip(
     33                     self.__embedders, cat_feature_slices

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\torch\nn\modules\module.py in __call__(self, *input, **kwargs)
    530             result = self._slow_forward(*input, **kwargs)
    531         else:
--> 532             result = self.forward(*input, **kwargs)
    533         for hook in self._forward_hooks.values():
    534             hook_result = hook(self, input, result)

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\torch\nn\modules\sparse.py in forward(self, input)
    110
    111     def forward(self, input):
--> 112         return F.embedding(
    113             input, self.weight, self.padding_idx, self.max_norm,
    114             self.norm_type, self.scale_grad_by_freq, self.sparse)

C:\ProgramData\Anaconda3\envs\pytorchts\lib\site-packages\torch\nn\functional.py in embedding(input, weight, padding_idx, max_norm, norm_type, scale_grad_by_freq, sparse)
   1482         # remove once script supports set_grad_enabled
   1483         _no_grad_embedding_renorm_(weight, input, max_norm, norm_type)
-> 1484     return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
   1485
   1486

RuntimeError: Expected tensor for argument #1 'indices' to have scalar type Long; but got torch.IntTensor instead (while checking arguments for embedding)
It does indeed seem to be a Windows-related issue, since it works in my Linux subsystem.
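For anyone hitting the same error: the likely root cause is that NumPy's default integer dtype on Windows is int32 (vs. int64 on Linux/Mac), so the categorical feature tensor reaches nn.Embedding as torch.int32 instead of the int64 (Long) indices this PyTorch version requires. A minimal sketch of the mismatch and a local workaround (the variable names below are illustrative, not the actual pts internals):

```python
import numpy as np
import torch
import torch.nn as nn

# On Windows, np.array([...]) of ints defaults to int32, so a tensor built
# from such an array arrives at the embedder as torch.int32.
feat_static_cat = torch.from_numpy(np.array([[0], [1], [2]], dtype=np.int32))
print(feat_static_cat.dtype)  # torch.int32

embed = nn.Embedding(num_embeddings=10, embedding_dim=4)

# nn.Embedding expects int64 (Long) indices here; casting with .long()
# before the lookup avoids the "scalar type Long" RuntimeError.
out = embed(feat_static_cat.squeeze(-1).long())
print(out.shape)  # torch.Size([3, 4])
```

The durable fix on the library side would presumably be to create or cast the categorical feature arrays as int64 before they reach the embedder, so the same data pipeline behaves identically across platforms.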