Hello,
When I set up the SdA configuration in the code as follows:
sda = SdA(
    numpy_rng=numpy_rng,
    n_ins=2880,
    hidden_layers_sizes=[3000, 3000, 3000],
    n_outs=4
)
I am getting the following error:
Traceback (most recent call last):
  File "/home/dan/PycharmProjects/DeepLearningTutorials/code/SdA.py", line 491, in <module>
    test_SdA()
  File "/home/dan/PycharmProjects/DeepLearningTutorials/code/SdA.py", line 440, in test_SdA
    minibatch_avg_cost = train_fn(minibatch_index)
  File "/home/dan/.local/lib/python2.7/site-packages/theano/compile/function_module.py", line 912, in __call__
    storage_map=getattr(self.fn, 'storage_map', None))
  File "/home/dan/.local/lib/python2.7/site-packages/theano/gof/link.py", line 314, in raise_with_op
    reraise(exc_type, exc_value, exc_trace)
  File "/home/dan/.local/lib/python2.7/site-packages/theano/compile/function_module.py", line 899, in __call__
    self.fn() if output_subset is None else
ValueError: y_i value out of bounds
Apply node that caused the error: CrossentropySoftmaxArgmax1HotWithBias(Dot22.0, b, Elemwise{Cast{int32}}.0)
Toposort index: 26
Inputs types: [TensorType(float64, matrix), TensorType(float64, vector), TensorType(int32, vector)]
Inputs shapes: [(1, 4), (4,), (1,)]
Inputs strides: [(32, 8), (8,), (4,)]
Inputs values: [array([[ 0., 0., 0., 0.]]), array([ 0., 0., 0., 0.]), array([4], dtype=int32)]
Outputs clients: [[Sum{acc_dtype=float64}(CrossentropySoftmaxArgmax1HotWithBias.0)], [CrossentropySoftmax1HotWithBiasDx(Elemwise{Inv}[(0, 0)].0, CrossentropySoftmaxArgmax1HotWithBias.1, Elemwise{Cast{int32}}.0)], []]
Backtrace when the node is created (use Theano flag traceback.limit=N to make it longer):
  File "/home/dan/PycharmProjects/DeepLearningTutorials/code/SdA.py", line 491, in <module>
    test_SdA()
  File "/home/dan/PycharmProjects/DeepLearningTutorials/code/SdA.py", line 374, in test_SdA
    n_outs=4
  File "/home/dan/PycharmProjects/DeepLearningTutorials/code/SdA.py", line 177, in __init__
    self.finetune_cost = self.logLayer.negative_log_likelihood(self.y)
  File "/home/dan/PycharmProjects/DeepLearningTutorials/code/logistic_sgd.py", line 147, in negative_log_likelihood
    return -T.mean(T.log(self.p_y_given_x)[T.arange(y.shape[0]), y])
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
What might be causing this? The error does not occur when n_outs is set to 5 or 10.
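In case it helps narrow things down, here is the kind of sanity check I could run on my label array before training (the names labels and n_outs here are illustrative, not the tutorial's exact variables; the logistic layer treats labels as class indices, so every label value must be less than n_outs):

```python
import numpy as np

# Hypothetical integer class labels, standing in for the y values
# fed to the model (in the tutorial these come from train_set_y).
labels = np.array([0, 1, 2, 3, 4], dtype=np.int32)

n_outs = 4  # number of output units configured in the SdA constructor

# Valid class indices for an n_outs-way softmax are 0 .. n_outs - 1.
print(labels.min(), labels.max())   # -> 0 4
print(bool(labels.max() < n_outs))  # -> False: a label of 4 is out of
                                    #    bounds for n_outs=4
```

With a label value of 4 present (as in the error's input values, array([4], dtype=int32)), the check above fails for n_outs=4 but would pass for n_outs=5 or 10, which matches the behavior I am seeing.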