
Comments (6)

fabiodimarco commented on July 2, 2024

Hi, I think that the NARX base_estimator is not intended to be a Keras model.
If I look into the comments of the NARX base class TimeSeriesRegressor, the comment says:
base_estimator must be a model which implements the scikit-learn APIs.
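To be concrete, that means whatever you pass as base_estimator has to expose fit(X, y) and predict(X). If you really wanted to keep a Keras network inside such a library, you would have to hide it behind that interface yourself. A rough, untested sketch (the class name and constructor arguments here are made up, and whether this is enough depends on how the library uses the estimator internally):

import numpy as np

class KerasSklearnAdapter:
    """Hypothetical wrapper exposing the scikit-learn-style fit/predict
    interface around an already compiled Keras model."""

    def __init__(self, keras_model, epochs=100, batch_size=32):
        self.keras_model = keras_model
        self.epochs = epochs
        self.batch_size = batch_size

    def fit(self, X, y):
        self.keras_model.fit(np.asarray(X), np.asarray(y),
                             epochs=self.epochs,
                             batch_size=self.batch_size,
                             verbose=0)
        return self

    def predict(self, X):
        # flatten the (n, 1) network output to the 1-D shape scikit-learn expects
        return self.keras_model.predict(np.asarray(X)).ravel()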

Do you have any working code that uses the standard Keras fit function?
If you can provide me with that, I can help you use the levenberg-marquardt optimizer.


TobiasEl commented on July 2, 2024

Hi.
I need to create a NARX method (based on your LM). Do you know of any implementation of NARX in Keras that I could use with your LM?


fabiodimarco commented on July 2, 2024

I was not able to find any open-source Keras implementation of NARX. If you want to use the LM training algorithm that I implemented, I think you need to implement your own version of NARX using the TensorFlow Keras API. Alternatively, you can try other model architectures that are already implemented in TensorFlow (RNN / CNN).
What is your goal, and why do you want to use LM to train your model? Have you already tried first-order methods (e.g. SGD, Adam, etc.)?


TobiasEl commented on July 2, 2024

Hi.
Thanks for your help, but it must be NARX based on LM. This is because we want to emulate the method from a paper: https://www.mdpi.com/2227-7390/8/2/241/html . So it must be NARX trained with LM.

I have found this: https://stackoverflow.com/questions/53087669/narx-implementation-using-keras

I made some changes for the latest versions of TensorFlow and Keras and combined it with your implementation. It looks like this:


import tensorflow as tf
import numpy as np
import levenberg_marquardt as lm
import matplotlib.pyplot as plt

from tensorflow import keras

numPreviousSteps = 8
inputShape = (None, numPreviousSteps + 2)

class Narx(keras.Model):

    def __init__(self):
        super(Narx, self).__init__(name='narx')
        self.dense = keras.layers.Dense(10, input_shape=inputShape,
                                        activation=keras.activations.tanh)
        self.outputLayer = keras.layers.Dense(1, activation=keras.activations.linear)

    def call(self, inputs, training = False):
        if (training):
            x = self.dense(inputs)
            return self.outputLayer(x)
        else: # TODO: what should the network do when used for prediction
            x = self.dense(inputs)
            return self.outputLayer(x)


model = Narx()
model.compile(optimizer=keras.optimizers.RMSprop(0.001),
              loss=tf.losses.mean_squared_error,
              metrics=tf.metrics.mean_absolute_error)

# input data generation
numTsSamples = 1000

# time series to learn from
y = np.random.random((numTsSamples + numPreviousSteps + 1,))
x = np.random.random((numTsSamples,)) # exogenous input

# creation of tapped delay
data = [np.roll(y, -i)[:numTsSamples] for i in range(numPreviousSteps, -1, -1)]
data = [x] + data

# training data
data = np.stack(data, axis=1)

# expected results
yNext = y[numPreviousSteps : -1]

model = tf.keras.Sequential([
    tf.keras.layers.Dense(20, activation='tanh', input_shape=(1,)),
    tf.keras.layers.Dense(1, activation='linear')])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
    loss=tf.keras.losses.MeanSquaredError())

model_wrapper = lm.ModelWrapper(model)

model_wrapper.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=1.0),
    loss=lm.MeanSquaredError())

# model training
model_wrapper.fit(data, yNext)
model_wrapper.predict(y_train)

And it gives me this dimension error:

WARNING:tensorflow:Model was constructed with shape (None, 1) for input KerasTensor(type_spec=TensorSpec(shape=(None, 1), dtype=tf.float32, name='dense_20_input'), name='dense_20_input', description="created by layer 'dense_20_input'"), but it was called on an input with incompatible shape (None, 10).


ValueError                                Traceback (most recent call last)

in ()
     64
     65 # model training
---> 66 model_wrapper.fit(data, yNext)

20 frames

/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    257               ' incompatible with the layer: expected axis ' + str(axis) +
    258               ' of input shape to have value ' + str(value) +
--> 259               ' but received input with shape ' + display_shape(x.shape))
    260     # Check shape.
    261     if spec.shape is not None and shape.rank is not None:

ValueError: Input 0 of layer dense_20 is incompatible with the layer: expected axis -1 of input shape to have value 1 but received input with shape (None, 10)

I think we're getting close. I'm sorry if I'm being too annoying.


fabiodimarco commented on July 2, 2024

In the above code you created a Narx class instance (which is never used) and then replaced it with a feedforward neural network with the wrong input shape.
Based on the code in the Stack Overflow post, here is a version using the LM trainer:

import tensorflow as tf
import numpy as np
import levenberg_marquardt as lm

num_previous_steps = 8

# input data generation
numTsSamples = 1000

# time series to learn from
y = np.random.random((numTsSamples + num_previous_steps + 1,))
x = np.random.random((numTsSamples,))  # exogenous input

# creation of tapped delay
data = [np.roll(y, -i)[:numTsSamples] for i in range(num_previous_steps, -1, -1)]
data = [x] + data

# training data
data = np.stack(data, axis=1)

# expected results
y_next = y[num_previous_steps: -1]

# model training
model = tf.keras.Sequential([
    tf.keras.layers.Dense(20, activation='tanh',
                          input_shape=(num_previous_steps + 2,)),
    tf.keras.layers.Dense(1, activation='linear')])

# build the model
_ = model(data)

model_wrapper = lm.ModelWrapper(model)
model_wrapper.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=1.0),
    loss=lm.MeanSquaredError())

model_wrapper.fit(data, y_next, epochs=10)

out = model.predict(data)

The code that you provided is just a feedforward neural network where the input data are organized as NARX-style tapped delays.
It is missing the main part of NARX, which is the autoregressive output prediction.
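For the closed-loop (autoregressive) part, something along these lines could be a starting point. This is only a rough, untested sketch: the helper name and the newest-first ordering of the lagged outputs are assumptions, chosen to mirror the tapped-delay construction above.

import numpy as np

num_lags = num_previous_steps + 1  # number of lagged outputs in the tapped delay line

def predict_closed_loop(model, x_future, y_history):
    """Hypothetical closed-loop NARX prediction: the model's own outputs
    are fed back as the lagged values instead of the measured series."""
    # newest value first, mirroring the column order used during training
    lags = list(y_history[-num_lags:][::-1])
    predictions = []
    for x_t in x_future:
        features = np.array([[x_t] + lags], dtype=np.float32)  # shape (1, num_lags + 1)
        y_hat = float(model(features, training=False).numpy().squeeze())
        predictions.append(y_hat)
        lags = [y_hat] + lags[:-1]  # shift the tapped delay line
    return np.array(predictions)

# example: roll the model forward for 50 steps with new exogenous inputs,
# seeded with the last known outputs
# y_hat = predict_closed_loop(model, np.random.random(50), y[-num_lags:])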


TobiasEl commented on July 2, 2024

Thank you so much for helping me.

It is missing the main part of NARX, which is the autoregressive output prediction.

I have never worked with NARX, so I don't really know how to implement it; I have just been searching for code or libraries. I don't know how to specify that autoregressive output.

