
Comments (5)

bionicles commented on May 12, 2024

I'd like to use keras-tuner, but I'm unclear on this codebase and there's zero documentation on how to use the callback hooks. Custom != complicated; I'll just use Optuna for now.


omalleyt12 commented on May 12, 2024

@bionicles This is definitely on the roadmap, both on the Keras Tuner side and on the Keras side, where we want to make it easier to plug a custom training step into compile/fit.

On the Keras Tuner side, for complicated use cases the current best approach, in my opinion, is to override the Tuner class and replace the calls to fit with your custom training logic (making sure to report the results to the Oracle and to call the callback hooks as necessary).
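As a rough illustration of that pattern, a minimal skeleton might look like the sketch below. This assumes the 1.0-era engine API, where the Tuner instance holds an oracle with an update_trial(trial_id, metrics, step) method; my_train_one_epoch is a placeholder for your own training logic, and the exact import path of Tuner may vary between versions.

import kerastuner

class CustomLoopTuner(kerastuner.engine.Tuner):

    def run_trial(self, trial, x, y):
        # Build a model for this trial and train it with custom logic.
        model = self.hypermodel.build(trial.hyperparameters)
        for epoch in range(5):
            val_loss = my_train_one_epoch(model, x, y)  # placeholder helper
            # Report the objective for this epoch back to the Oracle so the
            # search algorithm can score the trial.
            self.oracle.update_trial(
                trial.trial_id, metrics={'val_loss': val_loss}, step=epoch)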


omalleyt12 commented on May 12, 2024

@bionicles There will be more documentation ahead of the 1.0 launch (coming soon!). The codebase is still in active development, so some things needed for advanced use cases are still internal, but they will be exposed externally soon. For now, here's an example of how to implement a custom training loop in a way that should remain stable:

import kerastuner


class MyTuner(kerastuner.engine.Tuner):

    def run_trial(self, trial, x, y, val_x, val_y):
        # Build a fresh model from this trial's hyperparameters.
        model = self.hypermodel.build(trial.hyperparameters)
        for epoch in range(10):
            self.on_epoch_begin(trial, model, epoch, logs={})
            for batch in range(100):
                self.on_batch_begin(trial, model, batch, logs={})
                outputs = model(x)
                loss = myloss(outputs, y)
                self.on_batch_end(trial, model, batch, logs={'loss': loss})
            val_loss = ...
            # Reports the objective value for this epoch to the Oracle.
            self.on_epoch_end(trial, model, epoch, logs={'val_loss': val_loss})


def build_model(hp):
    num_layers = hp.Int('num_layers', 1, 10)
    ...
    return model


tuner = MyTuner(
    oracle=kerastuner.tuners.randomsearch.RandomSearchOracle(
        objective='val_loss',
        max_trials=30),
    hypermodel=build_model)

tuner.search(x, y, val_x, val_y)
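The training and validation computations above are intentionally left as placeholders. As one possible way to fill them in (not part of Keras Tuner itself; myloss and the optimizer below are assumed stand-ins), a standard TensorFlow 2 eager step could look like:

import tensorflow as tf

# Assumed placeholders for illustration; substitute your own loss/optimizer.
myloss = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.Adam()

def train_step(model, x, y):
    # One gradient update; roughly what the `loss = myloss(outputs, y)` line
    # in run_trial would expand into for real training.
    with tf.GradientTape() as tape:
        outputs = model(x, training=True)
        loss = myloss(y, outputs)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return float(loss)

def compute_val_loss(model, val_x, val_y):
    # Validation pass with no gradient updates.
    return float(myloss(val_y, model(val_x, training=False)))

After the search finishes, the results recorded by the Oracle can be queried as with the built-in tuners, e.g. tuner.results_summary() or tuner.get_best_hyperparameters(1)[0] (get_best_models additionally requires model checkpoints to have been saved during the trial).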


omalleyt12 commented on May 12, 2024

I'll update this thread again when we have documentation available covering best practices for these use cases.


omalleyt12 commented on May 12, 2024

Closing, as there is now a pending PR for a tutorial on custom training loops: #136.

