Comments (4)
The training and validation metrics and loss values are all saved into the status object returned when you run
status = prob_model.train(...)
Perhaps this is what you need? I am not sure I understand the use case for going inside the trainer yet.
If all you need is accuracy, loss, or other metrics that depend only on the predictions and the data, you can do this via FitMonitor. Something like this:
from fortuna.prob_model import FitConfig, FitMonitor
from fortuna.metric.classification import accuracy
fit_config = FitConfig(monitor=FitMonitor(metrics=(accuracy,)))
You can also pass arbitrary user-defined metrics, as long as they have the same signature as accuracy. The training loss is logged automatically; the validation loss is logged if you pass validation data and set the logger to INFO level.
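As a rough illustration of "same signature as accuracy", here is a minimal sketch of a user-defined metric. The (preds, targets) argument order is an assumption based on this thread, and error_rate is a hypothetical name; in practice the inputs would be JAX arrays rather than plain lists.

```python
def error_rate(preds, targets):
    # Hypothetical custom metric with the (preds, targets) signature
    # that Fortuna's built-in `accuracy` is assumed to use here.
    # Returns the fraction of mismatched predictions.
    total = len(targets)
    wrong = sum(1 for p, t in zip(preds, targets) if p != t)
    return wrong / total
```

You would then pass it alongside (or instead of) accuracy, e.g. FitMonitor(metrics=(accuracy, error_rate)).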
About the callback: yes, look at FitCallback. This is also part of the FitConfig object. You can do something like
fit_config = FitConfig(
monitor=FitMonitor(metrics=(accuracy,)),
callbacks=[FitCallback(...)]
)
Look here to check how FitCallback works. You can set a custom callback at train_epoch_start, train_epoch_end, or train_step_end. In any case, if all you want is accuracy and loss, you won't need this.
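For the logging use case raised below, a callback could look roughly like the following sketch. The hook name train_epoch_end comes from this thread, but the exact FitCallback base-class interface and what gets passed to the hook are assumptions; the wandb-style logger is injected as a plain function so the sketch stays self-contained.

```python
class WandbLoggingCallback:
    """Hedged sketch of a per-epoch logging callback.

    In practice this would subclass Fortuna's FitCallback; the hook
    name follows the thread, but its signature is an assumption.
    """

    def __init__(self, log_fn):
        # log_fn would be wandb.log in a real setup
        self.log_fn = log_fn

    def train_epoch_end(self, metrics):
        # metrics assumed to be a dict such as {"val_loss": 0.5}
        self.log_fn(metrics)
```

You would then pass an instance via FitConfig(callbacks=[WandbLoggingCallback(wandb.log)]).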
Okay, great, thank you! I see that the loss and accuracy are printed during training, but I'm unsure how to go inside the training loop and, for every epoch, do something like wandb.log({"val_loss": val_loss}), etc. Any ideas?
@gianlucadetommaso yes this is good!