
add 'save' and 'load' to the learners and periodic saving to the Runner

Bas Nijholt requested to merge saving into master

tl;dr, this MR introduces:

  • learner.save, learner.load that saves/loads only the data
  • learner.copy_from(other_learner)
  • runner.start_periodic_saving(save_kwargs, interval)

This MR adds saving and loading methods to the learners: each learner now has a save and a load method that save and load only the learner's data.

There are two ways of naming the files:

  1. Using the fname argument in learner.save(fname=...)
  2. Setting the fname attribute, e.g. learner.fname = 'data/example.p', and then calling learner.save()

The second way must be used when saving the learners of a BalancingLearner.

By default the resulting pickle files are compressed; to turn this off, use learner.save(fname=..., compress=False).

learner = Learner2D(...)
learner.save(fname='filename.p')  # writes only the learner's data to 'filename.p'

learner = Learner2D(...)
learner.load(fname='filename.p')  # restores the saved data into this learner
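
To use the second way, set the fname attribute and call save() without a filename; compression can be switched off at the same time (the path below is illustrative):

learner = Learner2D(...)
learner.fname = 'data/example.p'
learner.save(compress=False)  # saves to learner.fname as an uncompressed pickle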

In a BalancingLearner one can, for example, do the following (here f is the function being learned):

from functools import partial

import adaptive
from adaptive import BalancingLearner

def combo_fname(val):
    return '__'.join([f'{k}_{v}' for k, v in val.items()])

combos = adaptive.utils.named_product(a=[1, 2], b=[1])
learners = []
for combo in combos:
    l = Learner(partial(f, combo=combo))  # any learner type, e.g. Learner1D
    l.fname = combo_fname(combo)
    learners.append(l)
learner = BalancingLearner(learners)
learner.load(folder='data_folder')

Then, the next time you add more parameter combinations to combos, the data is loaded for the learners that already have saved data.

This also adds learner.copy_from(learner2), which copies the data of learner2 into a new learner that (for example) uses different bounds or a different loss_per_interval.
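
A minimal sketch (the function f and the new bounds are illustrative):

from adaptive import Learner1D

learner = Learner1D(f, bounds=(-1, 1))
# ... the learner accumulates data, e.g. via a Runner or learner.load() ...
new_learner = Learner1D(f, bounds=(-2, 2))  # same function, wider bounds
new_learner.copy_from(learner)  # new_learner now starts with learner's data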

Finally, it adds periodic saving while running with the Runner.
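
A minimal sketch of periodic saving (the goal, filename, and interval are illustrative):

import adaptive

learner = adaptive.Learner1D(f, bounds=(-1, 1))
runner = adaptive.Runner(learner, goal=lambda l: l.loss() < 0.01)
runner.start_periodic_saving(save_kwargs=dict(fname='data/periodic.p'), interval=30)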

I've tested this over the last year in adaptive_tools and it seems to work nicely for me.

Still need to:

  • test with BalancingLearner
  • test with DataSaver
  • add doc-strings
  • add examples to the notebook
