
issue a warning instead of raising an exception

Bas Nijholt requested to merge notebook_warning into master

I am running into this problem when working in the notebook and using an ipyparallel.Client within a zmq.Context().

For context: I want to run many more learners than a single runner can handle.

For example, say 10000 learners on 200 cores. If generating a new point takes 50 ms, the (single-threaded) runner can hand each core a new point only once every 10 seconds (200 cores × 50 ms), so whenever a function evaluation finishes faster than that, the cores sit idle waiting for new points.
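Spelling out that back-of-the-envelope estimate (the numbers are the illustrative ones from the example above, not measurements):

    n_cores = 200        # workers pulling points from one runner
    ask_time = 0.050     # seconds for the runner to generate one new point

    points_per_second = 1 / ask_time       # 20 points/s from the single runner
    seconds_per_core = n_cores * ask_time  # 10 s between new points for any core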

The way I do that now is something along the lines of:

    def run_all_the_things(learners_params):
        # Imports live inside the function because it runs in a separate process.
        import ipyparallel
        import zmq
        import adaptive

        # Give this process its own client with an explicit zmq context.
        client = ipyparallel.Client(context=zmq.Context())

        learners = [adaptive.Learner2D(**p) for p in learners_params]

        # One BalancingLearner (and one Runner) per process.
        learner = adaptive.BalancingLearner(learners)
        runner = adaptive.Runner(
            learner,
            goal=lambda learner: all(l.n > 100 for l in learner.learners),
        )
        runner.run_sync()
        return [l.data for l in learner.learners]

    from concurrent.futures import ProcessPoolExecutor

    # i_learners: batches of Learner2D parameter dicts, one batch per process
    # (see the sketch after this snippet).
    ex = ProcessPoolExecutor(len(i_learners))

    futs = [ex.submit(run_all_the_things, i_learner) for i_learner in i_learners]
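For reference, i_learners is assumed here to be a list of batches of Learner2D keyword-argument dicts, built beforehand, and the per-batch data is gathered by waiting on the futures. A sketch only; the use of toolz.partition_all, the batch size of 50, and the name all_learner_params are assumptions, not part of the original snippet:

    import toolz

    # Hypothetical input: one keyword-argument dict per Learner2D, split into
    # batches of 50 so that each process runs its own BalancingLearner.
    i_learners = list(toolz.partition_all(50, all_learner_params))

    # ... submit as above, then wait for every batch and flatten the results.
    results = [fut.result() for fut in futs]
    data = list(toolz.concat(results))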

When doing this, the following error is raised:

    if in_ipynb() and not self.ioloop.is_running():
        raise RuntimeError('Run adaptive.notebook_extension() to use '
                           'the Runner in a Jupyter notebook.')
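The direction this MR takes is to turn that hard failure into a warning, so the Runner can still be used this way. Roughly (a sketch only; the exact message and warning category are whatever the actual diff ends up with):

    import warnings

    if in_ipynb() and not self.ioloop.is_running():
        # Warn and carry on instead of refusing to run outright.
        warnings.warn('Run adaptive.notebook_extension() to use '
                      'the Runner in a Jupyter notebook.')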
