
Resolve "(Learner1D) add possibility to use the direct neighbors in the loss"

Merged Jorn Hoofwijk requested to merge 119-add-second-order-loss-to-adaptive into master
1 file  +8  −6
@@ -149,11 +149,12 @@ curvature. To do this, you need to tell the learner to look at the curvature
 by specifying ``loss_per_interval``.

 .. jupyter-execute::

     from adaptive.learner.learner1D import (get_curvature_loss,
                                             uniform_loss,
                                             default_loss)
-    curvature_loss = curvature_loss()
-    learner = adaptive.Learner1D(f, bounds=(-1, 1), loss_per_interval=curvature_loss
+    curvature_loss = get_curvature_loss()
+    learner = adaptive.Learner1D(f, bounds=(-1, 1), loss_per_interval=curvature_loss)
     runner = adaptive.Runner(learner, goal=lambda l: l.loss() < 0.01)

 .. jupyter-execute::
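For reference, the corrected cell runs standalone as the sketch below. The body of f is an assumption (the tutorial defines its own sample function earlier on the page), and adaptive.runner.simple is used in place of adaptive.Runner so the script blocks in-process without needing an event loop:

    import adaptive
    import numpy as np
    from adaptive.learner.learner1D import get_curvature_loss

    def f(x):
        # Stand-in for the tutorial's sample function (assumption);
        # any 1D function with a sharp feature will do.
        return np.tanh(20 * x)

    # Build the curvature-based loss once and hand it to the learner,
    # exactly as in the "+" lines of the hunk above.
    curvature_loss = get_curvature_loss()
    learner = adaptive.Learner1D(f, bounds=(-1, 1),
                                 loss_per_interval=curvature_loss)

    # Run until the loss goal is met; runner.simple blocks until done.
    adaptive.runner.simple(learner, goal=lambda l: l.loss() < 0.01)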
@@ -181,13 +182,14 @@ including nearest neighboring intervals in this plot: We will look at 100 points
     learner_2 = adaptive.Learner1D(ff, (-1, 1), loss_per_interval=curvature_loss)

     npoints_goal = lambda l: l.npoints >= 100
-    adaptive.BlockingRunner(learner_h, goal=npoints_goal)
-    adaptive.BlockingRunner(learner_1, goal=npoints_goal)
-    adaptive.BlockingRunner(learner_2, goal=npoints_goal)
+    # adaptive.runner.simple is a non-parallel blocking runner.
+    adaptive.runner.simple(learner_h, goal=npoints_goal)
+    adaptive.runner.simple(learner_1, goal=npoints_goal)
+    adaptive.runner.simple(learner_2, goal=npoints_goal)

     (learner_h.plot().relabel('homogeneous')
      + learner_1.plot().relabel('euclidean loss')
-     + learner_2.plot().relabel('curvature loss'))
+     + learner_2.plot().relabel('curvature loss')).cols(2)
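Note that the leading + on the plot lines is holoviews' layout operator, not a diff marker. Put together, the comparison cell can be sketched standalone as below; the body of ff and the setup of the two reference learners are assumptions standing in for what the tutorial builds earlier (learner_h sampling homogeneously via uniform_loss, learner_1 using the default Euclidean loss):

    import adaptive
    import numpy as np
    from adaptive.learner.learner1D import (get_curvature_loss,
                                            uniform_loss,
                                            default_loss)

    def ff(x):
        # Stand-in for the tutorial's ff (assumption): a function with
        # a narrow peak that benefits from curvature-aware sampling.
        return x + np.exp(-x**2 / 0.01)

    curvature_loss = get_curvature_loss()
    learner_h = adaptive.Learner1D(ff, (-1, 1), loss_per_interval=uniform_loss)
    learner_1 = adaptive.Learner1D(ff, (-1, 1), loss_per_interval=default_loss)
    learner_2 = adaptive.Learner1D(ff, (-1, 1), loss_per_interval=curvature_loss)

    npoints_goal = lambda l: l.npoints >= 100
    # adaptive.runner.simple is a non-parallel blocking runner.
    for learner in (learner_h, learner_1, learner_2):
        adaptive.runner.simple(learner, goal=npoints_goal)

    # The plots are holoviews objects; .cols(2) arranges the three
    # labelled panels on a two-column grid instead of a single row.
    layout = (learner_h.plot().relabel('homogeneous')
              + learner_1.plot().relabel('euclidean loss')
              + learner_2.plot().relabel('curvature loss')).cols(2)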
More info about using custom loss functions can be found
in :ref:`Custom adaptive logic for 1D and 2D`.