Commit 7a4f44eb authored by Bas Nijholt

add loss_per_simplex example
@@ -320,6 +320,27 @@
learner = adaptive.LearnerND(ring, bounds=[(-1, 1), (-1, 1)])
runner = Runner(learner, goal)
```
Again, as with the `Learner1D`, it is possible to specify a custom loss function.
For example, the loss function used to find the iso-line in Fig. @fig:isoline (b) is:
```python
import numpy as np

def isoline_loss_function(level, sigma, priority):
    from adaptive.learner.learnerND import default_loss

    def gaussian(x, mu, sigma):
        return np.exp(-(x - mu) ** 2 / sigma ** 2 / 2)

    def loss(simplex, values, value_scale):
        values = np.array(values)
        # Mean distance of the simplex's values to the iso-level.
        dist = abs(level * value_scale - values).mean()
        L_default = default_loss(simplex, values, value_scale)
        # Boost the loss of simplices whose values lie near the iso-level.
        L_dist = priority * gaussian(dist, 0, sigma)
        return L_dist + L_default

    return loss

loss_per_simplex = isoline_loss_function(0.1, 0.4, 0.5)
```
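To see why this loss steers sampling toward the iso-line, consider the Gaussian weight in isolation (a minimal NumPy-only sketch; the simplex geometry and `default_loss` term are omitted, and the sample values below are illustrative):

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / sigma ** 2 / 2)

# Same parameters as in the example above.
level, sigma, priority = 0.1, 0.4, 0.5
value_scale = 1.0

# Mean distance to the iso-level for a simplex near it and one far away.
dist_near = abs(level * value_scale - np.array([0.08, 0.11, 0.12])).mean()
dist_far = abs(level * value_scale - np.array([0.90, 0.95, 1.00])).mean()

# The extra loss term is largest for simplices close to the iso-level,
# so the learner subdivides those simplices more often.
boost_near = priority * gaussian(dist_near, 0, sigma)
boost_far = priority * gaussian(dist_far, 0, sigma)
```

Here `boost_near` is close to the maximum value `priority`, while `boost_far` is much smaller, so refinement concentrates around the iso-line while the `default_loss` term still guarantees some exploration elsewhere.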
#### The BalancingLearner can run many learners simultaneously.
Frequently, more than one function (learner) needs to run at once; for this we have implemented the `BalancingLearner`, which takes a list of learners instead of a function.
This learner internally asks all child learners for points and chooses the point of the learner that maximizes the loss improvement, thereby balancing the resources over the different learners.
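The balancing strategy can be sketched without the library: ask each child for its candidate point and expected loss improvement, then forward the next evaluation to the most promising child. This is a simplified illustration of the idea, not adaptive's actual implementation:

```python
# Simplified sketch of the balancing strategy: each "child learner" is
# represented by a dict reporting a candidate point and the loss
# improvement it expects from evaluating that point.

def balance(children):
    """Pick the child whose candidate point improves the loss most."""
    best = max(children, key=lambda c: c["loss_improvement"])
    return best["name"], best["point"]

children = [
    {"name": "learner_a", "point": 0.3, "loss_improvement": 0.12},
    {"name": "learner_b", "point": -0.7, "loss_improvement": 0.45},
]
name, point = balance(children)  # → ("learner_b", -0.7)
```

Because the comparison happens on every request, a child whose loss has mostly converged automatically receives fewer evaluations than one that still reports large improvements.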