Commit 2ff52eb9 authored by Bas Nijholt's avatar Bas Nijholt

write about cquad and mathematica

parent 3f18f0fa
Pipeline #21084 passed
@@ -121,3 +121,22 @@
year={1998},
publisher={IOP Publishing}
}
@article{gonnet2010increasing,
title={Increasing the reliability of adaptive quadrature using explicit interpolants},
author={Gonnet, Pedro},
journal={ACM Transactions on Mathematical Software (TOMS)},
volume={37},
number={3},
pages={26},
year={2010},
publisher={ACM}
}
@misc{galassi1996gnu,
title={GNU Scientific Library, Release 2.0},
author={Galassi, Mark and Davies, Jim and Theiler, James and Gough, Brian and Jungman, Gerard and Alken, Patrick and Booth, Michael and Rossi, Fabrice},
year={1996}
}
@@ -37,8 +37,8 @@ A simple example is greedily optimizing continuity of the sampling by selecting
For a one-dimensional function this amounts to (1) constructing intervals between neighboring data points, (2) calculating the Euclidean distance spanned by each interval and assigning it to the candidate point inside that interval, and finally (3) picking the candidate point with the largest assigned distance.
In this paper, we describe a class of algorithms that rely on local criteria for sampling, such as in the previously mentioned example.
Here we associate a *local loss* to each of the *candidate points* within an interval, and choose the points with the largest loss.
Using this loss we can then quantify how well an interpolation of the data describes the underlying function.
In the case of the integration algorithm, the loss could simply be an error estimate.
The most significant advantage of these *local* algorithms is that they allow for easy parallelization and have a low computational overhead.
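As a concrete illustration of steps (1)–(3), a minimal sketch in Python could look as follows; the function `sample_adaptively` and its stopping criterion are ours, not part of any particular library.

```python
import numpy as np

def sample_adaptively(f, xmin, xmax, n_points):
    """Greedily sample f by maximizing the Euclidean-distance loss:
    (1) form intervals from neighboring points, (2) assign each interval's
    Euclidean length to the candidate point inside it (here the midpoint),
    and (3) evaluate f at the candidate with the largest loss."""
    xs = [xmin, xmax]
    ys = [f(xmin), f(xmax)]
    while len(xs) < n_points:
        # Local loss of each interval: distance between its end points in (x, y).
        losses = [np.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i])
                  for i in range(len(xs) - 1)]
        i = int(np.argmax(losses))
        x_new = (xs[i] + xs[i + 1]) / 2  # candidate point of the chosen interval
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    return np.array(xs), np.array(ys)

# A peaked function ends up being sampled densely around x = 0.
xs, ys = sample_adaptively(lambda x: 1 / (x**2 + 0.01), -1, 1, 100)
```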
#### We provide a reference implementation, the Adaptive package, and demonstrate its performance.
We provide a reference implementation, the open-source Python package called Adaptive[@Nijholt2019a], which has previously been used in several scientific publications[@vuik2018reproducing; @laeven2019enhanced; @bommer2019spin; @melo2019supercurrent].
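With the Adaptive package itself, the same one-dimensional sampling loop looks roughly like the snippet below; the names follow the package's documented API, but exact signatures may differ between versions.

```python
import adaptive

def peak(x):
    return 1 / (x**2 + 0.01)

# The Learner1D stores the data and assigns a local loss to every interval.
learner = adaptive.Learner1D(peak, bounds=(-1, 1))

# Run single-threaded until the largest loss is small enough;
# adaptive.Runner does the same while evaluating points in parallel.
adaptive.runner.simple(learner, goal=lambda l: l.loss() < 0.01)

xs, ys = zip(*sorted(learner.data.items()))
```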
@@ -57,7 +57,13 @@ Here the acquired data (i.e., the observations) are used to adjust the experimen
In a typical non-adaptive experiment, decisions on how to sample are made and fixed in advance.
#### Plotting and low-dimensional integration use local sampling.
Plotting a function between two bounds requires evaluating it at sufficiently many points such that, when neighboring points are connected, we get an accurate representation of the function values that were not explicitly calculated.
In order to minimize the number of points, one can use adaptive sampling routines.
For example, for one-dimensional functions, Mathematica implements a `FunctionInterpolation` class that takes the function, $x_\textrm{min}$, and $x_\textrm{max}$, and returns an object that has sampled the function more densely in regions of high curvature.
Subsequently, we can query this object for the interpolated value at any point between $x_\textrm{min}$ and $x_\textrm{max}$, or use it to plot the function without specifying a grid.
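For comparison, a rough Python analogue of this query-anywhere workflow (our sketch, not Mathematica's or Adaptive's API) is to interpolate the sampled data and evaluate the interpolant at arbitrary points:

```python
import numpy as np
from scipy.interpolate import interp1d

def peak(x):
    return 1 / (x**2 + 0.01)

# xs, ys could come from any adaptive sampler (e.g. the sketch shown earlier);
# a fixed grid is used here only to keep the example self-contained.
xs = np.linspace(-1, 1, 201)
ys = peak(xs)

# The interpolant plays the role of Mathematica's FunctionInterpolation object:
# it can be queried anywhere between x_min and x_max without specifying a grid.
interpolant = interp1d(xs, ys, kind="cubic")
print(interpolant(0.123))
```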
The `CQUAD` doubly-adaptive integration algorithm[@gonnet2010increasing] in the GNU Scientific Library[@galassi1996gnu] is a general-purpose integration routine which can handle most types of singularities.
In general, it requires more function evaluations than the integration routines in `QUADPACK`[@galassi1996gnu]; however, it succeeds more often for difficult integrands.
It is doubly adaptive because it estimates the error of each interval and can either split an interval into smaller intervals or add more points to it.
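For reference, QUADPACK-style adaptive quadrature is exposed in Python through `scipy.integrate.quad`; the returned error estimate plays the role of the loss in our terminology. The integrand below is our own example, chosen for its integrable singularity at $x=0$.

```python
from scipy.integrate import quad

# QUADPACK (wrapped by scipy.integrate.quad) adaptively subdivides the domain
# and returns an error estimate, the integration analogue of a local loss.
value, error_estimate = quad(lambda x: x**-0.5, 0, 1)  # integrable singularity at 0
print(value, error_estimate)  # approximately 2.0, with a small estimated error
```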
<!-- can refer to Mathematica's implementation -->
#### PDE solvers and computer graphics use adaptive meshing.