An R package for computing fast and accurate numerical derivatives.
In the past, I used numDeriv to compute numerical gradients. However, the results were not stable for some functions, and I could not investigate the source of this instability. Different step sizes yielded different results, and smaller step sizes were sometimes better, sometimes worse.
The `pnd` package was designed to offer a comprehensive toolkit containing popular algorithms for finite differences, numerical gradients, Jacobians, and Hessians.
Optimal step sizes and parallel evaluation of numerical derivatives translate directly to faster numerical optimisation and statistical inference.
This package has numDeriv-compatible syntax: simply replace the first letter of a numDeriv command with a capital one to get the improved command: `Grad`, `Jacobian`, or `Hessian`.
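As a sketch of this drop-in renaming (assuming `pnd::Jacobian()` and `pnd::Hessian()` accept `(func, x)` like their numDeriv counterparts, which the compatibility claim above suggests but this README does not show directly):

```r
# A vector-valued test function for the Jacobian
g <- function(x) c(sum(sin(x)), prod(cos(x)))

numDeriv::jacobian(g, 1:3)   # numDeriv spelling
pnd::Jacobian(g, 1:3)        # pnd spelling: capitalise the first letter

numDeriv::hessian(function(x) sum(sin(x)), 1:3)  # numDeriv spelling
pnd::Hessian(function(x) sum(sin(x)), 1:3)       # pnd spelling
```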
Here is how to compute the gradient of f(x) = sum(sin(x)) at the point x = (1, 2, 3, 4).
```r
f <- function(x) sum(sin(x))
x <- 1:4
names(x) <- c("Jan", "Feb", "Mar", "Apr")
numDeriv::grad(f, x)
#> [1]  0.5403023 -0.4161468 -0.9899925 -0.6536436
pnd::Grad(f, x)
#>        Jan        Feb        Mar        Apr
#>  0.5403023 -0.4161468 -0.9899925 -0.6536436
#> attr(,"step.size")
#>          Jan          Feb          Mar          Apr
#> 6.055454e-06 1.211091e-05 1.816636e-05 2.422182e-05
#> attr(,"step.size.method")
#> [1] "default"
```
The output contains diagnostic information about the chosen step size. Our function preserved the names of the input argument, unlike `grad`.
The default step size in many implementations is proportional to the argument value, and this is reflected in the default output. Should the user desire a fixed step size, this can be easily achieved with an extra argument named `h`:
```r
pnd::Grad(f, x, h = c(1e-5, 1e-5, 1e-5, 2e-5))
#>        Jan        Feb        Mar        Apr
#>  0.5403023 -0.4161468 -0.9899925 -0.6536436
#> attr(,"step.size")
#>   Jan   Feb   Mar   Apr
#> 1e-05 1e-05 1e-05 2e-05
#> attr(,"step.size.method")
#> [1] "user-supplied"
```
Finally, it is easy to request an algorithmically chosen optimal step size. Here is how to do it with the Stepleman–Winarsky (1979) rule, named `"SW"`, which works well in practice:
```r
pnd::Grad(f, x, h = "SW")
#>          Jan          Feb          Mar          Apr
#>    0.5403023   -0.4161468   -0.9899925   -0.6536436
#> attr(,"step.size")
#>          Jan          Feb          Mar          Apr
#> 5.048535e-06 1.000000e-05 7.500000e-06 1.000000e-05
#> attr(,"step.size.method")
#> [1] "SW"
```
Extensive diagnostics and error estimates can be requested at any time: `pnd::Grad(f, x, h = "SW", report = 2)` will contain the step-search path for each coordinate of `x`. Use `report = 0` to produce just the numerical gradient without any attributes, like `numDeriv::grad` would.
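A minimal sketch of the bare-output mode, reusing `f` and `x` from the example above:

```r
# report = 0 strips the step-size attributes, yielding a plain numeric
# vector comparable to numDeriv::grad(f, x)
f <- function(x) sum(sin(x))
x <- 1:4
pnd::Grad(f, x, h = "SW", report = 0)
```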
This package is supported by 3 vignettes:
The following articles provide the theory behind the methods implemented in this package:
This package currently exists only on GitHub. To install it, run the following two commands:
```r
install.packages("devtools")
devtools::install_github("Fifis/pnd")
```
To load this package, include this line in the code:
```r
library(pnd)
```
This package is almost dependency-free; the `parallel` library belongs to the base group and is included in most R distributions.
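The parallel evaluation mentioned at the beginning can be sketched as follows; note that the `cores` argument name is an assumption based on the package description and is not confirmed by this README:

```r
# Hypothetical sketch: distributing the finite-difference evaluations of an
# expensive function across CPU cores via the base `parallel` machinery.
# The `cores` argument is an assumed name, not documented in this README.
slowf <- function(x) {
  Sys.sleep(0.1)   # simulate an expensive model evaluation
  sum(sin(x))
}
pnd::Grad(slowf, 1:4, cores = 2)
```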
This software is released under the free/open-source EUPL 1.2 licence.