Hi,

I am wondering whether there is a way to combine autograd with numerical gradients computed by the finite-difference method.

Specifically, for the modified Bessel function, I also want to compute the gradient with respect to the order $\nu$. That gradient is simple to compute with finite differences but hard to get from autograd. How can I wrap a finite-difference gradient as an autograd function and use it in `torch.nn.Module` subclasses?

Thanks!