NVIDIA Warp - Tesseract Wrapper (+ Auto-Diff)

Got it; there should be no ambiguity here: inside a Tesseract, you specify how that component is to be differentiated when you implement the jacobian, jacobian_vector_product, and vector_jacobian_product endpoints.

You can do that without any AD at all, for example by implementing finite differencing yourself; you can write custom code for those endpoints using your AD framework of choice; or (for PyTorch and JAX) you can use our “recipes”, which generate the endpoints mentioned above automatically from your implementation of apply (see for instance this JAX example, where the user only needed to implement apply, and the implementations of vjp/jvp and so on were pre-filled from a template by tesseract-core).
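As a minimal sketch of the first option, here is how the jacobian_vector_product endpoint could be backed by forward finite differencing built only on top of an existing apply. The function names mirror the endpoint names above, but the signatures are illustrative assumptions, not the actual tesseract-core API:

```python
def apply(x):
    # Hypothetical example component: f(x) = [x0**2, x0 * x1].
    # In a real Tesseract this would be your apply endpoint.
    return [x[0] ** 2, x[0] * x[1]]

def jacobian_vector_product(x, v, h=1e-6):
    """Approximate J(x) @ v without any AD, via forward finite differences:
    (f(x + h*v) - f(x)) / h."""
    fx = apply(x)
    fx_shifted = apply([xi + h * vi for xi, vi in zip(x, v)])
    return [(a - b) / h for a, b in zip(fx_shifted, fx)]

# For f above, J(x) = [[2*x0, 0], [x1, x0]]; at x = (3, 2),
# J @ (1, 0) should be close to [6, 2].
jvp = jacobian_vector_product([3.0, 2.0], [1.0, 0.0])
```

The same idea extends to vector_jacobian_product (one finite-difference pass per output, or a transposed formulation), at the usual cost and accuracy trade-offs of finite differencing.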

If you don’t implement any differentiable endpoints, tesseract-core will not provide a fallback, and that Tesseract will simply be non-differentiable.
