🎉 Tesseract Core is now public

:point_right: Visit on GitHub | Read the docs | Talk to the community | Simple examples | Expert showcase :point_left:

Today we are releasing Tesseract Core to the world, a free and open-source tool to define, build, and execute Tesseracts.

At Pasteur Labs, we are continuously faced with the challenge of bridging the gap between research-grade code and production systems. Tesseracts accelerate this process significantly, allowing users to package machine learning models, differentiable solvers, physical simulators, and much more into components that are readily usable, effortless to deploy in production, and expose a uniform, self-validating API, all without writing excessive boilerplate.
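
To give a flavor of how little boilerplate is involved, here is a minimal sketch of a Tesseract definition in the style of the project's examples. The schemas are plain Pydantic models, which is where the self-validating API comes from; treat the module path `tesseract_core.runtime` and the `Differentiable` / `Float64` annotations as assumptions drawn from the docs, not an authoritative template.

```python
# tesseract_api.py - a minimal Tesseract definition (illustrative sketch)
from pydantic import BaseModel

# NOTE: module path and type names are assumptions based on the docs
from tesseract_core.runtime import Differentiable, Float64

class InputSchema(BaseModel):
    x: Differentiable[Float64]
    y: Differentiable[Float64]

class OutputSchema(BaseModel):
    result: Differentiable[Float64]

def apply(inputs: InputSchema) -> OutputSchema:
    # Rosenbrock function, matching the optimization example below
    return OutputSchema(
        result=(1.0 - inputs.x) ** 2 + 100.0 * (inputs.y - inputs.x**2) ** 2
    )
```

Building a container image around this file and serving it is then handled by the `tesseract` CLI (for example, `tesseract build` and `tesseract serve`).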

In particular, Tesseracts provide native support for endpoints that interface with automatic differentiation frameworks (including but not limited to JAX and PyTorch). This allows users to propagate gradients across entire Tesseract pipelines, enabling Differentiable Physics Programming (DPP) at scale. We believe this is an important step towards making AI-simulator hybrid systems usable in real-world scenarios.
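
Concretely, a Tesseract author exposes gradients by adding optional AD endpoints next to `apply`. The sketch below uses JAX; the endpoint signature and the nested output-to-input dictionary layout are assumptions modeled on the client call shown further down, so consult the docs for the canonical form.

```python
# tesseract_api.py (continued) - optional differentiable endpoint (sketch)
import jax

def _rosenbrock(x, y):
    return (1.0 - x) ** 2 + 100.0 * (y - x**2) ** 2

def jacobian(inputs: InputSchema, jac_inputs: set[str], jac_outputs: set[str]) -> dict:
    # Differentiate with respect to both scalar inputs via JAX autodiff
    dx, dy = jax.grad(_rosenbrock, argnums=(0, 1))(inputs.x, inputs.y)
    partials = {"x": dx, "y": dy}
    # Nested dict keyed output -> input, matching what clients unpack below;
    # this toy component has a single output, "result"
    return {"result": {name: partials[name] for name in jac_inputs}}
```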

For example, Tesseracts enable researchers to perform gradient-based, adaptive optimization involving multiple differentiable components in a pipeline, even in settings where components execute on different machines (for example, in the cloud or on a GPU compute cluster), with only a few lines of code:

[Animation: rosenbrock-opt]
Optimizing a function using gradient information exposed via a Tesseract. Full example

See code example:

```python
import numpy as np
import scipy.optimize
from tesseract_core import Tesseract

# connect to remote Tesseract
tesseract = Tesseract(url="http://localhost:8000")

# define optimized functions that dispatch to Tesseract
def rosenbrock(x: np.ndarray) -> float:
    output = tesseract.apply({"x": x[0], "y": x[1]})
    return output["result"].item()

def rosenbrock_gradient(x: np.ndarray) -> np.ndarray:
    output = tesseract.jacobian(
        {"x": x[0], "y": x[1]}, jac_inputs=["x", "y"], jac_outputs=["result"]
    )
    return np.array([output["result"]["x"], output["result"]["y"]])

# perform optimization
trajectory = []
result = scipy.optimize.minimize(
    rosenbrock,
    x0=np.array([-0.5, 1.0]),
    method="BFGS",
    jac=rosenbrock_gradient,
    options={"disp": True},
    callback=lambda xs: trajectory.append(xs),
)
```
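
Note that this snippet assumes a Rosenbrock Tesseract has already been built and is being served at http://localhost:8000 (for example, via the `tesseract serve` CLI command); from scipy's perspective, the remote component then behaves like any local objective function.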

Tesseracts separate the concerns of implementing, building, and executing components, and enable computation and optimization in diverse contexts, such as plasma physics simulations running on cloud GPUs, executed independently from the optimizer and other components:

[Animation: tesseract-plasma-opt]
Fitting a Thomson scattering diagnostic model using a second-order optimization method. Full example
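
The full plasma example is linked above. As a generic sketch of how a second-order-style fit can consume a Tesseract-provided Jacobian, one could hand scipy's least-squares machinery both the residuals and their Jacobian; the input/output names (`params`, `spectrum`), the data file, and the served URL here are all hypothetical.

```python
import numpy as np
import scipy.optimize
from tesseract_core import Tesseract

# Hypothetical diagnostic forward model served as a Tesseract
model = Tesseract(url="http://localhost:8000")
observed = np.load("observed_spectrum.npy")  # hypothetical measured data

def residuals(theta: np.ndarray) -> np.ndarray:
    predicted = model.apply({"params": theta})["spectrum"]
    return np.asarray(predicted) - observed

def residual_jacobian(theta: np.ndarray) -> np.ndarray:
    jac = model.jacobian(
        {"params": theta}, jac_inputs=["params"], jac_outputs=["spectrum"]
    )
    return np.asarray(jac["spectrum"]["params"])

# Trust-region least squares uses the Jacobian to build local quadratic models
result = scipy.optimize.least_squares(
    residuals, x0=np.ones(4), jac=residual_jacobian, method="trf"
)
```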

Ultimately, Tesseracts really shine in complex optimization pipelines, where multiple differentiable components need to interact with each other. Users can build non-trivial pipelines of differentiable components without worrying about where and how to deploy them to compute resources, or how to handle communication between wildly different pieces of software.


[Diagram]
Multiple Tesseracts collaborating on a complex gradient computation (backpropagation), enabling orders-of-magnitude more efficient optimization, design of experiments, optimal control, and inverse design.
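
To make "collaborating on a gradient computation" concrete, here is a hedged sketch that backpropagates through two Tesseracts by hand, using only the client calls from the example above. The URLs and the scalar input/output names (`x`, `result`) are illustrative assumptions; real pipelines involve arrays and richer wiring.

```python
import numpy as np
from tesseract_core import Tesseract

# Two pipeline stages, possibly running on different machines
first = Tesseract(url="http://localhost:8000")
second = Tesseract(url="http://localhost:8001")

def pipeline(x: float) -> float:
    # Forward pass: feed the first component's output into the second
    y = first.apply({"x": x})["result"]
    return second.apply({"x": y})["result"]

def pipeline_gradient(x: float) -> float:
    # Backward pass by hand: the chain rule multiplies the stage Jacobians
    y = first.apply({"x": x})["result"]
    dy_dx = first.jacobian({"x": x}, jac_inputs=["x"], jac_outputs=["result"])
    dz_dy = second.jacobian({"x": y}, jac_inputs=["x"], jac_outputs=["result"])
    return float(np.asarray(dz_dy["result"]["x"]) * np.asarray(dy_dx["result"]["x"]))
```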

The release of Tesseract Core marks the beginning of the Tesseract ecosystem. There are many more tutorials, case studies, toolkits, and applications we can’t wait to release to the world. In the meantime, head over to GitHub to get started with Tesseracts, and let us know what you think!
