⚗️ Physics-Informed Bayesian Optimization Platform
Design experiments efficiently by combining physics models with Gaussian Process surrogates. The physics model acts as a structured prior (GP mean function), and the GP learns the residual — dramatically reducing the number of experiments needed.
Backends: BoTorch · GPyTorch · AX · BoFire
How it works
Traditional Bayesian optimization uses a GP with a flat (constant) mean. This platform replaces that constant mean with a physics model:
$$f(x) = \phi(x) + \varepsilon(x)$$
where $\phi(x)$ is the physics model and $\varepsilon(x) \sim \mathcal{GP}(0, k(x,x'))$ captures the residual (model discrepancy + noise).
Benefits
- Sample efficiency — physics captures the trend; the GP only learns small deviations.
- Extrapolation — physics provides reasonable predictions outside observed data.
- Constraint awareness — physical constraints steer the search toward feasible regions.
- Graceful degradation — works physics-only (no data), hybrid, or pure GP.
Surrogate mode selection
| Data | Physics model | Mode |
|---|---|---|
| None | ✓ | physics_only |
| < 20 | ✓ | physics_as_mean |
| 20-50 | ✓ | weighted_ensemble |
| Any | ✗ | gp_only |
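The selection rule in the table can be sketched as a small dispatch function. The function name and the fallback for more than 50 points with a physics model (not covered by the table) are assumptions:

```python
def select_mode(n_points: int, has_physics: bool) -> str:
    """Pick a surrogate mode from the data count and physics availability,
    using the thresholds from the table above."""
    if not has_physics:
        return "gp_only"
    if n_points == 0:
        return "physics_only"
    if n_points < 20:
        return "physics_as_mean"
    if n_points <= 50:
        return "weighted_ensemble"
    # Above 50 points the table is silent; assumed here that the data-rich
    # regime falls back to a pure GP.
    return "gp_only"
```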
Stack
PyTorch · GPyTorch · BoTorch · AX Platform · BoFire
Built by Plinity — infinite recyclable polymers