
Physics-Informed Neural Networks: When Deep Learning Learns the Laws of Nature

There is a fundamental tension at the heart of applying machine learning to physical sciences.

On one side: classical physics solvers — finite difference, finite element, spectral element methods — that are accurate, interpretable, and grounded in centuries of mechanics. On the other: neural networks that are flexible, fast at inference, and powerful pattern recognizers, but that know nothing about Newton's laws, elasticity, or fault friction unless we explicitly tell them.

For years, the ML-in-geophysics literature defaulted to one of two extremes: either run a classical solver and use ML only as a post-processing tool, or collect large labeled datasets and train a purely data-driven model that treats the physics as a black box.

Physics-Informed Neural Networks (PINNs) break this dichotomy. They embed governing equations — wave equations, elastodynamic PDEs, friction laws — directly into the neural network's loss function. The result is a model that must satisfy the physics of the problem during training, not just fit the data.

This is not a minor algorithmic tweak. It changes what these models can do, how much data they need, and crucially, whether we can trust what they learn.

The Core Idea: Physics as a Penalty

A standard supervised neural network learns a mapping $u_\theta: \mathcal{X} \to \mathcal{Y}$ by minimizing a data loss:

$$\mathcal{L}_\text{data} = \frac{1}{N} \sum_{i=1}^{N} \left\| u_\theta(\mathbf{x}_i) - u_i \right\|^2$$

A PINN adds a second term — the physics residual — that penalizes violations of the governing PDE:

$$\mathcal{L}_\text{physics} = \frac{1}{N_r} \sum_{j=1}^{N_r} \left\| \mathcal{N}\left[ u_\theta(\mathbf{x}_j, t_j) \right] - f(\mathbf{x}_j, t_j) \right\|^2$$

where $\mathcal{N}[\cdot]$ is a differential operator encoding the physical law — the Laplacian, the wave operator, the elastodynamic equations — evaluated at a set of randomly sampled collocation points $(\mathbf{x}_j, t_j)$.

The full training objective is then:

$$\mathcal{L} = \mathcal{L}_\text{data} + \lambda \, \mathcal{L}_\text{physics} + \mathcal{L}_\text{BC}$$

where $\mathcal{L}_\text{BC}$ penalizes violations of boundary and initial conditions, and $\lambda$ is a weighting hyperparameter.
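To make the objective concrete, here is a minimal sketch for a toy problem, the ODE $u' = u$ on $[0, 1]$ with $u(0) = 1$ (solution $e^x$). A two-parameter trial function stands in for the network, and its derivative is written by hand where a real PINN would use automatic differentiation:

```python
import numpy as np

# Trial function u_theta(x) = th0 * exp(th1 * x) standing in for the network.
def u(x, th):
    return th[0] * np.exp(th[1] * x)

def u_x(x, th):  # analytic derivative; a real PINN gets this from autodiff
    return th[0] * th[1] * np.exp(th[1] * x)

def pinn_loss(th, lam=1.0):
    xc = np.linspace(0.0, 1.0, 32)                      # collocation points
    physics = np.mean((u_x(xc, th) - u(xc, th)) ** 2)   # ODE residual u' - u
    bc = (u(0.0, th) - 1.0) ** 2                        # initial condition
    return lam * physics + bc

# The exact solution (th0=1, th1=1) zeroes every term of the objective;
# any parameter choice that violates the ODE is penalized.
assert pinn_loss(np.array([1.0, 1.0])) < 1e-24
assert pinn_loss(np.array([1.0, 1.3])) > 1e-3
```

The same structure scales up: swap the trial function for an MLP, the ODE residual for a PDE residual, and the hand-written derivative for autodiff.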

The key insight: the network learns to satisfy the physics everywhere in the domain, not just at labeled training points. Derivatives are computed exactly via automatic differentiation — no finite differences, no discretization error.
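Automatic differentiation is what makes those residual derivatives exact. A toy forward-mode implementation using dual numbers (a teaching sketch, not any production autodiff framework) shows the mechanism:

```python
import math

class Dual:
    """Forward-mode autodiff value: carries f(x) and f'(x) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):  # product rule, applied exactly
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def sin(x):  # chain rule, applied exactly: no finite-difference step size
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def u(x):
    # A tiny stand-in "network": u(x) = sin(3x) + 0.5x
    return sin(3 * x) + 0.5 * x

x0 = Dual(1.2, 1.0)                      # seed dx/dx = 1
out = u(x0)
exact = 3 * math.cos(3 * 1.2) + 0.5      # analytic u'(1.2)
assert abs(out.dot - exact) < 1e-12      # exact to machine precision
```

Frameworks generalize this idea to millions of parameters and higher-order derivatives, which is how PINN residuals like $\partial^2 u / \partial t^2$ are evaluated without any mesh.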

This was formalized by Raissi, Perdikaris, and Karniadakis in their landmark 2019 Journal of Computational Physics paper, and the field has since exploded.

Why Geophysics Is a Natural Home for PINNs

Geophysics faces a combination of challenges that makes it nearly ideal for physics-informed approaches:

  1. Sparse observations: We cannot place sensors inside a fault zone 10 km underground. We observe the Earth's surface and infer the interior — a deeply ill-posed inverse problem.
  2. Expensive simulations: High-fidelity dynamic rupture or seismic wave simulations take hours to days per run on HPC clusters.
  3. Well-characterized physics: We have robust governing equations — the elastodynamic wave equation, friction laws, the eikonal equation — derived from decades of geomechanics research.
  4. Uncertainty everywhere: Stress states, frictional properties, and fault geometry are uncertain at depth.

PINNs address all four simultaneously: they can be trained on sparse surface observations, they are fast at inference once trained, they encode the known physics, and they enable principled uncertainty quantification.

Seismic Wave Propagation

The seismic wave equation — governing how elastic energy propagates through the Earth — is one of the most important PDEs in geophysics:

$$\rho(\mathbf{x}) \frac{\partial^2 \mathbf{u}}{\partial t^2} = \nabla \cdot \boldsymbol{\sigma}(\mathbf{u}) + \mathbf{f}(\mathbf{x}, t)$$

where $\mathbf{u}$ is the displacement field, $\boldsymbol{\sigma}$ is the stress tensor related to $\mathbf{u}$ through the elastic constitutive law, $\rho$ is density, and $\mathbf{f}$ is the source term.
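As a sanity check on the residual idea: in one dimension, a plane wave $u(x,t) = \sin(k(x - ct))$ satisfies the scalar wave equation $\rho\, u_{tt} = \mu\, u_{xx}$ whenever $c = \sqrt{\mu/\rho}$. The material values below are illustrative, and central differences stand in for autodiff:

```python
import numpy as np

rho, mu_mod = 2700.0, 3.0e10        # illustrative density (kg/m^3), shear modulus (Pa)
c = np.sqrt(mu_mod / rho)           # shear wavespeed ~3.3 km/s
k = 2 * np.pi / 1000.0              # wavenumber for a 1 km wavelength

def u(x, t):                        # plane wave traveling in +x
    return np.sin(k * (x - c * t))

# Second derivatives by central differences (a PINN would use autodiff).
x0, t0, h = 350.0, 0.02, 1e-3
u_tt = (u(x0, t0 + h) - 2 * u(x0, t0) + u(x0, t0 - h)) / h**2
u_xx = (u(x0 + h, t0) - 2 * u(x0, t0) + u(x0 - h, t0)) / h**2

# The wave-equation residual vanishes up to discretization error.
residual = rho * u_tt - mu_mod * u_xx
assert abs(residual) < 1e-3 * mu_mod * k**2
```

This residual, evaluated at collocation points throughout the domain, is exactly what $\mathcal{L}_\text{physics}$ penalizes.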

Classical solvers like SPECFEM discretize this on a mesh. They are accurate but require the material model — the density $\rho(\mathbf{x})$ and seismic velocities — to be known in advance.

Rasht-Behesht et al. (2022) published the first comprehensive study of PINNs for seismic wave propagation and full waveform inversion (FWI) in the Journal of Geophysical Research: Solid Earth. Their key finding: for the forward problem, SPECFEM remains more efficient; but for inversion — recovering the subsurface velocity structure from observed waveforms — PINNs are competitive and naturally handle the absorbing boundary conditions that classical FWI struggles with [1].

SeismicNet (Ren et al., 2023) extended this to semi-infinite domains — physically realistic because the Earth has no outer boundary — using temporal domain decomposition to make PINN training scalable [2]. The code is openly available and has become a benchmark in the field.

For non-smooth, geologically realistic media (sharp velocity contrasts at rock boundaries), FF-PINN (2024) addresses the "spectral bias" problem in standard coordinate networks — where MLPs prefer low-frequency solutions — by incorporating Fourier features into the network architecture, enabling accurate modeling of high-frequency wavefields in the Marmousi and Overthrust benchmark models [3].
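The Fourier-feature fix itself is a small preprocessing step: project the input coordinates through random frequencies before the MLP sees them. The sketch below follows the generic random-feature recipe; FF-PINN's exact architecture and frequency scales may differ:

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_features(x, B):
    """Map coordinates x of shape (N, d) to features of shape (N, 2m).

    B (m, d) holds random frequencies; larger entries bias the downstream
    MLP toward higher-frequency solutions, countering spectral bias.
    """
    proj = 2 * np.pi * x @ B.T                       # (N, m) phase matrix
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=-1)

B = rng.normal(scale=10.0, size=(64, 2))   # scale sets the frequency band
x = rng.uniform(size=(128, 2))             # e.g. (x, z) collocation points
feats = fourier_features(x, B)

assert feats.shape == (128, 128)           # 2 coords -> 128 features
```

The network then consumes `feats` instead of raw coordinates; everything else in the PINN loss is unchanged.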

Fault Mechanics and Earthquake Physics

This is where PINNs become most directly relevant to earthquake science — and where I find the parallels to my own research most striking.

In my 2019 paper with Eric Daub, we used Random Forests and ANNs as surrogate models: we ran 1,600 dynamic rupture simulations governed by the Linear Slip-Weakening Law and trained ML models to predict the outcome (propagation vs. arrest) without running the physics solver. The models achieved >81% accuracy and reduced inference time from hours to under a second.

PINNs take this a step further. Instead of treating the physics solver as a black box and learning its input-output mapping, PINNs directly encode the governing equations of fault mechanics into training.

Crustal Deformation

Okazaki et al. (2022) published a PINN approach for crustal deformation governed by the elastostatic equations in Nature Communications [4]. A key challenge they overcame: because neural networks produce smooth outputs, PINNs struggle to represent the displacement discontinuity across the fault plane. They handled this with a polar coordinate transformation near the fault tip, enabling accurate modeling of the stress singularity.

Their 2025 follow-up in JGR: Machine Learning and Computation extended this to in-plane (normal and reverse fault) geometries with arbitrary slip distributions — and demonstrated that the PINN can invert for fault slip from surface GPS/InSAR observations [5].

Fault Friction: The Rate-and-State Law

Perhaps the most physically rich recent development is the application of PINNs to rate-and-state friction — the friction law that governs slow creep, stick-slip, and the nucleation of dynamic rupture.

The rate-and-state friction law couples slip velocity $V$ and a state variable $\theta$ (representing the evolving contact population on the fault surface):

$$\mu(V, \theta) = \mu_0 + a \ln\left(\frac{V}{V_0}\right) + b \ln\left(\frac{V_0 \theta}{D_c}\right)$$

$$\frac{d\theta}{dt} = 1 - \frac{V\theta}{D_c}$$
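A consequence worth checking numerically: at steady state ($d\theta/dt = 0$) the state variable is $\theta_{ss} = D_c/V$, and the friction law reduces to $\mu_{ss} = \mu_0 + (a - b)\ln(V/V_0)$, so the sign of $a - b$ separates velocity weakening from velocity strengthening. The parameter values below are illustrative, not taken from any cited study:

```python
import math

# Illustrative rate-and-state parameters (lab-scale orders of magnitude).
mu0, a, b = 0.6, 0.010, 0.015   # reference friction and direct/evolution effects
V0, Dc = 1e-6, 1e-5             # reference velocity (m/s), critical slip distance (m)

def mu(V, theta):
    """Rate-and-state friction coefficient."""
    return mu0 + a * math.log(V / V0) + b * math.log(V0 * theta / Dc)

V = 1e-5                        # imposed slip velocity (m/s)
theta_ss = Dc / V               # steady state of the aging law: d(theta)/dt = 0

# Steady-state friction collapses to mu0 + (a - b) ln(V / V0).
assert abs(mu(V, theta_ss) - (mu0 + (a - b) * math.log(V / V0))) < 1e-12
assert a - b < 0                # velocity-weakening: the regime that can nucleate stick-slip
```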

These equations are encoded directly into the PINN loss alongside the elastodynamic equations. Ren, Karniadakis et al. (2024) demonstrated that this multi-network PINN framework can solve both forward and inverse problems for rate-and-state faults in Computer Methods in Applied Mechanics and Engineering [6].

More strikingly, Borate et al. (2023) used PINNs trained on passive acoustic emissions from laboratory shear experiments to predict laboratory earthquakes — and found the physics-constrained model significantly outperformed purely data-driven approaches when training data was limited [7]. The physics was not decoration. It was load-bearing.

Seismic Tomography and Inversion

The eikonal equation governs seismic traveltime:

$$\left\| \nabla T(\mathbf{x}) \right\| = \frac{1}{v(\mathbf{x})}$$

where $T(\mathbf{x})$ is the first-arrival traveltime and $v(\mathbf{x})$ is the seismic velocity field.
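For a homogeneous medium, $T(\mathbf{x}) = \|\mathbf{x} - \mathbf{x}_s\|/v$ solves the eikonal equation exactly, which makes the residual a PINN would penalize easy to verify (the velocity and points below are illustrative):

```python
import numpy as np

v = 3.0                              # homogeneous velocity (km/s)
xs = np.array([0.0, 0.0])            # source location

def T(x):
    """Analytic first-arrival traveltime for a homogeneous medium."""
    return np.linalg.norm(x - xs) / v

# Gradient by central differences (a PINN would use autodiff on T_theta).
x, h = np.array([2.0, 1.5]), 1e-6
grad = np.array([(T(x + h * e) - T(x - h * e)) / (2 * h) for e in np.eye(2)])

# Eikonal residual ||grad T|| - 1/v vanishes for the exact solution.
assert abs(np.linalg.norm(grad) - 1.0 / v) < 1e-6
```

In eikonal tomography the network outputs $T_\theta(\mathbf{x})$, this residual is enforced at collocation points, and $v(\mathbf{x})$ is recovered jointly from sparse traveltime picks.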

Encoding this into a PINN loss enables eikonal tomography without a mesh or ray tracing. Chen et al. (2022) applied this to real field data — Rayleigh wave phase velocity across the Tibetan Plateau — and recovered results competitive with classical eikonal tomography from the ChinArray II dataset [8].

For exploration seismology, FWIGAN (Yang & Ma, 2023) combined a physics-based wave equation generator with a Wasserstein GAN critic to perform full waveform inversion with no labeled training data — outperforming traditional FWI on three benchmark geological models and proving robust to poor initial models and data noise [9].

PINNs vs. Classical Solvers: When to Use Which

A direct benchmark published in the IMA Journal of Applied Mathematics (2024) addressed the obvious question: can PINNs beat the finite element method? [10]

The honest answer: not for smooth forward problems. FEM remains more computationally efficient and accurate for well-posed, smooth forward simulations.

But PINNs win in specific scenarios:

| Problem Type | Classical Solver | PINN |
| --- | --- | --- |
| Forward simulation (smooth) | Fast, accurate | Slower, competitive |
| Inverse problems (sparse data) | Ill-posed, needs regularization | Naturally regularized by physics |
| Irregular geometries | Meshing required | Meshless |
| Uncertainty quantification | Ensemble methods needed | Probabilistic extensions available |
| Scarce labeled data | Struggles | Compensates with physics |
| Parameter recovery from observations | Iterative, expensive | Joint inversion in one pass |

The emerging consensus (Schuster, Chen & Feng review, 2024) is that PINNs are not replacements for physics solvers — they are a new tool for a different class of problems, principally inverse problems and data-scarce regimes [11].

What Comes Next

The field is moving fast. A few developments I am watching closely:

Neural Operators. The Fourier Neural Operator (FNO) and its variants learn mappings between function spaces rather than point-to-point mappings. Lehmann et al. (2024) trained a Factorized FNO on 30,000 3D elastic seismic simulations and achieved real-time 3D surface wavefield prediction — something previously impossible [12]. Inference takes milliseconds versus hours.

Foundation Models for Seismology. The Annual Reviews survey by Mousavi & Beroza (2023) explicitly identifies foundation model pre-training on large seismic waveform archives as a near-term frontier [13]. We are likely 2–3 years away from a "Seismic-GPT" trained on global continuous waveform data.

Uncertainty-Aware PINNs. Probabilistic extensions (P-PINNs) now provide full posterior distributions over recovered physical parameters — critical for seismic hazard assessment where risk decisions depend on the uncertainty, not just the best estimate [14].

From Lab to Field. PINNs trained on laboratory stick-slip data (Borate et al.) need to be validated against real fault zone monitoring data — a major open challenge as distributed acoustic sensing (DAS) arrays scale up.


The deeper point is this: the physics has not become less important as machine learning has grown more powerful. If anything, it has become more important — because it is the one thing that prevents these models from being arbitrary curve fits to noisy measurements.

When a PINN correctly recovers fault friction parameters from surface observations, or correctly identifies that shear stress drives rupture propagation (as our random forest did in 2019), it is not just fitting data. It is encoding mechanics.

That is a genuinely different kind of model.


References

  1. Rasht-Behesht, M., Huber, C., Shukla, K., & Karniadakis, G.E. (2022). Physics-Informed Neural Networks (PINNs) for Wave Propagation and Full Waveform Inversions. Journal of Geophysical Research: Solid Earth, 127(5), e2021JB023120. https://doi.org/10.1029/2021JB023120

  2. Ren, P., Rao, C., Chen, S., Wang, J.-X., Sun, H., & Liu, Y. (2023). SeismicNet: Physics-Informed Neural Networks for Seismic Wave Modeling in Semi-Infinite Domain. Computer Physics Communications, 295, 109010. https://doi.org/10.1016/j.cpc.2023.109010

  3. Zou, J. et al. (2024). Physics-Informed Neural Networks with Fourier Features for Seismic Wavefield Simulation in Time-Domain Nonsmooth Complex Media. IEEE Geoscience and Remote Sensing Letters. https://arxiv.org/abs/2409.03536

  4. Okazaki, T., Ito, T., Hirahara, K., & Ueda, N. (2022). Physics-Informed Deep Learning Approach for Modeling Crustal Deformation. Nature Communications, 13, 7092. https://doi.org/10.1038/s41467-022-34922-1

  5. Okazaki, T. et al. (2025). Physics-Informed Deep Learning for Forward and Inverse Modeling of Inplane Crustal Deformation. Journal of Geophysical Research: Machine Learning and Computation. https://doi.org/10.1029/2024JH000474

  6. Ren, P., Okazaki, T., Tobita, M., et al. (2024). Physics-Informed Deep Learning of Rate-and-State Fault Friction. Computer Methods in Applied Mechanics and Engineering, 430, 117217. https://doi.org/10.1016/j.cma.2024.117217

  7. Borate, P., Riviere, J., Marone, C., et al. (2023). Using a Physics-Informed Neural Network and Fault Zone Acoustic Monitoring to Predict Lab Earthquakes. Nature Communications, 14, 3693. https://doi.org/10.1038/s41467-023-39377-6

  8. Chen, Y. et al. (2022). Eikonal Tomography with Physics-Informed Neural Networks: Rayleigh Wave Phase Velocity in the Northeastern Margin of the Tibetan Plateau. Geophysical Research Letters, 49(21), e2022GL099053. https://doi.org/10.1029/2022GL099053

  9. Yang, F. & Ma, J. (2023). FWIGAN: Full-Waveform Inversion via a Physics-Informed Generative Adversarial Network. Journal of Geophysical Research: Solid Earth, 128. https://doi.org/10.1029/2022JB025493

  10. Grossmann, T.G. et al. (2024). Can Physics-Informed Neural Networks Beat the Finite Element Method? IMA Journal of Applied Mathematics, 89(1), 143. https://doi.org/10.1093/imamat/hxae011

  11. Schuster, G.T., Chen, Y., & Feng, S. (2024). Review of Physics-Informed Machine-Learning Inversion of Geophysical Data. Geophysics, 89(6), T337–T356. https://doi.org/10.1190/geo2023-0615.1

  12. Lehmann, F., Gatti, F., Bertin, M., & Clouteau, D. (2024). 3D Elastic Wave Propagation with a Factorized Fourier Neural Operator. Computer Methods in Applied Mechanics and Engineering. https://arxiv.org/abs/2304.10242

  13. Mousavi, S.M. & Beroza, G.C. (2023). Machine Learning in Earthquake Seismology. Annual Review of Earth and Planetary Sciences, 51, 105–129. https://doi.org/10.1146/annurev-earth-071822-100323

  14. Thibaut, R. et al. (2024). Probabilistic Physics-Informed Neural Network for Seismic Petrophysical Inversion. Geophysics. https://doi.org/10.1190/geo2023-0214.1

  15. Ahamed, S., & Daub, E.G. (2019). Application of Machine Learning Techniques to Predict Rupture Propagation and Arrest in 2-D Dynamic Earthquake Simulations. Geophysical Journal International, 216(3), 1977. https://doi.org/10.1093/gji/ggy500