1. Motivation and Scope

Physics‑informed machine learning (PIML) addresses a key limitation of purely data‑driven neural networks in scientific and engineering applications: lack of physical consistency and poor generalization under data scarcity. Many real‑world systems, such as fluid flows, solid mechanics, thermodynamics, and industrial machines, are governed by known (or partially known) physical laws, yet measurements are expensive, noisy, or incomplete. PIML aims to integrate first‑principles physics and data‑driven learning into a unified modeling paradigm, combining the interpretability and extrapolation capability of physics‑based models with the flexibility and efficiency of neural networks (see Braga‑Neto, 2024).

2. Taxonomy of Physics‑Informed Neural Networks

A widely adopted classification, articulated by Faroughi et al. and echoed across the other sources, distinguishes four main families.

2.1 Physics‑Guided Neural Networks (PgNNs)

Physics‑guided neural networks use standard supervised learning, but the training data are generated or curated to respect physical laws (e.g., from numerical solvers or experiments). The physics is implicit in the data, not explicitly enforced.

Strengths:

  • Simple to implement
  • Effective as fast surrogate models for expensive simulations

Limitations:

  • Require large, representative datasets
  • May violate physics outside the training domain
  • Limited extrapolation capability
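
These properties can be illustrated with a minimal sketch (a hypothetical example, not drawn from the reviewed sources): a purely data‑driven surrogate is fit to physically consistent, solver‑style samples of exponential decay, and the physics enters only through the training data. Inside the training domain the fit is accurate; outside it, nothing constrains the model to respect the physics.

```python
import numpy as np

# Hypothetical PgNN-style example: a data-driven surrogate trained on
# solver-generated samples of exponential decay u(t) = exp(-k t).
# The physics is implicit in the data, not enforced by the model.
k = 2.0

# "Solver" output: dense, physically consistent samples on t in [0, 1].
t_train = np.linspace(0.0, 1.0, 50)
u_train = np.exp(-k * t_train)

# Purely data-driven surrogate: a degree-5 polynomial least-squares fit.
coeffs = np.polyfit(t_train, u_train, deg=5)
surrogate = np.poly1d(coeffs)

# Inside the training domain the surrogate tracks the physics closely ...
t_in = np.linspace(0.0, 1.0, 200)
err_in = np.max(np.abs(surrogate(t_in) - np.exp(-k * t_in)))

# ... but extrapolation beyond the data may violate the physics badly.
t_out = np.linspace(1.5, 2.5, 200)
err_out = np.max(np.abs(surrogate(t_out) - np.exp(-k * t_out)))

print(f"max error inside training domain:  {err_in:.2e}")
print(f"max error outside training domain: {err_out:.2e}")
```

The polynomial stands in for a neural network purely for brevity; the qualitative behavior (good interpolation, unphysical extrapolation) is the point.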

2.2 Physics‑Informed Neural Networks (PINNs)

Physics‑informed neural networks incorporate governing equations, boundary conditions, and conservation laws directly into the loss function. Using automatic differentiation, the network is penalized for violating the underlying differential equations.

Strengths:

  • Data‑efficient learning
  • Physically consistent predictions
  • Applicable to forward and inverse problems

Limitations:

  • Training instability for stiff or multi‑scale PDEs
  • Sensitivity to loss‑term weighting
  • Typically instance‑specific (retraining required for new conditions)

PINNs are now widely used in fluid mechanics, solid mechanics, porous media, and inverse parameter identification.
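
The loss construction can be sketched as follows for the toy problem u′(x) + u(x) = 0 with u(0) = 1 (a hypothetical example). Real PINNs compute both the spatial derivative and the parameter gradients via automatic differentiation; to keep this sketch dependency‑free, the spatial derivative of the small ansatz is written analytically and the parameter gradient is approximated by finite differences.

```python
import numpy as np

# Minimal PINN-style sketch (illustration only, not a production PINN):
# fit u(x) = a*tanh(w*x + b) + c to the ODE u'(x) + u(x) = 0, u(0) = 1,
# by penalizing the equation residual and the boundary condition in the loss.
x = np.linspace(0.0, 1.0, 25)   # collocation points

def loss(p):
    a, w, b, c = p
    t = np.tanh(w * x + b)
    u = a * t + c
    du = a * w * (1.0 - t**2)              # analytic du/dx of the ansatz
    pde = np.mean((du + u) ** 2)           # physics residual: u' + u = 0
    bc = (a * np.tanh(b) + c - 1.0) ** 2   # boundary condition: u(0) = 1
    return pde + bc

def fd_grad(p, eps=1e-6):
    """Central finite-difference gradient of the loss (stand-in for autodiff)."""
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p)
        d[i] = eps
        g[i] = (loss(p + d) - loss(p - d)) / (2 * eps)
    return g

p = np.array([0.5, -1.0, 0.0, 0.5])   # initial parameters [a, w, b, c]
loss0 = loss(p)
for _ in range(500):                  # plain gradient descent
    p -= 0.05 * fd_grad(p)

print(f"loss: {loss0:.4f} -> {loss(p):.6f}")
```

Note how no solution data appear in the loss at all: the network is trained against the equation residual and the boundary condition alone, which is what makes PINNs data‑efficient and applicable to inverse problems (where unknown equation parameters are simply added to the trainable set).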

2.3 Physics‑Encoded Neural Networks (PeNNs)

Physics‑encoded neural networks go beyond loss‑based constraints and embed physics directly into the network architecture. Known physical relationships are “hard‑wired” into the model structure, often via custom layers or expression‑tree‑derived networks.

In WO2025051392A1, for instance, we provided a concrete example: a hybrid architecture combining

  • a physics‑based neural network derived from an analytical expression tree, and
  • a data‑driven neural network that learns residual discrepancies.

Strengths:

  • Guaranteed physical structure by design
  • Improved robustness and interpretability
  • Reduced risk of unphysical solutions

Limitations:

  • More complex architectures
  • Requires explicit formulation of known physics
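
A simple instance of architectural encoding (a hypothetical sketch, independent of the sources above) is the hard constraint of a boundary condition: the trial solution u(x) = u₀ + x·N(x) satisfies u(0) = u₀ exactly for any network N, because the constraint is built into the model structure rather than penalized in a loss.

```python
import numpy as np

# Hypothetical PeNN-style hard constraint: u(x) = u0 + x * N(x)
# satisfies u(0) = u0 by construction, whatever the network N computes.
# N here is an (untrained) random-feature net, for illustration only.
rng = np.random.default_rng(1)
u0 = 1.0

W = rng.normal(size=16)   # hidden-layer weights of N
b = rng.normal(size=16)   # hidden-layer biases
v = rng.normal(size=16)   # output weights

def u(x):
    h = np.tanh(np.outer(x, W) + b)   # (len(x), 16) tanh features
    return u0 + x * (h @ v)           # structural constraint: u(0) = u0

x = np.array([0.0, 0.5, 1.0])
print(u(x))   # first entry is exactly u0 = 1.0, regardless of W, b, v
```

This is the sense in which PeNNs guarantee physical structure by design: the constraint holds before, during, and after training, with no loss‑weighting tuning involved.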

2.4 Neural Operators

Neural operators (e.g., DeepONet, Graph Neural Operator, Fourier Neural Operator) learn mappings between function spaces rather than pointwise input-output relations. They approximate solution operators of PDEs and are often resolution‑invariant.


Strengths:

  • Strong generalization across discretizations
  • Efficient for real‑time inference and digital twins

Limitations:

  • Data‑hungry
  • Reduced interpretability compared to equation‑based models
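
The function‑space structure can be made concrete with a minimal, untrained DeepONet‑style forward pass (a hypothetical sketch; GNO and FNO use graph kernels and Fourier layers instead, but share the operator‑learning viewpoint). A branch net encodes the input function sampled at m fixed sensors, a trunk net encodes an arbitrary query coordinate y, and the output is their inner product, so the solution can be evaluated at any query resolution.

```python
import numpy as np

# Hypothetical DeepONet-style forward pass: G(u)(y) = sum_k b_k(u) * t_k(y),
# where b encodes the input function u (sampled at m sensors) and t encodes
# the query coordinate y.  Weights are random; only the structure matters here.
rng = np.random.default_rng(2)
m, p = 20, 8                     # number of sensors, latent dimension

Wb = rng.normal(size=(m, p))     # branch weights (one linear + tanh layer)
Wt = rng.normal(size=p)          # trunk weights
bt = rng.normal(size=p)          # trunk biases

sensors = np.linspace(0.0, 1.0, m)

def deeponet(u_vals, y):
    """Evaluate the operator output G(u)(y) at query points y."""
    branch = np.tanh(u_vals @ Wb)            # (p,) encoding of the function
    trunk = np.tanh(np.outer(y, Wt) + bt)    # (len(y), p) query features
    return trunk @ branch                    # (len(y),) operator output

u_vals = np.sin(2 * np.pi * sensors)   # input function sampled at the sensors
y = np.linspace(0.0, 1.0, 101)         # query grid, independent of the sensors
out = deeponet(u_vals, y)
print(out.shape)
```

Because the query grid is decoupled from the sensor grid, the same trained operator can be evaluated at any output resolution, which is the sense in which neural operators generalize across discretizations.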

3. Hybrid and Gray‑Box Modeling

Hybrid PIML approaches combine physics‑based and data‑driven components in parallel or residual configurations. A physical model captures dominant behavior, while a neural network learns unmodeled effects or corrections.

Such gray‑box models are particularly attractive for industrial systems where physics is known but incomplete.
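
The residual configuration can be sketched in a few lines (a hypothetical example; the least‑squares fit below stands in for a neural network): the physics model explains the dominant trend, and the data‑driven component is fit only to what the physics leaves unexplained.

```python
import numpy as np

# Hypothetical gray-box / residual configuration: a known-but-incomplete
# physics model captures the dominant behavior; a data-driven correction
# is fit to the residual the physics model cannot explain.
x = np.linspace(0.0, 1.0, 200)
y_true = 2.0 * x + 0.3 * np.sin(5.0 * x)   # "measurements" (noise-free here)

def physics(x):
    """Known first-principles model: captures the linear trend only."""
    return 2.0 * x

residual = y_true - physics(x)

# Data-driven component: least-squares fit of the residual in a small
# polynomial feature basis (standing in for a neural network).
A = np.vander(x, 6)                                    # features x^5 ... 1
coeffs, *_ = np.linalg.lstsq(A, residual, rcond=None)
correction = A @ coeffs

err_physics = np.max(np.abs(physics(x) - y_true))
err_hybrid = np.max(np.abs(physics(x) + correction - y_true))
print(f"physics-only max error: {err_physics:.3f}")
print(f"hybrid max error:       {err_hybrid:.3e}")
```

Because the data‑driven part only models a small correction, it needs far less data than a model learning the full response from scratch, which is exactly why the gray‑box configuration suits industrial systems with partially known physics.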

4. Applications

Across the reviewed documents, PIML has been applied to:

  • Fluid and solid mechanics: surrogate CFD/FEA models, turbulence closure
  • Industrial systems: virtual sensors, control systems, digital twins
  • Environmental and geosciences: porous media flow, climate modeling
  • Materials science: constitutive modeling and inverse design

5. Challenges and Outlook

Despite rapid progress, several challenges remain:

  • Stable and scalable training for multi‑physics, multi‑scale problems
  • Automated discovery and encoding of physical structure
  • Balancing data, physics, and computational cost

Future research is converging toward hybrid architectures that combine PINNs, PeNNs, and neural operators, supported by advances in differentiable programming and scientific ML tooling.

6. Conclusion

Physics‑informed machine learning represents a paradigm shift from purely data‑driven modeling toward knowledge‑guided intelligence. By embedding physics into neural networks, whether through data, loss functions, architectures, or operators, PIML enables models that are more accurate, data‑efficient, interpretable, and trustworthy. As tools and methodologies mature, PIML is poised to become a cornerstone of next‑generation scientific computing and industrial AI.