Thesis: The Simulation Paradigm Shift
Parenthetical note: this is a continuation of the "Jumping Tech Trees" thesis post.
I've been thinking about whether there's a better tech tree for engineering simulation. Most physical systems (classical mechanics, fluid dynamics, heat transfer, electromagnetism, acoustics, etc.) can be modeled by what are called partial differential equations (PDEs). As an example, for simple 1D heat transfer:
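In symbols (reconstructing the standard form from the description that follows, with x as distance into the object):

```latex
\frac{\partial T}{\partial t} = k \, \frac{\partial^2 T}{\partial x^2}
```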
Basically, how fast the temperature changes over time (left side; capital T = temperature, lowercase t = time) equals the thermal diffusivity coefficient (k) times how much the temperature gradient is changing across the object (the second derivative of T with respect to distance into the object). The intuition: a higher "k" (metals have higher "k"s than, say, a piece of wood) means faster heat transfer, and a larger temperature gradient also means faster heat transfer. Much of my work in grad school was doing what's called discretizing and numerically solving these partial differential equations.
So for a toy problem like the above, one way to "discretize" this is:
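One standard choice (a sketch, not necessarily the original post's scheme): forward difference in time, central difference in space, on a uniform grid, with superscript n as the time index and subscript i as the grid point:

```latex
\frac{T_i^{n+1} - T_i^n}{\Delta t} = k \, \frac{T_{i+1}^n - 2 T_i^n + T_{i-1}^n}{\Delta x^2}
```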
Put this in matrix form and voila, you have something a computer can solve (the sort of thing any first-year grad student can do in week 1). How we solve these PDEs is one tech tree, and it's seen meager advancement over the years: fancier grid-meshing techniques, algorithms that converge faster or more reliably, maybe a bit of parallelization. All of it represents a particular "tech tree" built on the computational tools we have at hand.
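A minimal sketch of that matrix form in Python/NumPy, assuming an explicit time-stepping scheme, fixed zero-temperature boundaries, and illustrative parameter values (all of which are my choices, not anything from the original post):

```python
import numpy as np

# Explicit finite-difference step for the 1D heat equation,
# written as a matrix-vector product. Values are illustrative.
n = 50              # number of interior grid points
k = 1e-4            # thermal diffusivity (m^2/s)
dx = 0.01           # grid spacing (m)
dt = 0.2            # time step (s)
r = k * dt / dx**2  # = 0.2 here; must be <= 0.5 for stability

# Tridiagonal update matrix A, so that T_new = A @ T_old
# (zero temperature held at both boundaries).
A = (1 - 2 * r) * np.eye(n) + r * (np.eye(n, k=1) + np.eye(n, k=-1))

T = np.zeros(n)
T[n // 2] = 100.0   # a hot spot in the middle

for _ in range(1000):
    T = A @ T       # one explicit time step

# After many steps the spike has diffused outward and some heat
# has leaked out through the fixed-temperature boundaries.
```

In practice you'd use a sparse tridiagonal representation (e.g. `scipy.sparse.diags`) rather than a dense matrix, but the dense version keeps the "it's just linear algebra" point visible.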
But the meta and perhaps more important point is that we model everything with PDEs not because nature is fundamentally continuous, but because we got really good at calculus and differential equations, so that became the hammer for every nail.
The question I've been mulling is whether there's a tech tree with a higher ceiling - perhaps ML-based?
My thinking is that:
GPUs and machine learning are now a thing
We've since moved compute to the cloud and parallelization is easier than ever
Mesh generation might be irrelevant with ML
"The curse of dimensionality" might be rendered irrelevant with ML
But... FEM/CFD/etc. aren't black boxes: stability proofs are a thing, error bounds are a thing (I hand-cranked von Neumann stability analysis back in grad school). We'd effectively be replacing mathematical rigor with statistical confidence in domains where mistakes mean buildings might collapse or planes crash.
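For flavor, here's the kind of guarantee at stake (a standard textbook result, using the explicit finite-difference heat-equation scheme as the example): substituting a Fourier mode $T_j^n = G^n e^{i j \theta}$ into the scheme gives the amplification factor

```latex
G(\theta) = 1 - 4r \sin^2\!\left(\tfrac{\theta}{2}\right),
\qquad r = \frac{k \,\Delta t}{\Delta x^2},
```

and requiring $|G(\theta)| \le 1$ for all modes yields the classic stability bound $r \le \tfrac{1}{2}$. That's a proof, for every possible input, that the solver won't blow up. Statistical confidence on a test set is not the same kind of object.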
My thesis / hopes for the industry:
Video games might be the right beachhead market where visual plausibility is more important than real world fidelity.
There will likely be a hybrid simulation market. Safety critical industries will continue with PDEs. Surrogate or alternative approaches might be used for initial design discovery / iteration supplemented with traditional methods for verification.
Instead of neural networks learning grid based solutions, we'll (hopefully) develop models that operate directly on CAD geometry.
Academic CFD/FEM research is likely a dead end. Innovation will likely come from folks operating at the intersection of traditional CFD and some other field (e.g. ML). Papers on "improved mesh refinement" or "higher-order elements" are rearranging deck chairs.
If you're building ML-based simulation tools that bypass traditional PDE workflows - I’d love to chat.
* Hot take: Nvidia should look into acquiring Ansys