1University of Toronto, 2University of British Columbia
*Denotes Equal Contribution
Abstract
We introduce a neural field construction that captures gradient discontinuities without baking their location into the network weights. By augmenting input coordinates with a smoothly clamped distance function in a lifting framework, we enable the encoding of gradient jumps at evolving interfaces.
This design supports discretization-agnostic simulation of parametrized shape families with heterogeneous materials and evolving creases, enabling new reduced-order capabilities such as shape morphing, interactive crease editing, and simulation of soft-rigid hybrid structures.
We further demonstrate that our method can be combined with previous lifting techniques to jointly capture both gradient and value discontinuities, supporting simultaneous cuts and creases within a unified model.
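The coordinate lifting described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the planar interface, the helper names, and the tanh-based smooth clamp are all assumptions chosen for clarity. The key idea is that the unsigned distance |d| has a kink at the interface d = 0, so a smooth network applied to the lifted coordinates can represent fields whose gradient jumps exactly there, while the smooth clamp keeps the feature local to the crease.

```python
import numpy as np

def sdf_plane(x):
    # Signed distance to the plane x0 = 0 (illustrative interface;
    # in practice the interface may evolve or be parametrized).
    return x[..., 0]

def lifted_coords(x, sdf=sdf_plane, width=0.2):
    """Augment coordinates with a smoothly clamped distance feature.

    |d| is C^0 but not C^1 across d = 0, so the lifted input carries a
    gradient discontinuity precisely at the interface. The tanh clamp is
    one possible smooth saturation: the feature behaves like |d| near the
    interface and flattens out at `width` far away from it.
    (Hypothetical sketch; names and parameters are not the paper's API.)
    """
    d = sdf(x)
    feature = width * np.tanh(np.abs(d) / width)
    return np.concatenate([x, feature[..., None]], axis=-1)
```

A downstream smooth MLP would then take the lifted coordinates as input; because the kink lives in the feature rather than in the weights, the interface location can be changed without retraining.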
@inproceedings{Liu2025DiscontGrad,
  title     = {Precise Gradient Discontinuities in Neural Fields for Subspace Physics},
  author    = {Mengfei Liu and Yue Chang and Zhecheng Wang and Peter Yichen Chen and Eitan Grinspun},
  booktitle = {ACM SIGGRAPH Asia 2025 Conference Papers},
  year      = {2025}
}
Acknowledgements
We would like to thank our lab system administrator, John Hancock, and our financial officer, Xuan Dam, for their invaluable administrative support in making this research possible. We acknowledge the support of the Natural Sciences and Engineering Research Council of Canada (NSERC) grant RGPIN-2021-03733.