Summary
Implement LSM, which encodes input functions into a latent space, constructs an orthogonal spectral basis there, and composes mappings using multiple basis operators.
Reference
- Wu et al., "Solving High-Dimensional PDEs with Latent Spectral Models," ICML 2023.
Description
LSM uses cross-attention to encode input functions into a latent space, then constructs orthogonal basis functions in that latent space (inspired by classical spectral methods). The operator mapping is decomposed into multiple basis operators, enabling efficient learning for high-dimensional PDEs.
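To make the requested pipeline concrete, here is a minimal NumPy sketch of the three stages described above: cross-attention encoding into a latent space, an orthogonal basis in that space, and a mapping decomposed across multiple basis operators. All sizes, the random weights, and the helper names (`softmax`, `W`) are illustrative placeholders, not the paper's actual architecture or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical sizes: N input points, d channels, M latent tokens, K basis operators
N, d, M, K = 64, 8, 16, 4

u = rng.standard_normal((N, d))        # input function sampled on N points
queries = rng.standard_normal((M, d))  # learned latent queries (random stand-ins here)

# 1) Cross-attention: latent tokens attend to the input points
attn = softmax(queries @ u.T / np.sqrt(d))  # (M, N) attention weights
z = attn @ u                                # latent encoding, shape (M, d)

# 2) Orthogonal spectral basis in the latent space (QR gives orthonormal columns)
B, _ = np.linalg.qr(rng.standard_normal((M, M)))

# 3) Decompose the operator: project onto the basis, apply K per-basis
#    linear operators, and recombine in the latent space
coeffs = B.T @ z                            # spectral coefficients, shape (M, d)
W = [rng.standard_normal((d, d)) for _ in range(K)]
out = sum(B @ coeffs @ Wk for Wk in W) / K  # composed operator output, shape (M, d)

# Sanity check: the basis is orthonormal
assert np.allclose(B.T @ B, np.eye(M), atol=1e-8)
```

In the actual model the attention projections, basis construction, and per-basis operators would be learned modules (and a decoder would map the latent output back to the physical grid); this sketch only fixes the data flow an implementation would need to follow.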