Description
This talk presents results on scaling laws in hypergraph and multi-operator learning. Scaling considerations guide both the theoretical foundations and the practical design of modern learning systems. The first part shows how a large-data asymptotic analysis identifies connectivity-scaling regimes in which semi-supervised learning on hypergraphs is effective and stable. This perspective also yields a principled taxonomy of hypergraph learning algorithms, organized by their underlying regularization mechanisms and the continuum limits they induce. The second part illustrates how scaling insights inform the design of expressive multi-operator networks and give principled answers to the architecture search problem: bounds are derived on the network width, depth, and sparsity required to achieve a prescribed approximation accuracy.
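As a concrete anchor for the regularization mechanisms mentioned above, the sketch below implements one standard member of this family: Laplacian-regularized semi-supervised label propagation on a hypergraph, using the normalized hypergraph Laplacian of Zhou et al. (2006). The toy incidence matrix, labels, and the fidelity weight `mu` are illustrative choices, not the speaker's specific algorithm or data.

```python
import numpy as np

# Toy hypergraph: 6 nodes, 3 hyperedges (incidence matrix H, nodes x edges).
# Hypothetical example; the talk's regimes concern how such connectivity
# should scale with the number of nodes.
H = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [1, 1, 0],
    [0, 1, 1],
    [0, 0, 1],
    [0, 0, 1],
], dtype=float)

w = np.ones(H.shape[1])           # unit hyperedge weights
Dv = H @ w                        # node degrees
De = H.sum(axis=0)                # hyperedge degrees

# Normalized hypergraph Laplacian (Zhou et al., 2006):
#   L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
Dv_isqrt = np.diag(1.0 / np.sqrt(Dv))
S = Dv_isqrt @ H @ np.diag(w / De) @ H.T @ Dv_isqrt
L = np.eye(H.shape[0]) - S

# Semi-supervised setup: node 0 labeled +1, node 5 labeled -1, rest unknown.
y = np.array([1.0, 0, 0, 0, 0, -1.0])

# Laplacian-regularized least squares: minimize  mu * ||f - y||^2 + f^T L f,
# solved in closed form; mu trades data fidelity against hypergraph smoothness.
mu = 1.0
f = np.linalg.solve(L + mu * np.eye(len(y)), mu * y)
print(np.sign(f))  # propagated class labels for all nodes
```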
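For the second part, the following display shows the generic shape such width/depth/sparsity bounds take, using Yarotsky-style results for ReLU approximation of s-smooth functions on [0,1]^d as a stand-in; the talk's bounds for multi-operator networks will differ in their details and constants.

```latex
% Illustrative, not the speaker's precise statement: to approximate an
% s-smooth target on [0,1]^d to accuracy \epsilon, Yarotsky-type results
% show it suffices to take
\[
  \mathrm{depth} = O\!\left(\log\tfrac{1}{\epsilon}\right),
  \qquad
  \#\{\text{nonzero weights}\} = O\!\left(\epsilon^{-d/s}\log\tfrac{1}{\epsilon}\right).
\]
% Making the required depth and sparsity explicit in \epsilon is what turns
% approximation theory into guidance for the architecture search problem.
```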