Emerging mathematical fields in contemporary theoretical physics

All published abstracts (PDF, 53 kb)
Organizer:
  • Robert Seiringer (Institute of Science and Technology Austria)
    • Gerard Ben Arous
      New York University
    Talk title not available
    • Giulio Biroli
      CEA Saclay
    Recent Results and Challenges in Glassy and Out of Equilibrium Dynamics
    Abstract (PDF, 39 kb)

    In this talk I will present several challenges arising in the study of the dynamics of disordered systems in very high dimensions. The first topic I will focus on is the problem of activated stochastic dynamics, i.e. barrier crossing, in glassy systems. The second is the transition towards chaos in Lotka-Volterra equations modeling ecosystems formed by a very large number of species. I will present our recent results, based on the analysis of the dynamics by dynamical mean-field theory and of the critical points of the energy landscape by the Kac-Rice method. I will conclude by discussing perspectives from the points of view of both theoretical physics and mathematics.

    Slides: 1430_biroli_montroyal_thursday.pdf (2 mb)
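
    As a concrete illustration of the random Lotka-Volterra dynamics mentioned in the abstract, the following minimal sketch integrates the generalized Lotka-Volterra equations with i.i.d. Gaussian interactions, a standard setup in the high-dimensional ecosystems literature. All parameter values are illustrative assumptions, not taken from the talk.

        # Minimal sketch: random generalized Lotka-Volterra dynamics for S species.
        import numpy as np

        rng = np.random.default_rng(0)
        S = 200                                 # number of species
        mu, sigma = 4.0, 1.2                    # interaction statistics (illustrative)
        alpha = rng.normal(mu / S, sigma / np.sqrt(S), size=(S, S))
        np.fill_diagonal(alpha, 0.0)            # self-regulation enters only logistically

        x = rng.uniform(0.1, 1.0, size=S)       # initial abundances
        dt, steps = 1e-3, 100_000
        for _ in range(steps):                  # forward-Euler integration
            growth = 1.0 - x - alpha @ x        # logistic growth plus interactions
            x = np.clip(x + dt * x * growth, 0.0, None)

        print("fraction of surviving species:", np.mean(x > 1e-6))

    Increasing sigma past a threshold is where such systems typically lose their unique stable equilibrium, which is the regime of interest in the talk.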
    • Andrea Liu
      University of Pennsylvania
    Tuning functional networks
    Abstract (PDF, 38 kb)
    Nature is rife with networks that are functionally optimized to propagate inputs in order to perform specific tasks. Whether via genetic evolution or dynamic adaptation, many networks create functionality by locally tuning interactions between nodes. Here we explore this behavior in two contexts: strain propagation in mechanical networks and pressure redistribution in flow networks. By adding and removing links, we are able to optimize both types of networks to perform specific functions. We define a single function as a tuned response of a single “target” link when another, predetermined part of the network is activated. Using network structures generated via such optimization, we investigate how many simultaneous functions such networks can be programmed to fulfill. We find that both flow and mechanical networks display qualitatively similar phase transitions in the number of targets that can be tuned, along with the same robust finite size scaling behavior. We discuss how these properties can be understood in the context of a new class of constraint-satisfaction problems. We also show how persistent homology reveals topological changes that enable functionality.
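
    To make the flow-network tuning concrete, here is a small sketch that greedily removes links from a random resistor network so that the pressure drop across a chosen target pair approaches a desired value when a source/sink pair is activated. The graph, target choice, and greedy rule are illustrative assumptions, not the authors' actual protocol.

        # Sketch: tune a flow network's response at a target pair by link removal.
        import numpy as np

        rng = np.random.default_rng(1)
        N = 30
        A = np.triu((rng.random((N, N)) < 0.25).astype(float), 1)
        W = A + A.T                                   # symmetric unit conductances

        def target_drop(W, src=0, snk=1, tgt=(2, 3)):
            L = np.diag(W.sum(axis=1)) - W            # weighted graph Laplacian
            q = np.zeros(N); q[src], q[snk] = 1.0, -1.0
            p = np.linalg.pinv(L) @ q                 # node pressures for unit flow
            return p[tgt[0]] - p[tgt[1]]

        desired = 0.05
        for _ in range(40):                           # greedy link-removal loop
            best_err, best_edge = abs(target_drop(W) - desired), None
            for i, j in zip(*np.nonzero(np.triu(W, 1))):
                W[i, j] = W[j, i] = 0.0               # tentatively remove link (i, j)
                err = abs(target_drop(W) - desired)
                if err < best_err:
                    best_err, best_edge = err, (i, j)
                W[i, j] = W[j, i] = 1.0               # put it back
            if best_edge is None:
                break                                 # no removal improves the response
            i, j = best_edge
            W[i, j] = W[j, i] = 0.0

        print("tuned pressure drop across target:", target_drop(W))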
    • Florent Krzakala
      Université Pierre et Marie Curie
    Statistical physics of learning (revisited)
    Abstract (PDF, 38 kb)

    Heuristic tools from statistical physics, in particular the replica method, have been used in the past to locate the phase transitions and compute the optimal learning and generalisation errors in many machine learning tasks. This field is currently witnessing an impressive revival. In this talk, we provide a rigorous justification of these approaches for high-dimensional generalized linear models (used in signal processing, statistical inference, machine learning, communication theory and other fields) and discuss computational-to-statistical gaps: regimes where learning is information-theoretically possible but computationally hard.

    Slides: 1530_krzakala_mont_royal_thursday.pdf (8 mb)
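
    A minimal teacher-student experiment shows the kind of generalized-linear-model setting the replica method analyzes: a noiseless perceptron teacher generates labels, a logistic-regression student learns them, and the generalization error is tracked against the sample ratio alpha = m/n. This brute-force run is only an illustrative sanity check of the kind of curve the theory predicts exactly in the high-dimensional limit; all settings below are assumptions.

        # Sketch: teacher-student perceptron, generalization error vs sample ratio.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 200                                        # input dimension
        w_star = rng.normal(size=n)                    # teacher weight vector

        def generalization_error(alpha, lr=0.5, epochs=1000):
            m = int(alpha * n)
            X = rng.normal(size=(m, n)) / np.sqrt(n)
            y = np.sign(X @ w_star)                    # noiseless teacher labels
            w = np.zeros(n)
            for _ in range(epochs):                    # gradient descent, logistic loss
                margin = np.clip(y * (X @ w), -30, 30)
                w += lr * X.T @ (y / (1.0 + np.exp(margin))) / m
            # for Gaussian inputs the error is the teacher-student angle over pi
            cos = w @ w_star / (np.linalg.norm(w) * np.linalg.norm(w_star) + 1e-12)
            return np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi

        for alpha in (0.5, 1.0, 2.0, 4.0):
            print(f"alpha = {alpha}: error ~ {generalization_error(alpha):.3f}")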
    • Surya Ganguli
      Stanford University
    Statistical mechanics of deep learning
    Abstract (PDF, 41 kb)

    We will discuss several recent results in which ideas from statistical mechanics both yield conceptual insights into the operation of deep networks and deliver better machine learning performance. These include: (1) connecting the expressivity of deep networks to an order-to-chaos transition in dynamical mean-field theory, (2) developing dynamically critical initial conditions for network weights that accelerate subsequent learning, (3) creating generative models of complex data distributions by training neural networks to reverse the flow of time in entropy-producing diffusion processes, and (4) characterizing the non-convex geometry of deep network error landscapes using inspiration from the statistical mechanics of random landscapes.

    References:

    B. Poole, S. Lahiri, M. Raghu, J. Sohl-Dickstein, and S. Ganguli, Exponential expressivity in deep neural networks through transient chaos, Neural Information Processing Systems (NIPS) 2016.

    J. Pennington, S. Schoenholz, and S. Ganguli, Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice, Neural Information Processing Systems (NIPS) 2017.

    J. Sohl-Dickstein, N. Maheswaranathan, E. Weiss, S. Ganguli, Deep unsupervised learning using non-equilibrium thermodynamics, International Conference on Machine Learning (ICML) 2015.

    Y. Dauphin, R. Pascanu, C. Gulcehre, K. Cho, S. Ganguli, Y. Bengio, Identifying and attacking the saddle point problem in high-dimensional non-convex optimization, Neural Information Processing Systems (NIPS), 2014.

    M. Advani and S. Ganguli, Statistical mechanics of optimal convex inference in high dimensions, Physical Review X, 6, 031034, 2016.

    Slides: 1515_ganguli_mont_royal_fridayr.pdf (4 mb)
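
    The order-to-chaos transition of point (1) can be computed directly from the mean-field recursion of Poole et al. (2016): iterate the length map q -> sigma_w^2 E[tanh(sqrt(q) z)^2] + sigma_b^2 to its fixed point q*, then evaluate chi = sigma_w^2 E[tanh'(sqrt(q*) z)^2]; chi > 1 signals the chaotic phase. The (sigma_w, sigma_b) grid below is an illustrative choice.

        # Sketch: order-to-chaos transition for deep random tanh networks.
        import numpy as np

        z, w = np.polynomial.hermite_e.hermegauss(101)  # Gaussian quadrature nodes
        w = w / w.sum()                                 # normalize to E[.] under N(0,1)

        def chi(sigma_w, sigma_b=0.3, iters=200):
            q = 1.0
            for _ in range(iters):                      # fixed point of the length map
                q = sigma_w**2 * np.sum(w * np.tanh(np.sqrt(q) * z) ** 2) + sigma_b**2
            dphi2 = np.sum(w / np.cosh(np.sqrt(q) * z) ** 4)  # E[(tanh')^2] at q*
            return sigma_w**2 * dphi2

        for sw in (0.5, 1.0, 1.5, 2.0, 3.0):
            c = chi(sw)
            print(f"sigma_w = {sw}: chi = {c:.3f} ->", "chaotic" if c > 1 else "ordered")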
    • Lenka Zdeborova
      CEA Saclay
    Capacity of the binary perceptron: Recent insights on an old open problem
    Abstract (PDF, 39 kb)
    Consider the vertices of a high-dimensional hypercube. How many random cuts of the space into halves are needed in order to remove all of the vertices with probability one? This problem, known as the storage capacity of the binary perceptron, is closely related to learning in simple neural networks. An explicit formula for the answer was conjectured by Gardner & Derrida and by Krauth & Mézard in 1989, yet to date the problem remains open. This talk will review what makes the problem so hard, and will present recent results on variants of the problem for which rigorous answers have been obtained.
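
    At small sizes the question can be checked by brute force: enumerate all w in {-1, +1}^N and test whether any w satisfies g_a . w >= 0 for all M random Gaussian patterns. The conjectured capacity is alpha_c ~ 0.833 (Krauth-Mézard); the small-N run below is only indicative, and N, the instance count, and the alpha grid are illustrative choices.

        # Brute-force sketch: satisfiability of the binary perceptron vs alpha = M/N.
        import itertools
        import numpy as np

        rng = np.random.default_rng(3)
        N = 15
        W = np.array(list(itertools.product([-1, 1], repeat=N)))  # all 2^N weights

        def has_solution(alpha):
            M = int(alpha * N)
            G = rng.normal(size=(M, N))                # random Gaussian patterns
            return np.any(np.all(W @ G.T >= 0, axis=1))

        for alpha in (0.4, 0.8, 1.2, 1.6):
            hits = sum(has_solution(alpha) for _ in range(20))
            print(f"alpha = {alpha}: satisfiable fraction = {hits / 20:.2f}")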