SFI Seminar: How Size and Architecture Determine the Learning Capacity of Neural Networks

SFI News:

Wednesday, Oct. 23 • 12:15 p.m. • Collins Conference Room at the Santa Fe Institute, 1399 Hyde Park Road.

Guido Montufar, Max Planck Institute for Mathematics in the Sciences

Abstract: Neural networks are artificial systems of computational units that can learn stochastic dependencies and coordinated behavior. These computational models serve to generate data representations, store and classify patterns, or generalize inferences.

A network's structure and size shape the space of functions it can learn, its ability to cope with what it cannot learn, and the way it generalizes relations from training examples. Intuitively, for example, deep architectures with several computational levels can learn higher-level features or more abstract data representations than shallow architectures, and distributed networks, which store patterns and inference relations in a non-localized way, can generalize better than localized ones.

This talk takes a geometric perspective to address the learning capacity and distinctive features of neural networks of different sizes and architectures.
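
As a toy illustration of the deep-versus-shallow comparison in the abstract (not drawn from the talk itself): one crude way to probe the capacity of a ReLU network is to count how many linear pieces its input-output map has. The sketch below, with illustrative layer sizes and random weights, builds a shallow and a deep network and counts slope changes of each over a 1-D input grid; random weights give only a rough picture, and the rigorous comparison is the subject of the seminar.

import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def random_mlp(widths):
    # Random weight matrices and biases for consecutive layer widths.
    return [(rng.normal(size=(m, n)), rng.normal(size=m))
            for n, m in zip(widths[:-1], widths[1:])]

def forward(params, x):
    # Scalar input, scalar output; ReLU on every hidden layer.
    h = np.array([x])
    for W, b in params[:-1]:
        h = relu(W @ h + b)
    W, b = params[-1]
    return (W @ h + b).item()

def count_linear_pieces(params, lo=-3.0, hi=3.0, n=20001):
    # Approximate the number of linear pieces of the network's output
    # by counting slope changes on a fine grid over [lo, hi].
    xs = np.linspace(lo, hi, n)
    ys = np.array([forward(params, x) for x in xs])
    slopes = np.diff(ys) / np.diff(xs)
    return int(np.sum(~np.isclose(slopes[1:], slopes[:-1], atol=1e-6))) + 1

shallow = random_mlp([1, 12, 1])       # one hidden layer, 12 units
deep = random_mlp([1, 4, 4, 4, 1])     # three hidden layers, 4 units each

print("shallow linear pieces:", count_linear_pieces(shallow))
print("deep linear pieces:   ", count_linear_pieces(deep))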

Note: We are unable to accommodate members of the public for SFI’s limited lunch service; you’re welcome to bring your own.

SFI Host: Nihat Ay

The Santa Fe Institute is a nonprofit research center in Santa Fe. Its scientists collaborate across disciplines to understand the complex systems that underlie critical questions for science and humanity. The Institute is supported by philanthropic individuals and foundations, forward-thinking partner companies, and government science agencies.
