Neural networks are powerful feature extractors, but which features do they extract from their data? And how does the structure of the training data shape the representations they learn? We investigate these questions by introducing several synthetic data models, each of which accounts for a salient feature of modern data sets: low intrinsic dimension of images [1], symmetries and...
We investigate the potential of tensor-network-based machine learning methods to scale to large image and text data sets. To this end, we study how the mutual information between a subregion and its complement scales with the subsystem size $L$, analogously to how it is done in quantum many-body physics. We find that for text, the mutual information scales as a power law $L^\nu$ with a close to...
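The scaling analysis above can be illustrated with a simplified proxy. The sketch below does not reproduce the abstract's subregion-vs-complement calculation; instead it estimates the empirical mutual information between two characters separated by a distance d (a hypothetical, much cruder stand-in), computed from pair frequencies via I(X;Y) = sum p(x,y) log[p(x,y) / (p(x)p(y))]. Fitting how this quantity decays with d is one elementary way to probe long-range correlations in text:

```python
import math
from collections import Counter

def mutual_information(text, d):
    """Empirical mutual information (in nats) between characters
    separated by distance d, estimated from pair frequencies.
    A crude proxy for how correlations in text decay with scale."""
    pairs = [(text[i], text[i + d]) for i in range(len(text) - d)]
    n = len(pairs)
    joint = Counter(pairs)                  # counts of (x, y) pairs
    px = Counter(a for a, _ in pairs)       # marginal counts of x
    py = Counter(b for _, b in pairs)       # marginal counts of y
    mi = 0.0
    for (a, b), c in joint.items():
        # p(x,y) * log[ p(x,y) / (p(x) p(y)) ], written with raw counts
        mi += (c / n) * math.log(c * n / (px[a] * py[b]))
    return mi

# Demo on a toy corpus; real analyses would use a large text data set.
text = "the quick brown fox jumps over the lazy dog " * 200
print([round(mutual_information(text, d), 3) for d in (1, 2, 4, 8)])
```

For a strictly alternating string like "abab..." the estimate approaches log 2, and for a constant string it is exactly zero, which gives a quick sanity check of the estimator.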
Since many concepts in theoretical physics are familiar to scientists in the form of equations, it is possible to identify such concepts in non-conventional applications of neural networks to physics. In this talk, we examine what is learned by artificial neural networks, especially Siamese networks, in various physical domains. These networks intrinsically learn physical concepts like...