Участник:Strijov/Drafts


Version as of 19:13, 6 September 2021



Machine Learning for Theoretical Physics

Physics-informed machine learning
(seminars by Andriy Graboviy and Vadim Strijov)

Goals

The course consists of a series of group discussions devoted to various aspects of data modelling in continuous spaces. It reduces the gap between the models of theoretical physics and the noisy measurements performed under complex experimental circumstances. To show that the selected neural network is an adequate parametrisation of the modelled phenomenon, we use a geometric axiomatic approach. We discuss the role of manifolds, tensors and differential forms in neural network-based model selection.

The basics for the course are the book Geometric Deep Learning (April 2021) by Michael Bronstein et al. and the paper Physics-informed machine learning (Nature, May 2021) by George Em Karniadakis et al.

Structure of the talk

The talk is based on a two-page essay ([template]).

  1. Field and goals of a method or a model
  2. An overview of the method
  3. Notable authors and references
  4. Rigorous description, the theoretical part
  5. Algorithm and link to the code
  6. Application with plots

Grading

Each student gives two talks. Each talk lasts 25 minutes and concludes with a five-minute written test. A seminar presentation gives 1 point, a formatted seminar text gives 1 point, a test gives 1 point, and a reasonable test response gives 0.1 point. A bonus point is given for a great talk.

Test

To do: how do we make the test creative rather than automated? The test format will appear here.

Themes

  1. Spherical harmonics for mechanical motion modelling
  2. Tensor representations of the Brain computer interfaces
  3. Multi-view, kernels and metric spaces for the BCI and Brain Imaging
  4. Continuous-Time Representation and Legendre Memory Units for BCI
  5. Riemannian geometry on Shapes and diffeomorphisms for fMRI
  6. The affine connection setting for transformation groups for fMRI
  7. Strain, rotation and stress tensors modelling with examples
  8. Differential forms and fibre bundles with examples
  9. Modelling gravity with machine learning approaches
  10. Geometric manifolds, the Levi-Civita connection and curvature tensors
  11. Flows and topological spaces
  12. Application for Normalizing flow models (stress on spaces, not statistics)
  13. Alignment in higher dimensions with RNN
  14. Navier-Stokes equations and viscous flow
  15. Newtonian and Non-Newtonian Fluids in Pipe Flows Using Neural Networks [1], [2]
  16. Applications of Geometric Algebra and the exterior product
  17. High-order splines
  18. Forward and Backward Fourier transform and iPhone lidar imaging analysis
  19. Fourier, cosine and Laplace transform for 2,3,4D and higher dimensions
  20. Spectral analysis on meshes
  21. Graph convolution and continuous Laplace operators
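Several of these themes lend themselves to small numerical sketches. For theme 1, here is a minimal numpy check (an assumed illustration, not part of the course materials) that the spherical harmonic Y_1^0(theta, phi) = sqrt(3 / (4 pi)) * cos(theta) has unit L2 norm on the sphere:

```python
import numpy as np

# Verify the normalisation of Y_1^0 by a Riemann sum over the unit sphere.

def y10(theta):
    return np.sqrt(3.0 / (4.0 * np.pi)) * np.cos(theta)

n = 400
theta = np.linspace(0.0, np.pi, n, endpoint=False)      # polar angle
phi = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)  # azimuth
T, _ = np.meshgrid(theta, phi, indexing="ij")

# integrate |Y|^2 * sin(theta) dtheta dphi; for an orthonormal basis this is 1
integrand = y10(T) ** 2 * np.sin(T)
norm = integrand.sum() * (np.pi / n) * (2.0 * np.pi / n)
print(round(norm, 3))  # close to 1.0
```

The same quadrature extends to checking orthogonality of pairs of harmonics.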

Schedule

Thursdays at 12:30 at m1p.org/go_zoom

  • September 2 9 16 23 30
  • October 7 14 21 28
  • November 4 11 18 25 
  • December 2 9


Date | Theme | Speaker | Links
September 2 | Course introduction and motivation | Vadim Strijov | GDL paper, Physics-informed
September 9, 16, 23, 30; October 7, 14, 21, 28; November 4, 11, 18, 25; December 2 | two talk slots per session, speakers to be assigned | |
December 9 | Final discussion and grading | Andriy Graboviy |



  • Geometric deep learning
  • Functional data analysis
  • Applied mathematics for machine learning

General principles

1. The experiment and measurements define the axioms …


Syllabus and goals

Theme 1:

Message

Basics

Application

Code

https://papers.nips.cc/paper/2018/file/69386f6bb1dfed68692a24c8686939b9-Paper.pdf


Theme 1: Manifolds

Code

Surface differential geometry: Coursera code and video for Image and Video Processing

Theme 1: ODE and flows

Goes to BME

(after RBF)

Theme 1: PDE

Theme 1: Navier-Stokes equations and viscous flow

Fourier for fun and practice 1D

Fourier Code
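A minimal numpy sketch of the 1D case (an assumed illustration): the forward transform of a pure tone produces a single spectral peak, and the inverse transform recovers the signal.

```python
import numpy as np

# Forward and inverse discrete Fourier transform in 1D.
n, k = 128, 5
t = np.arange(n)
signal = np.sin(2.0 * np.pi * k * t / n)  # pure tone at frequency k

spectrum = np.fft.fft(signal)
peak = int(np.argmax(np.abs(spectrum[: n // 2])))  # dominant positive frequency
recovered = np.fft.ifft(spectrum).real             # inverse round trip

print(peak)  # 5
print(np.allclose(recovered, signal))  # True
```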



Fourier for fun and practice nD

See:

  • Fourier analysis on Manifolds 5G page 49
  • Spectral analysis on meshes
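For the nD case, a minimal sketch (an assumed illustration) with numpy's 2D FFT: the spectrum of a plane wave concentrates at its pair of frequencies.

```python
import numpy as np

# 2D Fourier transform of a plane wave exp(2*pi*i*(kx*x + ky*y)/n):
# the whole spectrum concentrates in the single bin (kx, ky).
n, kx, ky = 64, 3, 7
x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
field = np.exp(2j * np.pi * (kx * x + ky * y) / n)

spectrum = np.fft.fft2(field)
idx = tuple(int(i) for i in np.unravel_index(np.argmax(np.abs(spectrum)),
                                             spectrum.shape))
print(idx)  # (3, 7)
```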

Geometric Algebra

exterior product and quaternions
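A minimal sketch of the quaternion side (an assumed illustration): rotating a 3D vector with a unit quaternion q via v' = q v q*, with the vector embedded as a pure quaternion.

```python
import numpy as np

# Hamilton product of two quaternions stored as (w, x, y, z).
def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

# Rotate vector v by `angle` about `axis` using v' = q v q*.
def rotate(v, axis, angle):
    axis = axis / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, np.concatenate([[0.0], v])), q_conj)[1:]

# Rotating the x axis by 90 degrees about z yields the y axis.
v_rot = rotate(np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), np.pi / 2)
print(np.round(v_rot, 6))  # [0. 1. 0.]
```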


Theme 1: High order splines

Theme 1: Topological data analysis

Theme 1: Homology versus homotopy

W: Homology



Fundamental theorems

W: Inverse function theorem and Jacobian

BCI, Matrix and tensor approximation

  1. Korenev, G.V. Tensor Calculus (Тензорное исчисление), 2000, 240 pp., lib.mipt.ru.
  2.  Roger Penrose, "Applications of negative dimensional tensors," in Combinatorial Mathematics and its Applications, Academic Press (1971). See Vladimir Turaev, Quantum invariants of knots and 3-manifolds (1994), De Gruyter, p. 71 for a brief commentary PDF.
  3. Tai-Danae Bradley, At the Interface of Algebra and Statistics, 2020, ArXiv.
  4. Oseledets, I.V. Tensor-Train Decomposition // SIAM Journal on Scientific Computing, 2011, 33(5): 2295–2317, DOI, RG, lecture, GitHub, Tutorial.
  5. Wikipedia: SVD, Multilinear subspace learning, HOSVD.
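A minimal numpy sketch (an assumed illustration) of the building block behind HOSVD and tensor-train decompositions: the truncated SVD, which gives the best low-rank approximation of a matrix.

```python
import numpy as np

# Rank-r approximation of a matrix via truncated SVD.
rng = np.random.default_rng(0)
r = 3
A = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 40))  # exactly rank 3

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_r = (U[:, :r] * s[:r]) @ Vt[:r]  # keep the r leading singular triples

print(np.allclose(A, A_r))  # True: a rank-3 matrix is recovered exactly
```

HOSVD applies this factorisation mode by mode to the unfoldings of a tensor.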

BCI, Feature selection

  1. Motrenko A.P. Model selection for forecasting sets of multicorrelated time series (PhD thesis), 2019, PDF
  2. Isachenko R.V. Dimensionality reduction in signal decoding problems (PhD thesis), 2021, PDF

High order partial least squares

  1. Qibin Zhao, et al. and A. Cichocki, Higher Order Partial Least Squares (HOPLS): A Generalized Multilinear Regression Method // IEEE Transactions on Pattern Analysis and Machine Intelligence, July 2013, pp. 1660-1673, vol. 35, DOI, ArXiv.

Neural ODEs and Continuous normalizing flows

  1. Ricky T. Q. Chen et al., Neural Ordinary Differential Equations // NIPS, 2018, ArXiv
  2. Johann Brehmera and Kyle Cranmera, Flows for simultaneous manifold learning and density estimation // NIPS, 2020, ArXiv
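Conceptually, the forward pass of a neural ODE is numerical integration of dz/dt = f(z, t; theta). A minimal sketch (an assumed illustration, with a fixed linear map standing in for a trained network) using explicit Euler:

```python
import numpy as np

# Explicit Euler integration of dz/dt = f(z, t) from t0 to t1.
def odeint_euler(f, z0, t0, t1, steps=1000):
    z, t = np.asarray(z0, dtype=float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        z = z + h * f(z, t)
        t += h
    return z

# dz/dt = -z has the exact solution z(t) = z0 * exp(-t).
z1 = odeint_euler(lambda z, t: -z, [1.0], 0.0, 1.0)
print(round(float(z1[0]), 3))  # about exp(-1), i.e. 0.368
```

In the adjoint method of Chen et al., gradients are obtained by integrating a second ODE backwards in time rather than by backpropagating through the solver steps.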

Continuous-time representation

  1. Samokhina Alina, Continuous time representation in signal decoding problems (MSc thesis), 2021, PDF, GitHub
  2. Aaron R. Voelker, Ivana Kajić, Chris Eliasmith, Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks // NIPS, 2019, PDF, PDF.
  3. Functional data analysis: splines
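A minimal sketch (an assumed illustration) of the idea behind Legendre Memory Units: represent a sampled signal continuously as coefficients in a Legendre polynomial basis, here via numpy's least-squares Legendre fit.

```python
import numpy as np

# Project a sampled signal onto a Legendre polynomial basis.
t = np.linspace(-1.0, 1.0, 200)
signal = 0.5 - t + 2.0 * t**2  # a polynomial signal of degree 2

coef = np.polynomial.legendre.legfit(t, signal, deg=3)   # basis coefficients
recovered = np.polynomial.legendre.legval(t, coef)       # reconstruction

print(np.allclose(recovered, signal))  # True: degree-3 basis suffices
```

The LMU maintains such coefficients online over a sliding window, so the hidden state is a continuous-time summary of the recent past.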

Metric tensors and kernels

  1. Lynn Houthuys and Johan A. K. Suykens, Tensor Learning in Multi-view Kernel PCA // ICANN 2018, pp 205-215, DOI.
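A minimal numpy sketch (an assumed illustration, not the multi-view method of the paper) of plain kernel PCA with an RBF kernel: build the Gram matrix, double-centre it in feature space, and embed the points with the leading eigenpairs.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))

gamma = 0.1
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq)               # RBF Gram matrix, 30 x 30

n = K.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                        # centring in feature space

eigvals, eigvecs = np.linalg.eigh(Kc)  # eigenvalues in ascending order
# two leading components, scaled as PCA scores
Z = eigvecs[:, -2:][:, ::-1] * np.sqrt(np.maximum(eigvals[-2:][::-1], 0.0))
print(Z.shape)  # (30, 2)
```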

fMRI, Riemannian geometry on shapes

  1. Xavier Pennec, Stefan Sommer, and Tom Fletcher, Riemannian Geometric Statistics in Medical Image Analysis, 2019 book

Reproducing kernel Hilbert space

  1. Mauricio A. Alvarez, Lorenzo Rosasco, Neil D. Lawrence, Kernels for Vector-Valued Functions: a Review, 2012, ArXiv
  2. Pedro Domingos, Every Model Learned by Gradient Descent Is Approximately a Kernel Machine, 2020, ArXiv
  3. Wikipedia: RKHS
  4. Aronszajn, Nachman (1950). "Theory of Reproducing Kernels". Transactions of the American Mathematical Society. 68 (3): 337–404. DOI.
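A minimal sketch (an assumed illustration) of the textbook RKHS estimator, kernel ridge regression: by the representer theorem the minimiser is f(x) = sum_i alpha_i k(x_i, x) with alpha = (K + lambda I)^{-1} y.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(3.0 * X[:, 0])

# RBF kernel matrix between rows of A and rows of B.
def rbf(A, B, gamma=5.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

lam = 1e-3
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # representer weights

y_hat = K @ alpha  # fitted values at the training points
mse = float(np.mean((y_hat - y) ** 2))
print(mse)  # small: the penalised fit tracks the training data closely
```

New points are predicted with rbf(X_new, X) @ alpha, using only kernel evaluations.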

Convolutions and Graphs

  1. Gama, F. et al. Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks // IEEE Signal Processing Magazine, 2020, 37(6), 128-138, DOI.
  2. Zhou, J. et al. Graph neural networks: A review of methods and applications // AI Open, 2020, 1: 57-81, DOI, ArXiv.
  3. Zonghan, W. et al. A Comprehensive Survey on Graph Neural Networks // IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(1): 4-24, DOI, ArXiv.
  4. Zhang, S. et al. Graph convolutional networks: a comprehensive review // Computational Social Networks, 2019, 6(11), DOI.
  5. Xie, Y. et al. Self-Supervised Learning of Graph Neural Networks: A Unified Review // ArXiv.
  6. Wikipedia: Laplacian matrix, Discrete Poisson's equation, Graph FT
  7. GNN papers collection
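A minimal numpy sketch (an assumed illustration) of the object these surveys build on: the combinatorial graph Laplacian L = D - A, whose polynomials define graph filters and whose zero eigenvalues count connected components.

```python
import numpy as np

# Adjacency matrix of the path graph 0-1-2-3.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # combinatorial Laplacian; rows sum to zero

eigvals = np.linalg.eigvalsh(L)
n_components = int(np.sum(np.isclose(eigvals, 0.0)))
print(n_components)  # 1: the path graph is connected
```

The eigenvectors of L play the role of Fourier modes on the graph, which is the starting point of spectral graph convolutions.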

Higher order Fourier transform

  1. Zongyi Li et al., Fourier Neural Operator for Parametric Partial Differential Equations // ICLR, 2021, ArXiv

Spherical Regression

  1. Shuai Liao, Efstratios Gavves, Cees G. M. Snoek, Spherical Regression: Learning Viewpoints, Surface Normals and 3D Rotations on N-Spheres // CVPR, 2019, 9759-9767, ArXiv
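At its core, spherical regression constrains the regressed vector to the unit n-sphere, for example by an explicit normalisation layer. A minimal sketch (an assumed illustration):

```python
import numpy as np

# Project an arbitrary vector onto the unit n-sphere.
def project_to_sphere(v, eps=1e-12):
    v = np.asarray(v, dtype=float)
    return v / max(np.linalg.norm(v), eps)

u = project_to_sphere([3.0, 4.0])
print(u.tolist())  # [0.6, 0.8]
```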

Category theory

  1. Tai-Danae Bradley, What is Applied Category Theory?, 2018, ArXiv, demo.