Участник:Strijov/Drafts

=Machine Learning for Theoretical Physics=
Physics-informed machine learning<br>
(seminars by Andriy Graboviy and Vadim Strijov)
==Goals==
The course consists of a series of group discussions devoted to various aspects of data modelling in continuous spaces. It aims to reduce the gap between the models of theoretical physics and noisy measurements performed under complex experimental circumstances. To show that a selected neural network is an adequate parametrisation of the modelled phenomenon, we use a geometric axiomatic approach. We discuss the role of manifolds, tensors and differential forms in neural network-based model selection.

The course is based on the book Geometric Deep Learning (April 2021) by Michael Bronstein et al. and the paper Physics-informed machine learning // Nature, May 2021, by George Em Karniadakis et al.
==Structure of the talk==
Each talk is based on a two-page essay ([template]).
# Field and goals of the method or model
# An overview of the method
# Notable authors and references
# Rigorous description, the theoretical part
# Algorithm and a link to the code
# Application with plots
==Grading==
Each student presents two talks. Each talk lasts 25 minutes and concludes with a five-minute written test.

A seminar presentation earns 1 point, a formatted seminar text earns 1 point, a test earns 1 point, and a reasonable test response earns 0.1 point. A bonus point is given for an outstanding talk.
==Test==
Todo: how to make the test creative rather than automated? The test format will appear here.
==Themes==
# Spherical harmonics for mechanical motion modelling
# Tensor representations for brain-computer interfaces
# Multi-view learning, kernels and metric spaces for BCI and brain imaging
# Continuous-time representation and Legendre Memory Units for BCI
# Riemannian geometry on shapes and diffeomorphisms for fMRI
# The affine connection setting for transformation groups for fMRI
# Strain, rotation and stress tensor modelling with examples
# Differential forms and fibre bundles with examples
# Modelling gravity with machine learning approaches
# Geometric manifolds, the Levi-Civita connection and curvature tensors
# Flows and topological spaces
# Applications of normalizing flow models (stress on spaces, not statistics)
# Alignment in higher dimensions with RNNs
# Navier-Stokes equations and viscous flow
# Newtonian and non-Newtonian fluids in pipe flows using neural networks [https://doi.org/10.2202/1556-3758.1079], [https://doi.org/10.1103/PhysRevE.102.043309]
# Applications of geometric algebra and the exterior product
# High-order splines
# Forward and backward Fourier transforms and iPhone lidar imaging analysis
# Fourier, cosine and Laplace transforms in two, three, four and higher dimensions
# Spectral analysis on meshes
# Graph convolutions and continuous Laplace operators
==Schedule==
Thursdays at 12:30 at m1p.org/go_zoom
* September 2, 9, 16, 23, 30
* October 7, 14, 21, 28
* November 4, 11, 18, 25
* December 2, 9
 
{|class="wikitable"
|-
! Date
! Theme
! Speaker
! Links
|-
|September 2
|Course introduction and motivation
|Vadim Strijov
|[https://geometricdeeplearning.com/ GDL paper], [https://www.researchgate.net/publication/351814752_Physics-informed_machine_learning/link/60ae8f43a6fdcc647ede90f7/download Physics-informed]
|-
|9 || || ||
|-
|9 || || ||
|-
|16 || || ||
|-
|16 || || ||
|-
|23 || || ||
|-
|23 || || ||
|-
|30 || || ||
|-
|30 || || ||
|-
|October 7 || || ||
|-
|7 || || ||
|-
|14 || || ||
|-
|14 || || ||
|-
|21 || || ||
|-
|21 || || ||
|-
|28 || || ||
|-
|28 || || ||
|-
|November 4 || || ||
|-
|4 || || ||
|-
|11 || || ||
|-
|11 || || ||
|-
|18 || || ||
|-
|18 || || ||
|-
|25 || || ||
|-
|25 || || ||
|-
|December 2 || || ||
|-
|2 || || ||
|-
|9
|Final discussion and grading
|Andriy Graboviy
|
|}
* Geometric deep learning
* Functional data analysis
* Applied mathematics for machine learning
==General principles==
1. The experiment and measurements define axioms i
==Syllabus and goals==

==Theme 1: ==

===Message===

===Basics===

===Application===

===Code===
[https://papers.nips.cc/paper/2018/file/69386f6bb1dfed68692a24c8686939b9-Paper.pdf Neural Ordinary Differential Equations (NeurIPS 2018 paper)]
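The linked paper treats a residual network's depth as continuous time and integrates a learned vector field with an ODE solver. Below is a minimal sketch of that idea, assuming nothing beyond numpy: a fixed-step Euler integrator applied to a toy linear "learned" field. The paper itself uses adaptive solvers and the adjoint method for gradients; this is an illustration, not the authors' code.

<source lang="python">
import numpy as np

def euler_odeint(f, z0, t0=0.0, t1=1.0, steps=100):
    """Integrate dz/dt = f(z, t) from t0 to t1 with fixed-step Euler."""
    z, t = np.asarray(z0, dtype=float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        z = z + h * f(z, t)
        t += h
    return z

# Toy "learned" vector field: a fixed linear layer generating a rotation.
W = np.array([[0.0, -1.0],
              [1.0,  0.0]])
f = lambda z, t: W @ z

print(euler_odeint(f, z0=[1.0, 0.0]))  # close to [cos 1, sin 1]
</source>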
 
==Theme 1: Manifolds ==

===Code===
Surface differential geometry:
[https://www.coursera.org/lecture/image-processing/3-surface-differential-geometry-duration-11-43-vtuJ1 Coursera code video] for Image and Video Processing
==Theme 1: ODE and flows==
*[https://papers.nips.cc/paper/2018/hash/69386f6bb1dfed68692a24c8686939b9-Abstract.html Neural Ordinary Differential Equations] (source paper and code)
*[https://en.wikipedia.org/wiki/Flow-based_generative_model W: Flow-based generative model]
*[https://deepgenerativemodels.github.io/notes/flow/ Flows at deepgenerativemodels.github.io]
*[https://habr.com/ru/company/ods/blog/442002/ An introduction to Neural ODEs on Habr (in Russian)]
 
Goes to BME:
*[https://arxiv.org/pdf/1505.05770.pdf Variational Inference with Normalizing Flows (source paper)]
*[https://lilianweng.github.io/lil-log/2018/10/13/flow-based-deep-generative-models.html Flow-based deep generative models] (a change-of-variables sketch follows this list)
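Behind both links is the change-of-variables formula: for an invertible map z = f(x) and a simple base density p_z, log p_x(x) = log p_z(f(x)) + log |det J_f(x)|. A minimal sketch, assuming only numpy, with a hand-picked affine map in place of a learned planar flow (the weights A, b below are made up for illustration):

<source lang="python">
import numpy as np

# Affine flow z = f(x) = A x + b with a standard normal base density:
# log p_x(x) = log p_z(A x + b) + log |det A|
A = np.array([[2.0, 0.3],
              [0.0, 0.5]])
b = np.array([1.0, -1.0])

def log_standard_normal(z):
    return -0.5 * (z @ z) - z.size / 2 * np.log(2 * np.pi)

def flow_log_density(x):
    z = A @ x + b
    _, logdet = np.linalg.slogdet(A)   # log |det A|, the Jacobian term
    return log_standard_normal(z) + logdet

print(flow_log_density(np.array([0.2, -0.1])))
</source>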
 
(after RBF)

==Theme 1: PDE==

==Theme 1: Navier-Stokes equations and viscous flow==
== Fourier for fun and practice 1D==
[https://morioh.com/p/18b3158eab36?f=5c21fb01c16e2556b555ab32&fbclid=IwAR0FBF6IfmEaaedMEDzxiSclxGgNVweHgDQZympUZ-4doeKGNwQwUvp9upo Fourier Code]
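Independently of the linked tutorial, a self-contained forward/inverse round trip with numpy's FFT: sample a 5 Hz sine at 1 kHz, find the spectral peak, and check that the inverse transform recovers the signal.

<source lang="python">
import numpy as np

fs = 1000                       # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 5 * t)   # 5 Hz sine

X = np.fft.rfft(x)                          # forward transform
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
print(freqs[np.argmax(np.abs(X))])          # -> 5.0, the dominant frequency

x_back = np.fft.irfft(X, n=len(x))          # inverse transform
print(np.allclose(x, x_back))               # -> True, exact round trip
</source>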
 
== Fourier for fun and practice nD==
See (a 2D FFT sketch follows this list):
* Fourier analysis on Manifolds 5G page 49
* Spectral analysis on meshes
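For the flat n-dimensional case, numpy's fftn generalizes the 1D transform, and a plane wave shows up as a spectral peak at its wave vector. A minimal 2D sketch:

<source lang="python">
import numpy as np

# 2D FFT of a plane wave: the spectrum peaks at the wave vector (3, 7).
n = 64
y, x = np.mgrid[0:n, 0:n]
img = np.cos(2 * np.pi * (3 * y + 7 * x) / n)

F = np.fft.fftn(img)
ky, kx = np.unravel_index(np.argmax(np.abs(F)), F.shape)
print(ky, kx)                                  # -> 3 7 (mirror peak at n-3, n-7)
print(np.allclose(img, np.fft.ifftn(F).real))  # -> True, exact round trip
</source>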
 
== Geometric Algebra ==
Exterior product and quaternions.
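Quaternions arise as the even subalgebra of the geometric algebra of 3-space, and a unit quaternion rotates a vector by conjugation, v ↦ q v q*. A minimal sketch with a hand-coded Hamilton product, assuming only numpy:

<source lang="python">
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def rotate(v, axis, angle):
    """Rotate 3-vector v about a unit axis by angle, via q v q*."""
    axis = np.asarray(axis, dtype=float) / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    q_conj = q * np.array([1, -1, -1, -1])
    return qmul(qmul(q, np.concatenate([[0.0], v])), q_conj)[1:]

print(rotate(np.array([1.0, 0.0, 0.0]), [0, 0, 1], np.pi / 2))  # ~ [0, 1, 0]
</source>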
 
==Theme 1: High order splines==

==Theme 1: Topological data analysis==

==Theme 1: Homology versus homotopy==
[https://en.wikipedia.org/wiki/Homology_(mathematics) W: Homology]
 
=Fundamental theorems=
[https://en.wikipedia.org/wiki/Inverse_function_theorem W: Inverse function theorem and Jacobian]
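The inverse function theorem says a smooth map is locally invertible wherever its Jacobian is nonsingular. A quick numerical illustration, assuming only numpy: a finite-difference Jacobian of the polar-to-Cartesian map, whose determinant equals r.

<source lang="python">
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (f(x + dx) - fx) / eps
    return J

# Polar-to-Cartesian map; its Jacobian determinant is r, so the map
# is locally invertible wherever r != 0.
f = lambda p: np.array([p[0] * np.cos(p[1]), p[0] * np.sin(p[1])])
J = jacobian(f, [2.0, np.pi / 3])
print(np.linalg.det(J))   # ~ 2.0 = r
</source>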
===BCI, Matrix and tensor approximation===
# Korenev, G.V. Tensor Calculus (Тензорное исчисление), 2000, 240 pp., [https://lib.mipt.ru/book/7235/Korenev-GV-Tenzornoe-ischislenie.djvu lib.mipt.ru].
# Roger Penrose, "Applications of negative dimensional tensors," in Combinatorial Mathematics and its Applications, Academic Press (1971). See Vladimir Turaev, Quantum Invariants of Knots and 3-Manifolds (1994), De Gruyter, p. 71 for a brief commentary, [https://www.mscs.dal.ca/~selinger/papers/graphical-bib/public/Penrose-applications-of-negative-dimensional-tensors.pdf PDF].
# Tai-Danae Bradley, At the Interface of Algebra and Statistics, 2020, [https://arxiv.org/abs/2004.05631 ArXiv].
# Oseledets, I.V. Tensor-Train Decomposition // SIAM Journal on Scientific Computing, 2011, 33(5): 2295–2317, [https://doi.org/10.1137/090752286 DOI], [https://www.researchgate.net/publication/220412263_Tensor-Train_Decomposition RG], [https://www-labs.iro.umontreal.ca/~grabus/courses/ift6760_files/lecture-11.pdf lecture], [https://github.com/oseledets/TT-Toolbox GitHub], [https://www.ifi.uzh.ch/dam/jcr:846e4588-673e-4f55-b531-544a2e1f602e/TA_Tutorial_Part2.pdf Tutorial].
# Wikipedia: [https://en.wikipedia.org/wiki/Singular_value_decomposition#Matrix_approximation SVD], [https://en.wikipedia.org/wiki/Multilinear_subspace_learning Multilinear subspace learning], [https://en.wikipedia.org/wiki/Higher-order_singular_value_decomposition HOSVD] (a minimal low-rank SVD sketch follows this list).
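The matrix case behind these references is the Eckart–Young theorem: truncating the SVD gives the best rank-k approximation in the Frobenius norm, which HOSVD and tensor-train decompositions generalize to tensors. A minimal numpy sketch on synthetic data:

<source lang="python">
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 8)) @ rng.standard_normal((8, 40))  # exact rank 8

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 5
A_k = (U[:, :k] * s[:k]) @ Vt[:k]   # best rank-k approximation (Eckart-Young)

# The Frobenius error equals the norm of the discarded singular values.
print(np.linalg.norm(A - A_k, 'fro'), np.sqrt(np.sum(s[k:] ** 2)))
</source>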
 
===BCI, Feature selection===
# Motrenko, A.P.
===High order partial least squares===
Qibin Zhao et al. and A. Cichocki, Higher Order Partial Least Squares (HOPLS): A Generalized Multilinear Regression Method // IEEE Transactions on Pattern Analysis and Machine Intelligence, July 2013, vol. 35, pp. 1660-1673, [https://doi.org/10.1109/TPAMI.2012.254 DOI].
 
===Neural ODEs and Continuous normalizing flows ===
# Ricky T. Q. Chen et al., Neural Ordinary Differential Equations // NeurIPS, 2018, [https://arxiv.org/abs/1806.07366 ArXiv].
# Johann Brehmer and Kyle Cranmer, Flows for simultaneous manifold learning and density estimation // NeurIPS, 2020, [https://arxiv.org/pdf/2003.13913.pdf ArXiv].
 
===Continuous time representation===
# Alina Samokhina, Continuous time representation in signal decoding problems (Master's thesis, in Russian), 2021, [http://www.machinelearning.ru/wiki/images/6/62/Samokhina2021MSThesis.pdf PDF], [https://github.com/Alina-Samokhina/MasterThesis GitHub].
# Aaron R. Voelker, Ivana Kajić, Chris Eliasmith, Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks // NeurIPS, 2019, [https://openreview.net/forum?id=HyxlRHBlUB OpenReview], [https://papers.nips.cc/paper/2019/file/952285b9b7e7a1be5aa7849f32ffff05-Paper.pdf PDF].
# Functional data analysis: splines (a Legendre-basis sketch follows this list)
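Legendre Memory Units maintain, online, the projection of a sliding signal window onto Legendre polynomials. A minimal offline analogue, assuming only numpy (the signal and dimension d below are made up for illustration; the LMU recurrence itself is in the paper):

<source lang="python">
import numpy as np
from numpy.polynomial import legendre

# Represent a sampled signal by d Legendre coefficients on [-1, 1].
t = np.linspace(-1, 1, 200)
x = np.sin(3 * t) + 0.1 * np.cos(9 * t)

d = 8
coeffs = legendre.legfit(t, x, deg=d - 1)  # least-squares Legendre fit
x_hat = legendre.legval(t, coeffs)         # reconstruction from d numbers

print(np.max(np.abs(x - x_hat)))           # reconstruction error
</source>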
 
===Metric tensors and kernels===
# Lynn Houthuys and Johan A. K. Suykens, Tensor Learning in Multi-view Kernel PCA // ICANN 2018, pp. 205-215, [https://doi.org/10.1007/978-3-030-01421-6_21 DOI].
 
=== Reproducing kernel Hilbert space ===
# Mauricio A. Alvarez, Lorenzo Rosasco, Neil D. Lawrence, Kernels for Vector-Valued Functions: a Review, 2012, [https://arxiv.org/abs/1106.6251 ArXiv].
# Pedro Domingos, Every Model Learned by Gradient Descent Is Approximately a Kernel Machine, 2020, [https://arxiv.org/pdf/2012.00152.pdf ArXiv].
# Wikipedia: [https://en.wikipedia.org/wiki/Reproducing_kernel_Hilbert_space RKHS] (a kernel ridge regression sketch follows this list).
# Aronszajn, Nachman (1950). "Theory of Reproducing Kernels". Transactions of the American Mathematical Society, 68(3): 337–404, [https://doi.org/10.1090/S0002-9947-1950-0051437-7 DOI].
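By the representer theorem, the RKHS minimizer of a regularized empirical risk is a finite kernel expansion f(x) = Σ_i α_i k(x_i, x). A minimal kernel ridge regression sketch with a Gaussian kernel, assuming only numpy (data and hyperparameters are made up for illustration):

<source lang="python">
import numpy as np

def rbf_kernel(X, Y, gamma=10.0):
    """Gaussian kernel k(x, y) = exp(-gamma * (x - y)^2) for scalar inputs."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 1, 40))
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(40)

lam = 1e-3
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # ridge solution in the RKHS

X_test = np.linspace(0, 1, 5)
y_hat = rbf_kernel(X_test, X) @ alpha    # f(x) = sum_i alpha_i k(x, x_i)
print(np.round(y_hat, 2))
</source>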
 
===Graph neural networks===
# Gama, F. et al. Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks // IEEE Signal Processing Magazine, 2020, 37(6): 128-138, [https://doi.org/10.1109/MSP.2020.3016143 DOI].
# Zhou, J. et al. Graph neural networks: A review of methods and applications // AI Open, 2020, 1: 57-81, [https://doi.org/10.1016/j.aiopen.2021.01.001 DOI], [https://arxiv.org/pdf/1812.08434.pdf ArXiv].
# Zonghan, W. et al. A Comprehensive Survey on Graph Neural Networks // IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(1): 4-24, [https://doi.org/10.1109/TNNLS.2020.2978386 DOI], [https://arxiv.org/pdf/1901.00596.pdf ArXiv].
# Zhang, S. et al. Graph convolutional networks: a comprehensive review // Computational Social Networks, 2019, 6(11), [https://doi.org/10.1186/s40649-019-0069-y DOI].
# Xie, Y. et al. Self-Supervised Learning of Graph Neural Networks: A Unified Review // [https://arxiv.org/pdf/2102.10757.pdf ArXiv].
# Wikipedia: [https://en.wikipedia.org/wiki/Laplacian_matrix Laplacian matrix], [https://en.wikipedia.org/wiki/Discrete_Poisson_equation Discrete] [https://en.wikipedia.org/wiki/Poisson%27s_equation Poisson's equation], [https://en.wikipedia.org/wiki/Graph_Fourier_transform Graph FT] (a minimal graph-convolution sketch follows this list)
# [https://github.com/thunlp/GNNPapers GNN papers collection]
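A single graph-convolution layer of the kind these surveys review: node features are smoothed by the symmetrically normalized adjacency with self-loops, then mixed by a weight matrix. A minimal numpy sketch (the graph and weights below are made up; this is no specific paper's code):

<source lang="python">
import numpy as np

# One layer: H' = relu(D^{-1/2} (A + I) D^{-1/2} H W).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # adjacency of a 4-node graph
A_hat = A + np.eye(4)                        # add self-loops
d = A_hat.sum(axis=1)
S = A_hat / np.sqrt(np.outer(d, d))          # D^{-1/2} A_hat D^{-1/2}

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))              # node features
W = rng.standard_normal((3, 2))              # layer weights

H_next = np.maximum(S @ H @ W, 0.0)          # relu activation
print(H_next)
</source>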
 
===Higher order Fourier transform===
# Zongyi Li et al., Fourier Neural Operator for Parametric Partial Differential Equations // ICLR, 2021, [https://arxiv.org/abs/2010.08895 ArXiv].

===Spherical Regression===
# Shuai Liao, Efstratios Gavves, Cees G. M. Snoek, Spherical Regression: Learning Viewpoints, Surface Normals and 3D Rotations on N-Spheres // CVPR, 2019, pp. 9759-9767, [https://openaccess.thecvf.com/content_CVPR_2019/html/Liao_Spherical_Regression_Learning_Viewpoints_Surface_Normals_and_3D_Rotations_on_CVPR_2019_paper.html CVF].
 
===Category theory===
# Tai-Danae Bradley, What is Applied Category Theory?, 2018, [https://arxiv.org/pdf/1809.05923.pdf ArXiv], [https://www.math3ma.com/ math3ma blog].
 
