User:Strijov/Drafts

=Fundamental theorems=
[https://en.wikipedia.org/wiki/Inverse_function_theorem W: Inverse function theorem and Jacobian]

=Mathematical methods of forecasting=
The lecture course and seminar introduce and apply methods of modern physics to problems of machine learning.

Minimum topics to discuss: the geometric deep learning approach.

Optimal topics to discuss: tensors, differential forms, Riemannian and differential geometry, metrics, differential operators in various spaces, embeddings, manifolds, bundles. We investigate scalar, vector, and tensor fields (as well as jets, fibers and sheaves, tensor bundles, sheaf bundles, etc.). The fields and spaces are one-dimensional, multidimensional, and of continuous dimension.
===BCI, Matrix and tensor approximation===
# Коренев Г.В. Тензорное исчисление (Tensor Calculus), 2000, 240 pp., [https://lib.mipt.ru/book/7235/Korenev-GV-Tenzornoe-ischislenie.djvu lib.mipt.ru].
# Roger Penrose, "Applications of negative dimensional tensors," in Combinatorial Mathematics and its Applications, Academic Press (1971). See Vladimir Turaev, Quantum Invariants of Knots and 3-Manifolds (1994), De Gruyter, p. 71 for a brief commentary, [https://www.mscs.dal.ca/~selinger/papers/graphical-bib/public/Penrose-applications-of-negative-dimensional-tensors.pdf PDF].
# Tai-Danae Bradley, At the Interface of Algebra and Statistics, 2020, [https://arxiv.org/abs/2004.05631 ArXiv].
# Oseledets, I.V. Tensor-Train Decomposition // SIAM Journal on Scientific Computing, 2011, 33(5): 2295–2317, [https://doi.org/10.1137/090752286 DOI], [https://www.researchgate.net/publication/220412263_Tensor-Train_Decomposition RG], [https://www-labs.iro.umontreal.ca/~grabus/courses/ift6760_files/lecture-11.pdf lecture], [https://github.com/oseledets/TT-Toolbox GitHub], [https://www.ifi.uzh.ch/dam/jcr:846e4588-673e-4f55-b531-544a2e1f602e/TA_Tutorial_Part2.pdf Tutorial].
# Wikipedia: [https://en.wikipedia.org/wiki/Singular_value_decomposition#Matrix_approximation SVD], [https://en.wikipedia.org/wiki/Multilinear_subspace_learning Multilinear subspace learning], [https://en.wikipedia.org/wiki/Higher-order_singular_value_decomposition HOSVD]; a minimal truncated-SVD sketch follows this list.
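To ground the approximation entries, here is a minimal NumPy sketch of low-rank matrix approximation by truncated SVD (the Eckart–Young construction from the SVD article above), the matrix case that HOSVD and tensor-train methods generalize; the rank and the random test matrix are illustrative assumptions.
<source lang="python">
import numpy as np

# Best rank-r approximation of A in the Frobenius norm (Eckart-Young):
# keep the r largest singular triples and drop the rest.
def truncated_svd(A, r):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

rng = np.random.default_rng(0)   # illustrative random test matrix
A = rng.standard_normal((8, 5))
A_2 = truncated_svd(A, r=2)
# The Frobenius error equals the root sum of squared discarded singular values.
print(np.linalg.norm(A - A_2))
</source>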
 
===BCI, Feature selection===
# Мотренко А.П. Выбор моделей прогнозирования мультикоррелирующих временных рядов (Model selection for forecasting multicorrelated time series; PhD thesis, in Russian), 2019, [https://sourceforge.net/p/mlalgorithms/code/HEAD/tree/PhDThesis/Motrenko/doc/Motrenko2018Thesis.pdf PDF].
# Исаченко Р.В. Снижение размерности пространства в задачах декодирования сигналов (Dimensionality reduction in signal decoding problems; PhD thesis, in Russian), 2021, [https://github.com/r-isachenko/PhDThesis/blob/master/doc/Isachenko2021PhDThesis.pdf PDF].
 
===Higher-order partial least squares===
# Qibin Zhao, A. Cichocki, et al., Higher Order Partial Least Squares (HOPLS): A Generalized Multilinear Regression Method // IEEE Transactions on Pattern Analysis and Machine Intelligence, July 2013, 35(7): 1660-1673, [https://doi.org/10.1109/TPAMI.2012.254 DOI], [https://arxiv.org/abs/1207.1230 ArXiv]. A sketch of the classical matrix-case PLS follows this list.
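This is not HOPLS itself; it is a minimal sketch of classical single-response PLS (PLS1), the matrix case that HOPLS lifts to tensors. The toy data and component count are illustrative assumptions.
<source lang="python">
import numpy as np

# PLS1: extract components that maximize covariance between X scores and y,
# deflating X and y after each component.
def pls1(X, y, n_components):
    X, y = X.copy(), y.copy()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)        # weight: direction of max covariance
        t = X @ w                     # X scores
        p = X.T @ t / (t @ t)         # X loadings
        q = (y @ t) / (t @ t)         # y loading
        X -= np.outer(t, p)           # deflate
        y -= q * t
        W.append(w); P.append(p); Q.append(q)
    return np.array(W).T, np.array(P).T, np.array(Q)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
y = X @ np.array([1., 0., 0., 2., 0., 0.]) + 0.1 * rng.standard_normal(50)
W, P, Q = pls1(X, y, n_components=2)
beta = W @ np.linalg.solve(P.T @ W, Q)   # regression vector: W (P^T W)^{-1} q
print(beta.round(2))
</source>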
 
===Neural ODEs and continuous normalizing flows===
# Ricky T. Q. Chen et al., Neural Ordinary Differential Equations // NeurIPS, 2018, [https://arxiv.org/abs/1806.07366 ArXiv], [https://papers.nips.cc/paper/2018/hash/69386f6bb1dfed68692a24c8686939b9-Abstract.html source paper and code]. A minimal forward-pass sketch follows this list.
# Johann Brehmer and Kyle Cranmer, Flows for simultaneous manifold learning and density estimation // NeurIPS, 2020, [https://arxiv.org/pdf/2003.13913.pdf ArXiv].
# [https://deepgenerativemodels.github.io/notes/flow/ Flows at deepgenerativemodels.github.io]
# [https://lilianweng.github.io/lil-log/2018/10/13/flow-based-deep-generative-models.html Flow-based deep generative models]
# [https://arxiv.org/pdf/1505.05770.pdf Variational Inference with Normalizing Flows (source paper, goes to BME)]
# [https://habr.com/ru/company/ods/blog/442002/ An introduction to Neural ODEs on Habr (in Russian)], [https://en.wikipedia.org/wiki/Flow-based_generative_model W: Flow-based generative model]
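As a complement to the readings, here is a minimal sketch of the neural-ODE forward pass: the hidden state evolves by integrating a learned vector field. A tiny randomly initialized MLP stands in for a trained network; the weights and sizes are illustrative assumptions.
<source lang="python">
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 2)) * 0.1   # "learned" parameters (illustrative)
W2 = rng.standard_normal((2, 16)) * 0.1

def f(t, h):
    # dh/dt = f(h): a small time-independent MLP vector field
    return W2 @ np.tanh(W1 @ h)

h0 = np.array([1.0, -0.5])                # input state h(0)
sol = solve_ivp(f, t_span=(0.0, 1.0), y0=h0, rtol=1e-6)
print(sol.y[:, -1])                       # h(1): the output of the ODE block
</source>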
 
===Continuous-time representation===
# Самохина Алина, Непрерывное представление времени в задачах декодирования сигналов (Continuous-time representation in signal decoding problems; Master's thesis, in Russian), 2021, [http://www.machinelearning.ru/wiki/images/6/62/Samokhina2021MSThesis.pdf PDF], [https://github.com/Alina-Samokhina/MasterThesis GitHub].
# Aaron R. Voelker, Ivana Kajić, Chris Eliasmith, Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks // NeurIPS, 2019, [https://openreview.net/forum?id=HyxlRHBlUB OpenReview], [https://papers.nips.cc/paper/2019/file/952285b9b7e7a1be5aa7849f32ffff05-Paper.pdf PDF].
# Functional data analysis: splines (see the smoothing-spline sketch after this list).
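A minimal sketch of the functional-data-analysis idea: represent a discretely sampled signal as a continuous-time curve by fitting a smoothing spline, which can then be evaluated at any time point. The signal, noise level, and smoothing parameter are illustrative assumptions.
<source lang="python">
import numpy as np
from scipy.interpolate import splrep, splev

t = np.linspace(0.0, 1.0, 50)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)  # noisy samples

tck = splrep(t, x, s=0.5)              # cubic smoothing spline; s = smoothness
t_fine = np.linspace(0.0, 1.0, 500)
x_hat = splev(t_fine, tck)             # evaluate the continuous curve anywhere
print(x_hat[:3])
</source>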
 
===Navier-Stokes equations and viscous flow===
# Neural PDE (a finite-difference baseline sketch follows this list).
# Newtonian and Non-Newtonian Fluids in Pipe Flows Using Neural Networks, [https://doi.org/10.2202/1556-3758.1079 DOI], [https://doi.org/10.1103/PhysRevE.102.043309 DOI].
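Neural-PDE papers often benchmark on simple viscous flows. As background, here is a minimal explicit finite-difference sketch of the 1-D viscous Burgers equation u_t + u u_x = ν u_xx; the grid size, viscosity, and time step are illustrative assumptions chosen to keep the scheme stable.
<source lang="python">
import numpy as np

nx, nu, dt = 200, 0.05, 1e-3
x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x)                          # initial condition, periodic domain

for _ in range(500):
    u_x = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)          # central u_x
    u_xx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2    # central u_xx
    u = u + dt * (-u * u_x + nu * u_xx)                        # explicit Euler

print(u[:4])
</source>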
 
===Metric tensors and kernels===
# Lynn Houthuys and Johan A. K. Suykens, Tensor Learning in Multi-view Kernel PCA // ICANN 2018, pp. 205-215, [https://doi.org/10.1007/978-3-030-01421-6_21 DOI]. A single-view kernel PCA sketch follows this list.
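This is not the multi-view tensor method of the paper above, just a minimal sketch of plain single-view kernel PCA, its building block, with an RBF kernel; the bandwidth and toy data are illustrative assumptions.
<source lang="python">
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 3))
K = rbf_kernel(X)
n = K.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                          # double-center the kernel matrix
vals, vecs = np.linalg.eigh(Kc)         # eigenvalues in ascending order
Z = vecs[:, -2:] * np.sqrt(vals[-2:])   # top-2 nonlinear principal components
print(Z.shape)
</source>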
 
===fMRI, Riemannian geometry on shapes===
# Xavier Pennec, Stefan Sommer, and Tom Fletcher, Riemannian Geometric Statistics in Medical Image Analysis, 2019, [https://www.elsevier.com/books/riemannian-geometric-statistics-in-medical-image-analysis/pennec/978-0-12-814725-2 book].
# Surface differential geometry, [https://www.coursera.org/lecture/image-processing/3-surface-differential-geometry-duration-11-43-vtuJ1 Coursera video] from the Image and Video Processing course.
 
===Spatial time series alignment===
# Titouan Vayer et al., Time Series Alignment with Global Invariances, 2020, [https://arxiv.org/abs/2002.03848 ArXiv].
# Marco Cuturi and Mathieu Blondel, Soft-DTW: a Differentiable Loss Function for Time-Series, [https://arxiv.org/pdf/1703.01541.pdf ArXiv]. A plain DTW sketch follows this list.
# Marcel Campen et al., Scale-Invariant Directional Alignment of Surface Parametrizations // Eurographics Symposium on Geometry Processing, 2016, 35(5), [https://doi.org/10.1111/cgf.12958 DOI].
# Helmut Pottmann et al., Geodesic Patterns // ACM Transactions on Graphics, 2010, 29(4), [https://doi.org/10.1145/1833349.1778780 DOI], [https://www.researchgate.net/publication/220184673_Geodesic_Patterns PDF].
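A minimal sketch of classic dynamic time warping (DTW) between two 1-D series, the hard-min recursion that soft-DTW relaxes into a differentiable soft-min; the toy series are illustrative.
<source lang="python">
import numpy as np

def dtw(x, y):
    # D[i, j] = cost of best monotone alignment of x[:i] with y[:j]
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

print(dtw(np.array([0., 1., 2.]), np.array([0., 0., 1., 2.])))
</source>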
 
=== Reproducing kernel Hilbert space ===
# Mauricio A. Alvarez, Lorenzo Rosasco, Neil D. Lawrence, Kernels for Vector-Valued Functions: a Review, 2012, [https://arxiv.org/abs/1106.6251 ArXiv].
# Pedro Domingos, Every Model Learned by Gradient Descent Is Approximately a Kernel Machine, 2020, [https://arxiv.org/pdf/2012.00152.pdf ArXiv].
# Wikipedia: [https://en.wikipedia.org/wiki/Reproducing_kernel_Hilbert_space RKHS]. A kernel ridge regression sketch follows this list.
# Aronszajn, Nachman. Theory of Reproducing Kernels // Transactions of the American Mathematical Society, 1950, 68(3): 337-404, [https://doi.org/10.1090/S0002-9947-1950-0051437-7 DOI].
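As a worked example of the representer theorem, here is a minimal kernel ridge regression sketch in an RBF RKHS: the fitted function is a kernel expansion over the training points, f(x) = Σ_i α_i k(x_i, x). The kernel bandwidth, regularization, and toy data are illustrative assumptions.
<source lang="python">
import numpy as np

def rbf(a, b, gamma=1.0):
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(40)

lam = 1e-2
# alpha solves (K + lam I) alpha = y; f(x) = k(x, X) @ alpha
alpha = np.linalg.solve(rbf(X, X) + lam * np.eye(len(X)), y)
X_new = np.array([[0.5]])
print(rbf(X_new, X) @ alpha)            # prediction at a new point
</source>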
 
===Convolutions and Graphs===
# Gama, F. et al. Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks // IEEE Signal Processing Magazine, 2020, 37(6): 128-138, [https://doi.org/10.1109/MSP.2020.3016143 DOI].
# Zhou, J. et al. Graph neural networks: A review of methods and applications // AI Open, 2020, 1: 57-81, [https://doi.org/10.1016/j.aiopen.2021.01.001 DOI], [https://arxiv.org/pdf/1812.08434.pdf ArXiv].
# Zonghan, W. et al. A Comprehensive Survey on Graph Neural Networks // IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(1): 4-24, [https://doi.org/10.1109/TNNLS.2020.2978386 DOI], [https://arxiv.org/pdf/1901.00596.pdf ArXiv].
# Zhang, S. et al. Graph convolutional networks: a comprehensive review // Computational Social Networks, 2019, 6(11), [https://doi.org/10.1186/s40649-019-0069-y DOI].
# Xie, Y. et al. Self-Supervised Learning of Graph Neural Networks: A Unified Review, 2021, [https://arxiv.org/pdf/2102.10757.pdf ArXiv].
# Wikipedia: [https://en.wikipedia.org/wiki/Laplacian_matrix Laplacian matrix], [https://en.wikipedia.org/wiki/Discrete_Poisson_equation Discrete] [https://en.wikipedia.org/wiki/Poisson%27s_equation Poisson's equation], [https://en.wikipedia.org/wiki/Graph_Fourier_transform Graph FT]. A one-layer graph convolution sketch follows this list.
# [https://github.com/thunlp/GNNPapers GNN papers collection].
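A minimal sketch of one graph-convolution layer in the symmetric-normalization style popularized by Kipf and Welling, H' = σ(D^{-1/2}(A+I)D^{-1/2} H W), on a toy four-node graph; the graph, features, and weights are illustrative assumptions.
<source lang="python">
import numpy as np

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)     # toy adjacency matrix
A_hat = A + np.eye(4)                          # add self-loops
d = A_hat.sum(1)
A_norm = A_hat / np.sqrt(np.outer(d, d))       # D^{-1/2} A_hat D^{-1/2}

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))                # node features
W = rng.standard_normal((3, 2))                # layer weights
H_next = np.maximum(A_norm @ H @ W, 0.0)       # aggregate, transform, ReLU
print(H_next)
</source>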
 
===Higher order Fourier transform===
# Zongyi Li et al., Fourier Neural Operator for Parametric Partial Differential Equations // ICLR, 2021, [https://arxiv.org/abs/2010.08895 ArXiv].
# Fourier for fun and practice, 1D: [https://morioh.com/p/18b3158eab36 Fourier code] (see also the FFT example after this list).
# Fourier for fun and practice, nD:
#* Fourier analysis on manifolds, 5G, page 49.
#* Spectral analysis on meshes.
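A minimal 1-D worked example of the discrete Fourier transform: recover the two tones of a synthetic signal with NumPy's FFT. The sampling rate and frequencies are illustrative assumptions.
<source lang="python">
import numpy as np

fs = 1000                                    # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

X = np.fft.rfft(x)                           # spectrum of the real signal
freqs = np.fft.rfftfreq(len(x), 1 / fs)
peaks = freqs[np.argsort(np.abs(X))[-2:]]    # two strongest frequency bins
print(sorted(peaks))                         # -> [50.0, 120.0]
</source>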
 
===Spherical Regression===
# Shuai Liao, Efstratios Gavves, Cees G. M. Snoek, Spherical Regression: Learning Viewpoints, Surface Normals and 3D Rotations on N-Spheres // CVPR, 2019, pp. 9759-9767, [https://openaccess.thecvf.com/content_CVPR_2019/html/Liao_Spherical_Regression_Learning_Viewpoints_Surface_Normals_and_3D_Rotations_on_CVPR_2019_paper.html CVF]. A projection-and-cosine-loss sketch follows.
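The core idea of spherical regression is to constrain predictions to the unit n-sphere. Here is a minimal sketch of that constraint with a cosine loss; the loss form and toy values are illustrative assumptions, not the paper's exact gradient construction.
<source lang="python">
import numpy as np

def project_to_sphere(v, eps=1e-12):
    # Normalize each row to unit length, so outputs live on the n-sphere.
    return v / (np.linalg.norm(v, axis=-1, keepdims=True) + eps)

def cosine_loss(pred, target):
    # 1 - cos(angle) between predicted and target directions.
    return 1.0 - (project_to_sphere(pred) * project_to_sphere(target)).sum(-1)

pred = np.array([[0.3, -2.0, 0.1]])     # raw network output (illustrative)
target = np.array([[0.0, -1.0, 0.0]])   # ground-truth unit direction
print(cosine_loss(pred, target))
</source>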
 
===Category theory===
# Tai-Danae Bradley, What is Applied Category Theory?, 2018, [https://arxiv.org/pdf/1809.05923.pdf ArXiv], [https://www.math3ma.com/ blog].
# F. William Lawvere, Conceptual Mathematics: A First Introduction to Categories, 2011, [https://img.4plebs.org/boards/tg/image/1460/05/1460059215690.pdf PDF].
# Картан А. Дифференциальное исчисление. Дифференциальные формы (Differential Calculus. Differential Forms), 1971, [https://lib.mipt.ru/book/21021/Kartan-A-Differentsialnoe-ischislenie-Differentsialnye-formy.djvu lib.mipt.ru].
# Wikipedia: [https://en.wikipedia.org/wiki/Homology_(mathematics) Homology], [https://en.wikipedia.org/wiki/Topological_data_analysis Topological data analysis].
 
=== Geometric algebra ===
# Exterior product and quaternions (a quaternion multiplication sketch follows this list).
# Nick Lucid, Advanced Theoretical Physics, 2019, [https://www.scienceasylum.com/AP_Sample.pdf sample].
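Since the list opens with quaternions, here is a minimal sketch of the Hamilton product in the (w, x, y, z) convention; quaternions realize the even subalgebra of the 3-D geometric algebra and encode rotations. The test values are illustrative.
<source lang="python">
import numpy as np

def quat_mul(q, r):
    # Hamilton product of quaternions q = w + xi + yj + zk.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

i = np.array([0., 1., 0., 0.])
j = np.array([0., 0., 1., 0.])
print(quat_mul(i, j))   # -> k = [0, 0, 0, 1], as i*j = k
</source>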
 

Версия 21:29, 6 сентября 2021

Содержание




Fundamental theorems

W: Inverse function theorem and Jacobian

Личные инструменты