https://www.kdnuggets.com/2021/03/3-mathematical-laws.html
Zipf’s Law (Word Frequency)
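A minimal Python sketch (my own toy corpus, not the KDnuggets article’s data) of what Zipf’s Law claims: a word’s frequency is roughly proportional to 1 / rank, so on a large corpus rank × frequency stays roughly constant.

```python
# Rank words by frequency; under Zipf's law, frequency ~ C / rank on a large corpus.
# (Toy corpus of my own; a real text is needed to see the law clearly.)
from collections import Counter

text = "the quick brown fox jumps over the lazy dog the fox the dog"
counts = Counter(text.split())

for rank, (word, freq) in enumerate(sorted(counts.items(), key=lambda kv: -kv[1]), start=1):
    print(rank, word, freq, rank * freq)   # rank * freq would flatten out on real data
```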



Part 1: “Homology” without pre-requisites, except “Function” (which he un-rigorously interchanges with “Mapping”, although a Function is stricter, with one and only one image).
Part 2: Simplex (单纯形)
Topology History : Euler Characteristic, e.g. V – E + R = 2 for a planar graph (Vertices, Edges, Regions; a cube drawn flat gives 8 – 12 + 6 = 2),
Poincaré’s invention
This video uses the algebra of points, lines, triangles… to explain a Simplex (plural: Simplices) in R^n space, i.e. organizing the n-dimensional “Big Data” points into Simplices, then (in future Parts 3, 4…) computing the “holes” (a pattern called Persistent Homology).
Part 3: Boundary
Part 3 justifies why triangles (formed by any 3 data points), called a “Simplex” 单纯形 (plural: Simplices), are best for filling any Big Data space; a toy construction is sketched below.
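A hedged Python sketch (my own construction, not from the video) of the idea in Parts 2–3: connect any data points in R^n lying within a distance epsilon to form edges (1-simplices) and triangles (2-simplices), Vietoris–Rips style; a loop of points with no filling triangles encloses a “hole”.

```python
# Build a tiny Vietoris-Rips-style complex from a point cloud in R^n:
# an edge when two points are within epsilon, a triangle when all three pairs are.
# (My own toy construction and data.)
from itertools import combinations
import numpy as np

def rips_complex(points, epsilon):
    points = np.asarray(points, dtype=float)
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    vertices = [(i,) for i in range(n)]
    edges = [(i, j) for i, j in combinations(range(n), 2) if dist[i, j] <= epsilon]
    triangles = [(i, j, k) for i, j, k in combinations(range(n), 3)
                 if max(dist[i, j], dist[i, k], dist[j, k]) <= epsilon]
    return vertices, edges, triangles

# Four points at the corners of a square; epsilon keeps the diagonals out,
# so the complex is a loop with a "hole" that no triangle fills.
V, E, T = rips_complex([[0, 0], [1, 0], [1, 1], [0, 1]], epsilon=1.1)
print(len(V), "vertices,", len(E), "edges,", len(T), "triangles")
```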
Differential Equations Versus Machine Learning | by Col Jung | Nov, 2020 | Medium
Data Science now applies Algebraic Topology : Persistent Homology.
https://en.m.wikipedia.org/wiki/Topological_data_analysis
Robin Li 李彦宏 (Baidu CEO): Cambridge University Speech
《3 waves of Internet》:
1) PC- based (1997-)
2) Mobile-based (2010 -)
3) AI-based (2017 – now)
“Hands-on Time Series Forecasting with Python” by Idil Ismiguzel https://link.medium.com/KulDiXl816
“Functional Programming in Data Science Projects” by Nathanael Weill https://link.medium.com/UiysKbFl16
The French Math distinguishes the Correspondence (对应) in two definitions: a Mapping (Application), where every element of E has exactly 1 arrow to F, and a Function, which allows at most 1 arrow from E to F (some elements of E may have no image); a toy sketch follows below.
Counter Examples:
Key points:
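A toy Python sketch (my own finite-set model, not from the quoted definition) of the distinction above: an Application (Mapping) gives every element of E exactly one image in F, while a Function allows some elements of E to have no image at all.

```python
# Model E -> F correspondences as dicts (a dict already enforces "at most one arrow").
# (My own toy sets and names.)
E, F = {1, 2, 3}, {"a", "b"}

application_g = {1: "a", 2: "b", 3: "a"}   # every element of E has exactly one image
function_f = {1: "a", 3: "b"}              # element 2 has no image, still at most one arrow each

def is_application(arrows, domain, codomain):
    """Exactly one image in the codomain for every element of the domain."""
    return set(arrows) == set(domain) and set(arrows.values()) <= set(codomain)

def is_function(arrows, domain, codomain):
    """At most one image per element of the domain."""
    return set(arrows) <= set(domain) and set(arrows.values()) <= set(codomain)

print(is_application(application_g, E, F), is_function(application_g, E, F))  # True True
print(is_application(function_f, E, F), is_function(function_f, E, F))        # False True
```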
https://plus.maths.org/content/how-can-maths-fight-pandemic
…
Coding is about algorithms, which are Math. AI, Machine Learning, Deep Learning, Big Data, 5G Polar Codes, etc. are all about Math.
[MIT OCW Online Course Videos]
[Full Video]
AI and Big Data are Twins, their Mother is Math.
“AI 3.0” today, although impressive in “Deep Learning”, still uses “primitive” high-school Math, namely Statistics, Probability and Calculus.
AI has not yet taken advantage of the power of post-modern Math invented since WW II, esp. the IT-related branches, i.e. Type Logic, Topology (Homology, Homotopy), Linear Algebra and Category Theory (see AI 4.0 in the Notes below).
That is the argument of Prof S.T. Yau 丘成桐 (former Chair of the Harvard Math Department and the first Chinese Fields Medalist), who predicts that a future “AI 4.0” could be smarter and more powerful.
… Current AI deals with Big Data:
…
3. Lacks effective Algorithms, esp. from Algebraic Topology, which computes Homology or Co-homology using Linear Algebra (Matrices); see the sketch after this list.
4. Limited by Hardware Speed (e.g. GPU), so it is reduced to a layered-structure problem-solving approach. That is a simplified mathematical analysis, not the REAL Boltzmann Machine, which finds the optimal solution.
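For point 3, a hedged Python sketch (my own minimal example) of Homology computed with nothing but Linear Algebra: Betti numbers read off from the ranks of boundary matrices, here for a hollow triangle (one connected component, one 1-dimensional hole).

```python
# Betti numbers from boundary-matrix ranks (homology over Q via plain Linear Algebra).
# Complex: a hollow triangle with vertices 0, 1, 2 and edges (0,1), (0,2), (1,2).
# (My own minimal example.)
import numpy as np

edges = [(0, 1), (0, 2), (1, 2)]
n_vertices = 3

# Boundary matrix d1: rows = vertices, columns = edges, boundary(a, b) = b - a.
d1 = np.zeros((n_vertices, len(edges)))
for col, (a, b) in enumerate(edges):
    d1[a, col] = -1.0
    d1[b, col] = +1.0

rank_d1 = np.linalg.matrix_rank(d1)
rank_d2 = 0                                # no triangle is filled in, so d2 = 0

betti0 = n_vertices - rank_d1              # number of connected components
betti1 = len(edges) - rank_d1 - rank_d2    # number of independent 1-dimensional holes
print("Betti numbers:", betti0, betti1)    # 1 component, 1 hole
```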
Notes:
AI 1.0 : 1950s, with Alan Turing and MIT’s John McCarthy (who coined the term “AI” and invented the Lisp language).
AI 2.0 : 1970s/80s. “Rule-Based Expert Systems” using Fuzzy Logic.
[AI Winter : 1990s/2000s. The ambitious Japanese “5th Generation Computer” project, built on Prolog’s “Predicate” Logic, failed.]
AI 3.0 : 2010s – now. “Deep Learning” by Prof Geoffrey Hinton, using primitive Math (Statistics, Probability, Calculus Gradient Descent); a gradient-descent sketch follows these Notes.
AI 4.0 : Future. Using “Propositions as Types” Logic, Topology (Homology, Homotopy), Linear Algebra, Category Theory.
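For the “primitive” Calculus behind AI 3.0’s Deep Learning, a minimal Python sketch (my own toy loss, not from any of the linked articles) of Gradient Descent: repeatedly step against the derivative until the loss bottoms out.

```python
# Minimal sketch of gradient descent on a toy loss f(w) = (w - 3)^2,
# whose derivative is f'(w) = 2 * (w - 3).  (My own example.)
w, learning_rate = 0.0, 0.1
for step in range(50):
    grad = 2.0 * (w - 3.0)       # slope of the loss at the current weight
    w -= learning_rate * grad    # move downhill, against the gradient
print(round(w, 4))               # approaches the minimum at w = 3
```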
https://www.analyticsvidhya.com/blog/2019/10/mathematics-behind-machine-learning/
Data Science & Machine Learning (AI is a sub-discipline) overlap but are not the same:
“Statistical Significance Explained” by Will Koehrsen
Normal Distribution
Convert to Z-score
p-Value vs Alpha (typically α = 0.05; a result with p above α is treated as noise); see the sketch below.
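A hedged Python sketch (my own numbers; scipy assumed available) of the three bullets above: take a Normal distribution, convert the sample mean to a z-score, then compare the p-value against α = 0.05.

```python
# Sketch of a two-sided z-test: Normal distribution -> z-score -> p-value vs alpha.
# (Toy numbers of my own; scipy is assumed to be installed.)
from math import sqrt
from scipy.stats import norm

mu0, sigma = 100.0, 15.0        # null-hypothesis mean and known population std dev
sample_mean, n = 105.0, 36      # observed sample mean and sample size

z = (sample_mean - mu0) / (sigma / sqrt(n))   # convert to a z-score
p_value = 2 * norm.sf(abs(z))                 # two-sided tail probability

alpha = 0.05                                  # conventional significance threshold
print(f"z = {z:.2f}, p = {p_value:.4f}, significant: {p_value < alpha}")
```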
Advanced Programming needs Advanced Math, e.g.:
Video Game Animation: Verlet Integration
AI: Stats, Probability, Calculus, Linear Algebra
Search Engine : PageRank: Linear Algebra (see the sketch after this list)
Abstraction in Program “Polymorphism” : Monoid, Category, Functor, Monad
Program “Proof” : Propositions as Types, HoTT
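As referenced in the Search Engine item above, a hedged Python sketch (my own 3-page toy web) of PageRank as Linear Algebra: power iteration on the damped, column-stochastic link matrix converges to its dominant eigenvector, which is the ranking.

```python
# PageRank as Linear Algebra: power iteration on a toy 3-page link matrix.
# (My own toy graph; not Google's actual data or code.)
import numpy as np

# Column j holds page j's outgoing links, each normalized to share its rank equally.
L = np.array([[0.0, 0.5, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0]])
d, n = 0.85, L.shape[0]
G = d * L + (1 - d) / n * np.ones((n, n))   # damped "Google matrix"

rank = np.ones(n) / n                        # start with uniform importance
for _ in range(100):
    rank = G @ rank                          # power iteration
print(np.round(rank, 3))                     # steady-state importance of each page
```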
https://awalterschulze.github.io/blog/post/neglecting-math-at-university/
Abstraction: Monoid, Category
Category
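A minimal Python sketch (my own examples) of the Monoid abstraction mentioned above: a set with an associative binary operation and an identity element, which is exactly what a generic fold/reduce relies on.

```python
# Two monoids, (int, +, 0) and (str, +, ""), consumed by the same generic fold.
# Associativity is what allows the fold to be regrouped or parallelized.
from functools import reduce

def combine(a, b):
    return a + b                                   # associative in both monoids

print(reduce(combine, [1, 2, 3, 4], 0))            # 10, identity is 0
print(reduce(combine, ["mon", "oid", "!"], ""))    # "monoid!", identity is ""
```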
http://www.datastuff.tech/machine-learning/why-do-neural-networks-need-an-activation-function/
Activation function: Non-Linear Function
..
What if there is no Activation function: the stacked layers collapse into a single Affine Transformation (see the sketch below).
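A hedged NumPy sketch (my own example) of the point above: stacked layers with no activation collapse into one Affine Transformation, which is why a non-linear activation (e.g. ReLU) is needed for extra expressive power.

```python
# Without an activation, W2 @ (W1 @ x + b1) + b2 is just one affine map W @ x + b.
# (My own toy dimensions and random weights.)
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

two_linear_layers = W2 @ (W1 @ x + b1) + b2
W, b = W2 @ W1, W2 @ b1 + b2                       # the equivalent single affine map
print(np.allclose(two_linear_layers, W @ x + b))   # True: no depth was gained

def relu(z):
    return np.maximum(z, 0)                        # a non-linearity breaks the collapse
print(W2 @ relu(W1 @ x + b1) + b2, W @ x + b)      # generally different outputs
```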
Advanced Mathematical Methods combined with AI are a powerful tool:
https://sinews.siam.org/Details-Page/mathematical-molecular-bioscience-and-biophysics-1