Data Science =

Math (Fourier FFT) +

AI (Convolutional Neural Network) +

Big Data

https://www.kdnuggets.com/2020/02/fourier-transformation-data-scientist.html


AI and Big Data are Twins, their Mother is Math.

“AI 3.0“ today, although impressive in “Deep Learning“, still relies on “primitive” high-school Math, namely:

- Statistics,
- Probability (Bayesian),
- Calculus (Gradient Descent).
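To make the "Calculus (Gradient Descent)" point concrete, here is a minimal sketch (the toy loss function and learning rate are my own illustration, not from any particular framework): gradient descent is nothing more than repeatedly stepping against the derivative.

```python
# Minimal sketch: gradient descent on a toy loss f(w) = (w - 3)^2.
# The function, learning rate, and step count are illustrative assumptions.

def loss(w):
    return (w - 3) ** 2

def grad(w):
    return 2 * (w - 3)       # f'(w) = 2(w - 3): basic Calculus

w = 0.0                      # initial guess
lr = 0.1                     # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)        # step AGAINST the gradient (descent)

print(round(w, 4))           # converges to the minimum at w = 3
```

Each step shrinks the error (w − 3) by a constant factor (1 − 2·lr), so after 100 steps the guess is effectively at the minimum.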

AI has not yet taken advantage of the power of post-WWII modern Math, especially the IT-related branches, i.e.:

- Category Theory (Functional Programming),
- Algebraic Topology: Homology (Big Data Analytics),
- Homotopy Type Theory ‘HoTT’ (machine-proving Math theorems).
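As a tiny illustration of the Category Theory → Functional Programming link (a sketch of my own, not from the article): the list type behaves as a functor, and mapping a function over a list preserves composition — one of the functor laws.

```python
# Sketch: the list "functor" maps functions over contents.
# Functor law: fmap(g . f) == fmap(g) composed with fmap(f).

def fmap(f, xs):
    return [f(x) for x in xs]

def compose(g, f):
    return lambda x: g(f(x))

f = lambda x: x + 1
g = lambda x: x * 2
xs = [1, 2, 3]

lhs = fmap(compose(g, f), xs)   # map the composed function once
rhs = fmap(g, fmap(f, xs))      # map f, then map g
print(lhs == rhs)               # the functor law holds: True
```

Functional languages like Haskell build this structure into the type system; Python can only demonstrate it informally, as above.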

That is the argument of the Harvard Math professor S.T. Yau 丘成桐 (the first Chinese Fields Medalist), who predicts that the future “AI 4.0“ can be smarter and more powerful.

… Current AI deals with Big Data as follows:

1. A purely statistical, experience-oriented approach, not one derived from Big Data’s inherent mathematical structures (e.g. Homology or Homotopy).

2. The data-analytical result is environment-specific and lacks portability to other environments.

…

3. Lack of effective algorithms, especially in Algebraic Topology, which computes Homology or Co-homology using Linear Algebra (matrices).
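Point 3 can be made concrete: Homology is computed from boundary matrices using plain Linear Algebra. A minimal sketch (my own toy example: a hollow triangle, which is a circle up to homotopy) using NumPy rank computations:

```python
import numpy as np

# Hollow triangle: 3 vertices (v0, v1, v2), 3 edges (e01, e12, e02), no 2-cells.
# Boundary matrix d1 sends each edge to (head vertex) - (tail vertex).
d1 = np.array([
    [-1,  0, -1],   # v0
    [ 1, -1,  0],   # v1
    [ 0,  1,  1],   # v2
])

rank_d1 = np.linalg.matrix_rank(d1)
b0 = 3 - rank_d1    # Betti 0 (components) = dim C0 - rank d1
b1 = 3 - rank_d1    # Betti 1 (loops) = dim ker d1; no 2-cells to quotient by

print(b0, b1)       # 1 connected component, 1 loop -- a circle
```

The same rank computations scale (in principle) to the large boundary matrices arising in Big Data analytics, e.g. persistent homology.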

4. Limited by hardware speed (e.g. GPU), AI is reduced to a layered-structure problem-solving approach. That is simple mathematical analysis, not the REAL **Boltzmann Machine**, which searches for the most optimum solution.
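The Boltzmann Machine's search for a global optimum rests on sampling from the Boltzmann distribution; a much simpler cousin of the same idea is simulated annealing with the Metropolis acceptance rule. A toy sketch of my own (the energy function, schedule, and constants are illustrative assumptions):

```python
import math, random

random.seed(42)

def energy(x):
    return (x - 1) ** 2        # toy energy landscape; global minimum at x = 1

x, T = 10.0, 5.0               # start far away, at a high "temperature"
for step in range(2000):
    x_new = x + random.uniform(-1, 1)          # propose a random move
    dE = energy(x_new) - energy(x)
    # Metropolis rule: always accept downhill moves; accept uphill moves
    # with Boltzmann probability exp(-dE / T), which lets the search
    # escape local traps instead of greedily descending.
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = x_new
    T = max(T * 0.995, 1e-3)   # cool down gradually

print(x)                       # settles near the global minimum x = 1
```

A layered feed-forward network has no such stochastic escape mechanism, which is the contrast the point above is drawing.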

**Notes**:

**AI 1.0** : 1950s. Alan Turing and MIT’s John McCarthy (who coined the term “AI” and invented the Lisp language).

**AI 2.0** : 1970s/80s. “Rule-Based Expert Systems” using Fuzzy Logic.

[**AI Winter** : 1990s/2000s, after the failure of the ambitious Japanese “5th Generation Computer” project based on Prolog-style “Predicate” Logic.]

**AI 3.0** : 2010s – now. “Deep Learning” by Prof Geoffrey Hinton, using primitive Math (Statistics, Probability, Calculus / Gradient Descent).

**AI 4.0 :** Future. Using “Propositional Type” Logic, Topology (Homology, Homotopy), Linear Algebra, Category Theory.

Microsoft’s 40+ series of FREE Python tutorials on YouTube: AI / Machine Learning, Data Analytics, Automation Scripting.

Github:

“No, Machine Learning is not just glorified Statistics” by Joe Davison https://link.medium.com/fv3z50FDYY

**Simplest explanation by Cheh Wu**:

(4 Parts Video : auto-play after each part)

**The Math Theory** behind Gradient Descent: “Multi-Variable Calculus”, developed by Augustin-Louis Cauchy (19th century, France).

**1. Revision: Dot Product of Vectors**

**2. Directional Derivative**

**3. Gradient Descent (opposite = Ascent)**
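The three steps above can be sketched numerically (the function f(x, y) = x² + y² and all constants are my own toy choices):

```python
# f(x, y) = x^2 + y^2, with gradient (2x, 2y).
def grad(x, y):
    return (2 * x, 2 * y)

# 1. Dot product of two vectors.
def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

# 2. Directional derivative at point p along unit vector u = grad(p) . u
p = (3.0, 4.0)
u = (1.0, 0.0)                     # direction: along the x-axis
d_dir = dot(grad(*p), u)           # = 2 * 3 = 6.0

# 3. Gradient DESCENT: step opposite the gradient (ascent steps along it).
x, y = p
lr = 0.1
for _ in range(200):
    gx, gy = grad(x, y)
    x, y = x - lr * gx, y - lr * gy

print(d_dir, x, y)                 # descends to the minimum at (0, 0)
```

The dot product (step 1) gives the directional derivative (step 2); choosing the direction of steepest decrease, i.e. minus the gradient, gives descent (step 3).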

**Deep Learning with Gradient Descent:**

In memory of Prof Zhang Shoucheng 张首晟教授 who passed away on 1 Dec 2018.

Key Points:

- **Quantum Computing** with the “**Angel Particle**” (no anti-particle). [Analogy] Complex number (a + i.b): ‘anti’ = conjugate = a – i.b; ‘no anti’ = real number = a.
- **A.I. Natural Language Algorithm**: “**Word To Vector**”, e.g. King / Queen (words that frequently appear together), etc.
- **Data Privacy and Big Data Analytics with A.I.**: **Homomorphic Encryption**, i.e. analyze the data without revealing the private values (e.g. the Millionaires’ Problem).
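Homomorphic encryption can be illustrated with textbook (unpadded) RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields the ciphertext of the product, so a server can compute on data it cannot read. A toy sketch with deliberately tiny, insecure numbers (my own illustration; real privacy-preserving systems use schemes such as Paillier or fully homomorphic encryption):

```python
# Textbook RSA with toy parameters -- insecure, for illustration only.
p, q = 11, 13
n = p * q                        # modulus: 143
phi = (p - 1) * (q - 1)          # 120
e = 7                            # public exponent, coprime to phi
d = pow(e, -1, phi)              # private exponent (modular inverse, Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 6, 7
c = (enc(a) * enc(b)) % n        # the server multiplies CIPHERTEXTS only
print(dec(c))                    # the owner decrypts: a * b = 42
```

The server never sees 6 or 7, yet the decrypted result is their product — “analytics without revealing privacy” in miniature.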