Self-Study Advanced Math

What are some great mathematics courses all over the world for someone who is passionately curious about mathematics (especially abstract and discrete maths), is not a math major but has above high school knowledge of mathematics?

https://www.quora.com/What-are-some-great-mathematics-courses-all-over-the-world-for-someone-who-is-passionately-curious-about-mathematics-especially-abstract-and-discrete-maths-is-not-a-math-major-but-has-above-high-school-knowledge-of/answer/Cornelius-Goh?share=6272be8a&srid=oZzP


My favorite: Fermat's Little Theorem via Pascal's Triangle

Fermat's Little Theorem: for any prime p and any integer m,

\boxed {m^{p} \equiv m \mod p}

When m = 2,

\boxed{2^{p} \equiv 2 \mod p}

Note: the m = 2 case was reputedly known in ancient Chinese mathematics (九章算数, The Nine Chapters on the Mathematical Art).

Pascal's Triangle (France, 1653 AD) is the same as the Yang Hui Triangle (杨辉三角; Yang Hui, 1238 AD – 1298 AD).

1 \: 1 \implies sum = 2 = 2^1 \equiv 2 \mod 1

1\: 2 \:1\implies sum = 4 = 2^2 \equiv 2 \mod 2 \;(\equiv 0 \mod 2)

1 \:3 \:3 \:1 \implies sum = 8= 2^3 \equiv 2 \mod 3

1 \: 4 \: 6 \: 4 \: 1 \implies sum = 16 = 2^4 \not\equiv 2 \mod 4 (4 is non-prime)

1 \:5 \:10\: 10\: 5\: 1 \implies sum = 32= 2^5 \equiv 2 \mod 5

Why this works: for a prime p, every interior entry \binom{p}{k} (0 < k < p) of row p is divisible by p, so the row sum 2^p = \sum_{k=0}^{p} \binom{p}{k} \equiv \binom{p}{0} + \binom{p}{p} = 2 \mod p.
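Below is a minimal Python sketch of this check (the function name pascal_row and the sample values are my own, for illustration): it sums each row of Pascal's triangle and tests the congruence 2^p \equiv 2 \mod p.

```python
def pascal_row(n):
    """Row n of Pascal's triangle (row 0 is [1])."""
    row = [1]
    for k in range(n):
        row.append(row[-1] * (n - k) // (k + 1))
    return row

# The congruence holds for every prime p, and fails for these composites:
for p in [2, 3, 5, 7, 11, 4, 6, 9]:
    s = sum(pascal_row(p))        # row sum equals 2**p
    print(p, s, s % p == 2 % p)   # True for 2, 3, 5, 7, 11; False for 4, 6, 9
```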

[PODCAST]

https://kpknudson.com/my-favorite-theorem/2017/9/13/episode-4-jordan-ellenberg

Pros and Cons of Neural Networks

machinelearning-blog.com

Deep Learning enjoys massive hype at the moment. People want to use Neural Networks everywhere, but are they always the right choice? The following sections discuss this, along with why Deep Learning is so popular right now. After reading them, you will know the main disadvantages of Neural Networks and have a rough guideline for choosing the right type of algorithm for your current Machine Learning problem. You will also learn about what I think is one of the major problems in Machine Learning that we are facing right now.

Table of Contents:

  • Why Deep Learning is so hyped
    • Data
    • Computational Power
    • Algorithms
    • Marketing
  • Neural Networks vs. traditional Algorithms
    • Black Box
    • Duration of Development
    • Amount of Data
    • Computationally Expensive
  • Summary
  • Conclusion

Why Deep Learning is so hyped

Deep Learning enjoys its current hype for four main reasons. These are data, computational power…


The most addictive theorem in Applied mathematics

What is your favorite theorem?

I have two theorems that triggered my love of math:

  1. Chinese Remainder Theorem: 韩信点兵 (“Han Xin counts his soldiers”), named after Han Xin (韩信), a genius general of the Han dynasty around 200 BCE who is said to have applied this modern “modular arithmetic” on the battlefield (see the sketch after this list).
  2. Fermat’s Last Theorem: the math “prank” left by the 17th-century amateur mathematician Pierre de Fermat kept the world busy for more than 350 years, until Andrew Wiles proved it in 1994.
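As a quick illustration of the Chinese Remainder Theorem, here is a minimal Python sketch of the traditional Han Xin counting puzzle; the remainders (2, 3, 2) and moduli (3, 5, 7) are the classic textbook values, not taken from the text above.

```python
def crt(remainders, moduli):
    """Smallest non-negative x with x % m == r for each (r, m) pair.
    Assumes the moduli are pairwise coprime."""
    x, step = 0, 1
    for r, m in zip(remainders, moduli):
        while x % m != r:   # sieve upward by the product of the moduli solved so far
            x += step
        step *= m
    return x

print(crt([2, 3, 2], [3, 5, 7]))  # -> 23 soldiers (unique modulo 105)
```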

Note 1: Lycée Pierre de Fermat (Classe Préparatoire) happens to be my alma mater; it is named after this great mathematician, who was born near the same southern French “Airbus City”, Toulouse.

Note 2: Another of his theorems, Fermat’s Little Theorem, is used in modern computer cryptography.
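For instance, Fermat’s Little Theorem underlies the Fermat primality test used when generating cryptographic keys. A minimal sketch (the trial count is an arbitrary illustrative choice):

```python
import random

def fermat_test(n, trials=20):
    """Probabilistic primality test via Fermat's Little Theorem:
    if n is prime, then a**(n-1) % n == 1 for every a coprime to n."""
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:   # a is a witness: n is certainly composite
            return False
    return True                     # probably prime (Carmichael numbers can fool it)

print(fermat_test(97), fermat_test(91))  # -> True False (91 = 7 * 13)
```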

https://blogs.scientificamerican.com/roots-of-unity/the-most-addictive-theorem-in-applied-mathematics/

几何与计算数学的关系 (The Relationship Between Geometry and Computational Mathematics)

– Prof Shing-Tung Yau 丘成桐 (Harvard University tenured professor, Fields Medalist 1982, Wolf Prize 2010)

AI must be supported by a solid mathematical theory for it to develop further.

This statement truly reflects the bottleneck faced by AI 2.0 (Expert Systems) in the 1980s, which relied on the non-rigorous mathematics of “Fuzzy Logic”.

The current AI 3.0 (Deep Learning) computes with calculus (Cauchy’s gradient descent); it is empirical, without proven mathematical theory behind it.
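To make the point concrete, here is a minimal gradient descent sketch on a toy quadratic loss (the function, learning rate, and iteration count are my own illustrative choices, not from Yau’s talk):

```python
# Toy loss f(x, y) = x**2 + 10*y**2, with analytic gradient (2x, 20y).
def grad(x, y):
    return 2 * x, 20 * y

x, y, lr = 5.0, 2.0, 0.04            # start point and learning rate
for _ in range(200):
    gx, gy = grad(x, y)
    x, y = x - lr * gx, y - lr * gy  # step opposite the gradient
print(round(x, 8), round(y, 8))      # approaches the minimum at (0, 0)
```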

New mathematical tools such as Persistent Homology (持续同调论) and Conformal Geometry (共形几何) may be the answer for a future AI 4.0.

Keywords:
1. 蒙日-安培方程 Monge-Ampère Equation

2. 共形(保角)映射 Conformal (Angle-Preserving) Mapping

3. 仿射几何 Affine Geometry

4. 持续同调论 Persistent Homology

5. 叶状结构 Foliation Structure

https://mp.weixin.qq.com/s/ziOeL_2SAdVFaPxXmMQl6g