What are some great mathematics courses around the world for someone who is passionately curious about mathematics (especially abstract and discrete maths), is not a math major, but has knowledge of mathematics above the high-school level?
Math Challenge: Teaching Subject
[Hint]: Assume 3 cases, then eliminate using Mr Brown's hint (some things are right, some things are wrong).
My favorite: Fermat's Little Theorem via Pascal's Triangle
Fermat's Little Theorem: for any prime p and any integer m,
m^p ≡ m (mod p).
When m = 2:
2^p ≡ 2 (mod p).
Note: the m = 2 case appears in ancient Chinese mathematics (九章算数, the Nine Chapters on the Mathematical Art).
Pascal's Triangle (France, 1653 AD) is the same as the Yang Hui Triangle (杨辉三角; Yang Hui, 1238 AD – 1298 AD).
1 4 6 4 1 => sum = 16 = 2^4, but 16 ≢ 2 (mod 4), since 4 is not prime. In general, row n of the triangle sums to 2^n; when p is prime, every interior entry C(p, k) with 0 < k < p is divisible by p, so 2^p ≡ C(p, 0) + C(p, p) = 2 (mod p).
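The pattern above can be checked directly: row n of Pascal's triangle sums to 2^n, and for prime p the interior entries are all divisible by p, forcing 2^p ≡ 2 (mod p). A minimal sketch in Python (the helper names are mine, for illustration):

```python
from math import comb

def pascal_row(n):
    """Row n of Pascal's triangle: C(n, 0), ..., C(n, n)."""
    return [comb(n, k) for k in range(n + 1)]

def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

for n in range(2, 10):
    total = sum(pascal_row(n))  # always equals 2**n
    # Fermat's Little Theorem (m = 2): 2**p ≡ 2 (mod p) holds when p is prime
    print(n, pascal_row(n), total, total % n == 2 % n, is_prime(n))
```

For n = 4 the row is 1 4 6 4 1: the interior entry 6 is not divisible by 4, and indeed 16 mod 4 = 0, not 2.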
[PODCAST]
https://kpknudson.com/myfavoritetheorem/2017/9/13/episode4jordanellenberg
Pros and Cons of Neural Networks
Deep Learning enjoys massive hype at the moment. People want to use Neural Networks everywhere, but are they always the right choice? The following sections discuss this, along with why Deep Learning is so popular right now. After reading, you will know the main disadvantages of Neural Networks and have a rough guideline for choosing the right type of algorithm for your Machine Learning problem. You will also learn about what I think is one of the major problems in Machine Learning we are facing right now.
Table of Contents:

Why Deep Learning is so hyped
  Data
  Computational Power
  Algorithms
  Marketing

Neural Networks vs. traditional Algorithms
  Black Box
  Duration of Development
  Amount of Data
  Computationally Expensive

Summary
Conclusion
Why Deep Learning is so hyped
Deep Learning enjoys its current hype for four main reasons. These are data, computational power…
The most addictive theorem in Applied mathematics
What is your favorite theorem?
I have two theorems that triggered my love of math:
 Chinese Remainder Theorem: known in China as 韩信点兵 ("Han Xin counts his soldiers"), named after Han Xin (韩信), a Han-dynasty general of genius (c. 200 BCE) said to have applied this modern "Modular Arithmetic" on the battlefield.
 Fermat's Last Theorem: the mathematical "prank" initiated by the 17th-century amateur mathematician Pierre de Fermat, which kept the world busy for more than 350 years until Andrew Wiles proved it in 1994.
Note 1: Lycée Pierre de Fermat (Classe Préparatoire), my alma mater, happens to be named after this great mathematician, born in the same southern-France "Airbus city", Toulouse.
Note 2: Another of his theorems, Fermat's Little Theorem, is used in modern computer cryptography.
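For the Chinese Remainder Theorem above, the classic "Han Xin counts his soldiers" puzzle (a number leaving remainder 2 mod 3, 3 mod 5, and 2 mod 7) can be sketched in a few lines of Python. The function name and structure are my own illustration; `pow(x, -1, m)` for the modular inverse requires Python 3.8+:

```python
from math import prod

def crt(remainders, moduli):
    """Solve x ≡ r_i (mod m_i) for pairwise-coprime moduli via CRT."""
    M = prod(moduli)
    x = 0
    for r, m in zip(remainders, moduli):
        Mi = M // m
        # add r on the i-th "coordinate": Mi * inverse(Mi mod m) ≡ 1 (mod m)
        x += r * Mi * pow(Mi, -1, m)
    return x % M

# Han Xin's puzzle: x ≡ 2 (mod 3), x ≡ 3 (mod 5), x ≡ 2 (mod 7)
print(crt([2, 3, 2], [3, 5, 7]))  # → 23
```

The answer 23 is the smallest positive solution; every solution differs from it by a multiple of 3 × 5 × 7 = 105.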
Build a Neural Network in Python from scratch
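The linked post's own code is not reproduced here, but as a taste of what "from scratch" means, here is a minimal single-neuron network trained by gradient descent in pure Python (the toy data and all names are my own sketch, not the post's):

```python
import math
import random

def sigmoid(z):
    """Logistic activation, squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, epochs=2000, seed=0):
    """Fit one sigmoid neuron p = sigmoid(w*x + b) by gradient descent
    on squared error, one example at a time."""
    random.seed(seed)
    w, b = random.random(), random.random()
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(w * x + b)
            # d(squared error)/d(pre-activation) = (p - y) * p * (1 - p)
            grad = (p - y) * p * (1 - p)
            w -= lr * grad * x
            b -= lr * grad
    return w, b

# Toy task: output ~0 for negative inputs, ~1 for positive inputs
data = [(-2, 0), (-1, 0), (1, 1), (2, 1)]
w, b = train(data)
print(sigmoid(w * 1 + b), sigmoid(w * -1 + b))  # high vs. low after training
```

A real network stacks many such neurons in layers and backpropagates the gradient through all of them, but the update rule per weight is this same idea.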
The Relationship between Geometry and Computational Mathematics (几何与计算数学的关系)
– Prof. Shing-Tung Yau 丘成桐 (tenured professor at Harvard University; Fields Medal 1982, Wolf Prize 2010)
AI must be supported by solid mathematical theory in order to develop fully.
This statement truly reflects the bottleneck faced by AI 2.0 (Expert Systems) in the 1980s, which relied on the non-rigorous mathematics of "Fuzzy Logic".
The current AI 3.0 (Deep Learning) computes with calculus (Cauchy's gradient descent); it is empirical, without proven mathematical-theoretical support.
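As a concrete illustration of the gradient-descent computation mentioned above, here is a minimal sketch in Python (the function names and the toy objective are mine):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Plain (Cauchy) gradient descent: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3)
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

Deep Learning applies this same update to millions of parameters at once, with the gradient computed by backpropagation; convergence guarantees for such non-convex objectives are exactly where the theory is still thin.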
New mathematical tools such as Persistent Homology (持续同调论) and Conformal Geometry (共形几何) may be the answer for a future AI 4.0.
Keywords:
1. 蒙日安培方程 Monge–Ampère Equation
2. 共形(保角)映射 Conformal Mapping
3. 仿射几何 Affine Geometry
4. 持续同调论 Persistent Homology
5. 叶状结构 Foliation Structure