45 free lectures on Deep Learning.
- Hinton began neural-network research in the 1970s.
- AI was in vogue in the 80s, mainly knowledge-based expert systems: inference engines only, with NO self-learning capability.
- The “AI Winter” came in the 90s.
- He could not get AI funding in the UK.
- Refusing US military funding, he moved to the University of Toronto, where Canadian funding supported pure basic AI research.
- Through 4 decades of perseverance with neural networks, he invented the “Deep Learning” algorithm using a new approach: machine ‘self-learning’ by training on Big Data, correcting the model from the variance between predicted output and actual values using “Gradient Descent”, a calculus technique from the 19th-century French mathematician Cauchy (see the sketch under “Gradient Descent” below).
- Hinton thanks Canada for Basic Research Funding.
- Now working for Google.
Notes: The keys to Hinton’s success:
- Cross-discipline of 3 skills: Psychology + Math + IT. Chinese proverb: 三个臭皮匠, 胜过一个诸葛亮 (three ‘smelly’ cobblers together beat Zhuge Liang, the smartest military strategist of the Chinese Three Kingdoms).
- Failures, but with perseverance (4 decades).
- Courage (withstanding loneliness) paired with vision (seeing light at the end of the tunnel).
- Seek out a conducive research environment: Canada’s basic-research funding.
- Stick to personal principles: science for the peace of mankind, no ‘military’ involvement.
Gradient Descent in Neural Networks (video here):
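To make the “learn from the variance between output vs actual” loop concrete, here is a minimal gradient-descent sketch in Python. It is not code from the lectures: the toy data (y = 3x plus noise), the learning rate, and the step count are all illustrative assumptions. It fits a single weight w by repeatedly stepping downhill along the gradient of the squared error between predicted and actual outputs.

```python
import numpy as np

# Toy data under the assumption y = 3x + noise; the goal is to recover w = 3.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100)
y = 3.0 * x + rng.normal(0.0, 0.1, 100)

w = 0.0        # initial guess for the weight
lr = 0.1       # learning rate (step size), an illustrative choice
for step in range(200):
    y_pred = w * x                    # model output
    error = y_pred - y                # variance between output and actual
    grad = 2.0 * np.mean(error * x)   # d(mean squared error) / dw
    w -= lr * grad                    # step against the gradient (Cauchy, 1847)
print(f"learned w = {w:.3f}")         # converges close to 3.0
```

A real neural network has millions of weights and obtains the gradient via backpropagation, but the update rule, w -= lr * grad, is exactly this.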
Functional Programming: “Clojure”-based Deep Learning:
An old trick, hashing, takes advantage of the inherent sparsity in Big Data to cut computation time by about 90% while losing less than 1% of accuracy.
For example: in picture recognition, most of the data are blanks consisting of background (scenery, lighting); less than 10% are the striking pattern data that characterise the particular object, such as a zebra’s stripes, a tiger’s body markings, an elephant’s trunk, a sunflower… When such data are stored in a matrix with billions of rows and columns, 90% of the elements are 0 (a sketch of both ideas follows).
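Here is a minimal Python sketch of both ideas; it is not from the article. The matrix cells, the feature names, and the 1,024-bucket count are illustrative assumptions, and the hashing shown is the generic “feature hashing” trick. The article may have a different variant in mind (e.g. locality-sensitive hashing), but the principle is the same: touch only the few informative non-zeros instead of the billions of zeros.

```python
import hashlib

# --- 1. Sparse storage: keep only the non-zero "striking pattern" cells. ---
# Assumption: a hypothetical huge feature matrix in which ~90% of entries
# are 0 (background). A dict keyed by (row, col) stores only the non-zeros.
sparse = {}
sparse[(3, 7)] = 0.9      # e.g. a zebra-stripe feature (hypothetical cell)
sparse[(42, 1)] = 0.4     # e.g. an edge feature (hypothetical cell)

def lookup(row, col):
    """A missing key is the implicit 0 of the sparse matrix."""
    return sparse.get((row, col), 0.0)

# --- 2. The hashing trick: fold a huge sparse feature space into a small ---
# fixed-size vector, trading a little accuracy (hash collisions) for large
# savings in time and memory. The bucket count is an illustrative assumption.
N_BUCKETS = 1_024

def bucket(feature_name: str) -> int:
    """Stable hash of a feature name to one of N_BUCKETS slots."""
    digest = hashlib.md5(feature_name.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "little") % N_BUCKETS

def hash_features(active_features):
    """Turn a sparse set of active features into a dense N_BUCKETS vector."""
    vec = [0.0] * N_BUCKETS
    for name, value in active_features:
        vec[bucket(name)] += value    # collisions simply add together
    return vec

vec = hash_features([("zebra_stripes", 0.9), ("elephant_trunk", 0.7)])
print(sum(1 for v in vec if v != 0.0), "of", N_BUCKETS, "slots used")
```

Because only the active features are ever stored or hashed, the cost scales with the non-zeros (under 10% of the cells in the example above) rather than with the full dense matrix, which is where the claimed ~90% time saving comes from.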