MATH3320 - Foundation of Data Analytics - 2021/22
Announcement
- There is no tutorial on Tuesday, September 7, 2021.
- A project has been uploaded; please check it below. The due date is November 30.
General Information
Lecturer
- Prof. Zeng Tieyong
- Office: LSB225
- Tel: 3943 7966
- Email:
Teaching Assistant
- Jianwei Niu
- Office: LSB 222A
- Tel: 3943 3575
- Email:
- Shen Mao
- Office: AB1 614
- Tel: 3943 4109
- Email:
Time and Venue
- Lecture: Mo 9:30AM - 10:15AM; Tu 12:30PM - 2:15PM (Mong Man Wai Bldg 702)
- Tutorial: Tu 9:30AM - 10:15AM (Yasumoto Int'l Acad Park 201)
Course Description
This course gives an introduction to computational data analytics, with an emphasis on its mathematical foundations. The goal is to carefully develop and explore the mathematical theories and methods that make up the backbone of modern mathematical data science, such as knowledge discovery in databases, machine learning, and mathematical artificial intelligence. Topics include the mathematical foundations of probability, linear approximation and its polynomial and high-dimensional extensions, proper orthogonal decomposition methods, optimization, and theories of nonlinear neural networks and approximation. Students taking this course are expected to have knowledge of basic linear algebra.
Advisory: MATH Majors should select not more than 5 MATH courses in a term.
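For readers new to the area, here is a minimal, informal sketch of the first topic above, linear approximation: fitting the best affine function to noisy samples in the least-squares sense. It is not part of the course materials; it assumes only NumPy and uses made-up data.

```python
# Illustrative sketch only: linear (affine) approximation by least squares.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.exp(x) + 0.05 * rng.standard_normal(x.size)   # noisy samples of exp(x)

# Design matrix for the affine model y ~ a + b*x
A = np.column_stack([np.ones_like(x), x])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)        # minimise ||A c - y||_2
a, b = coeffs
print(f"best affine approximation: y ~ {a:.3f} + {b:.3f} * x")
```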
Textbooks
- Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, The MIT Press, 2016.
- Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
- Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, The MIT Press, 2012.
- "Mathematics for Machine Learning" by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, Cambridge University Press.
References
- Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
- Richard Duda, Peter Hart and David Stork, Pattern Classification, 2nd Edition, Wiley-Interscience, 2001.
- Tom Mitchell, Machine Learning, 1st Edition, McGraw-Hill, 1997.
Pre-class Notes
- linear approximation
- Estimation
- Estimation_MLE
- Classification
- Gradient Descent (a short illustrative sketch follows this list)
- Cross validation
- Bayes
- Bayes Regression
- k-means clustering
- SVM - read this (Nov 15, 2021)
- K-NN
- PCA
- Probability
- Mixtures of Gaussians
- Mixtures of Gaussians (Video)
- Introduction to Deep Learning (MIT)
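As a companion to the Gradient Descent item above, the following is a minimal illustrative sketch of gradient descent on a least-squares objective. It is not taken from the course notes; it assumes only NumPy and uses synthetic data with an arbitrary fixed step size.

```python
# Illustrative sketch only: gradient descent on f(w) = 0.5 * ||X w - y||^2.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(100)

w = np.zeros(3)
step = 0.005                        # fixed step size (learning rate)
for _ in range(500):
    grad = X.T @ (X @ w - y)        # gradient of the least-squares objective
    w -= step * grad

print("estimated w:", np.round(w, 3))   # should be close to w_true
```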
Lecture Notes
Class Notes
- Notes on Linear Algebra (Jean Walrand)
- Linear Algebra
- Topics in Matrix Theory (SVD) - Sept 9, 2021 (a short SVD sketch follows this list)
- More on Multivariate Gaussians (Stanford)
- The Rank-Nullity Theorem
- Spectral Theorem
- Cholesky decomposition - Sept 8, 2021
- SVD (MIT) - Sept 9, 2021
- Probability Theory (Introduction)
- Optimization for Machine Learning (ENS)
- General EM algorithm
- SVM
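The SVD-related notes above can be previewed with a small example. The following sketch, assuming only NumPy and a random test matrix, illustrates the Eckart-Young fact that truncating the SVD gives the best rank-k approximation in spectral norm; it is illustrative only and not part of the notes.

```python
# Illustrative sketch only: best rank-k approximation via the SVD (Eckart-Young).
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 4))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 1
A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]     # rank-k truncation U_k S_k V_k^T

# The spectral-norm error equals the (k+1)-th singular value.
err = np.linalg.norm(A - A_k, 2)
print(f"sigma_{k+1} = {s[k]:.4f},  ||A - A_k||_2 = {err:.4f}")
```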
Tutorial Notes
- Tutorial 1
- Tutorial 2
- Tutorial 3
- Tutorial 4
- Tutorial 5
- Tutorial 6
- Tutorial 7
- Tutorial 8
- Tutorial 9
- Tutorial 10
- Tutorial 11
Assignments
Quizzes and Exams
Solutions
- Solution 1
- Solution 2
- Solution 3
- Solution 4
- Solution 5
- Solution 6
- Solution 7
- Solution 8
- Solution 9
- Solution 10
- Solution 11
Assessment Scheme
- Tutorial attendance & good efforts: 10%
- Midterm Exam: 12.5%
- Homework/Project: 12.5%
- Final Exam: 65%
Back-up Plan: If face-to-face teaching and assessment are not possible due to the pandemic, the assessment will be changed to: tutorials and homework 30%; midterm 35%; project 35%.
Useful Links
- Introduction to Machine Learning
- Foundation of Data Science
- A Comprehensive Guide to Machine Learning
- Introduction to Monte Carlo
- PCA
- K-means
- K-Medoids
- Mixtures of Gaussian
- scikit-learn Machine Learning in Python
- Mixtures of Gaussian
- Hidden Markov Models
- Support Vector Machines (Andrew Ng)
- Machine Learning(Andrew Ng)
- Hidden Markov Models
- Neural Networks and Introduction to Deep Learning
- CNN (Li Feifei)
- Deep Learning (Andrew Ng)
- LSTM
- Introduction to Machine Learning
- Lasso
- Machine Learning for OR & FE (Columbia University)
- CS229: Machine Learning (Stanford)
- Mathematics for Machine Learning
Honesty in Academic Work
The Chinese University of Hong Kong places very high importance on honesty in academic work submitted by students, and adopts a policy of zero tolerance on cheating and plagiarism. Any related offence will lead to disciplinary action including termination of studies at the University. Although cases of cheating or plagiarism are rare at the University, everyone should make himself / herself familiar with the content of the following website:
http://www.cuhk.edu.hk/policy/academichonesty/ and thereby help avoid any practice that would not be acceptable.
Assessment Policy
Last updated: December 03, 2021 12:44:38