CSCE 421: Machine Learning (Spring 2020)
Instructor
Dr. Zhangyang (Atlas) Wang
Email: atlaswang@tamu.edu
Office: 328C HRBB (visits by appointment only, except during office hours)
Webpage: http://www.atlaswang.com
TA: Zhenyu “William” Wu
Email: wuzhenyu_sjtu@tamu.edu
Office: 407 HRBB
Webpage: https://wuzhenyusjtu.github.io/
Time and Location
· Lecture time: 5:45 – 7:00 pm, every Monday and Wednesday
· Lecture location: HRBB 124
· Instructor Office Hour: 2:00 – 3:00 pm every Tuesday
· TA Office Hour: 9:00 – 10:00 am every Thursday
· The class is fully seated. NO AUDITING ALLOWED.
Course Description
Machine learning is a sub-field of Artificial Intelligence that gives computers the ability to learn and/or act without being explicitly programmed. Topics include various supervised, unsupervised, and reinforcement learning approaches (including deep learning), optimization procedures, and statistical inference.
Course Goal
Students will consolidate and practice their knowledge and skills through class discussions and exams, and will gain in-depth experience with a particular topic through a final project.
Evaluation Metrics
Grading will be based on four take-home assignments (5% each), one mid-term exam (40%), and one final project (40%: proposal 10% + presentation 10% + code review 10% + report 10%). There will be no final exam.
For the final project, collaboration and teamwork are encouraged, but must be coordinated and approved by the instructor. A team may have at most two members.
The project proposal, report, and code should all be submitted via email. For late submissions, each additional late day will incur a 10% penalty.
The grading policy is as follows:
90-100: A
80-89:  B
70-79:  C
60-69:  D
<60:    F
Project
It is important that you work on a real machine learning project, or a real problem in some relevant domain, so that you gain first-hand experience. The instructor is available to discuss and shape the project. The project should be scoped to span the full semester. This year, we will host a project competition; the scope and details will be announced in class.
By the end of the semester, you should submit your code and data for the project, write a project report of at most 8 pages (plus additional pages containing only references) using the standard CVPR paper template, and prepare a class presentation. The instructor will be happy to help develop promising project ideas into a formal publication during or after the semester, if you wish.
Prerequisites
· Students should have taken the following courses or their equivalents: Data Structures and Algorithms (CSCE 221), Linear Algebra (MATH 304 or MATH 323), and Numerical Methods (MATH 417).
· Coding experience with Python, Matlab, or C/C++ is assumed.
· Previous knowledge of machine learning, computer vision, signal processing, or data mining will be helpful, but is not required.
Reading Materials
This course does not follow any textbook closely. Among many recommended readings are:
1. Introduction to Machine Learning, Ethem Alpaydin (2014), MIT Press. [Book home page (3rd edition)] [Book home page (2nd edition)] [Book home page (1st edition)]
2. Pattern Recognition and Machine Learning, Christopher M. Bishop (2006). [A Bayesian view]
3. The Elements of Statistical Learning, Jerome H. Friedman, Robert Tibshirani, and Trevor Hastie (2001), Springer. [Warning: not so elementary, but quite insightful]
4. Sparse Coding and its Applications in Computer Vision, Wang et al. (2015), World Scientific.
5. Convex Optimization, Stephen Boyd and Lieven Vandenberghe (2004), Cambridge University Press. [Their CVX toolbox is a great Matlab-based convex optimization tool for beginners]
6. Linear Algebra and its Applications, Gilbert Strang (1988). [For those who simply want a concise reference for linear algebra, my best recommendation is The Matrix Cookbook]
7. Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville (2016), MIT Press.
8. Dive into Deep Learning, Aston Zhang, Zachary C. Lipton, Mu Li, and Alex Smola (2019). [Especially recommended]
Lecture Notes (in PDF format) will be uploaded to the course webpage no more than 24 hours AFTER each class.
Attendance and Make-up Policies
Every student is expected to attend class unless they have an accepted excuse. Please see Student Rule 7 (http://student-rules.tamu.edu/rule07) for details.
Academic Integrity
Aggie Code of Honor: "An Aggie does not lie, cheat or steal, or tolerate those who do." See: Honor Council Rules and Procedures.
Americans with Disabilities Act (ADA) Statement
The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact Disability Services, currently located in the Disability Services building at the Student Services at White Creek complex on west campus, or call 979-845-1637. For additional information, visit http://disability.tamu.edu.
Course Schedule
Week 1
  01/13: 1. Introduction [Link]
  01/15: 2. Linear Algebra Review [Link]
Week 2
  01/20: No class (Martin Luther King, Jr. Day)
  01/22: 3. Optimization Review (i)
Week 3 (Due by Week 3 Sunday: Register Project Teams)
  01/27: 4. Optimization Review (ii) [Link]
  01/29: 5. Probability Review (i)
Week 4
  02/03: 6. Probability Review (ii) [Link]
  02/05: 7. Linear Classification [Link]
Week 5
  02/10: 8. More Classifiers, and Clustering (i)
  02/12: 9. More Classifiers, and Clustering (ii)
Week 6
  02/17: 10. More Classifiers, and Clustering (iii) [Link]
  02/19: 11. Support Vector Machines [Link]
Week 7
  02/24: 12. Generalization and Overfitting [Link]
  02/26: 13. Dimensionality Reduction (i)
Week 8 (Due by Week 8 Sunday: Submit Project Proposal)
  03/02: 14. Dimensionality Reduction (ii) [Link]
  03/04: 15. Sparsity [Link]
Week 9
  03/09: No class (spring break)
  03/11: No class (spring break)
Week 10
  03/16: 16. Deep Learning (i): Basics [Links: Entire DL Slides]
  03/18: Midterm Exam
Week 11
  03/23: 17. Deep Learning (ii): Representative Models
  03/25: 18. Deep Learning (ii): Representative Models (cont’d)
Week 12
  03/30: 19. Deep Learning (ii): Representative Models (cont’d)
  04/01: 20. Deep Learning (iii): Optimization and Implementation
Week 13
  04/06: 21. Deep Learning (iii): Optimization and Implementation (cont’d)
  04/08: 22. Deep Learning (iv): Generative Models
Week 14
  04/13: 23. Deep Learning (v): Transfer Learning
  04/15: 24. Deep Learning (vi): Efficiency, Automation, and More
Week 15
  04/20: 25. Deep Learning (vi): Efficiency, Automation, and More (cont’d)
  04/22: Final Project Presentations (i)
Week 16
  04/27: Final Project Presentations (ii)