CSCE 633: Machine Learning (Spring 2019)
Instructor
Dr. Zhangyang (Atlas) Wang
Email: atlaswang@tamu.edu
Office: 328C HRBB (visit by appointment only, except during office hours)
Webpage: http://www.atlaswang.com
TA: Xiaohan Chen
Email: chernxh@tamu.edu
Office: 320 HRBB
Webpage: http://people.tamu.edu/~chernxh/
Time and Location
· Lecture time: 5:30 – 6:45 pm, every Tuesday and Thursday
· Lecture location: ZACH 244
· Instructor Office Hour: 1:00 – 2:00 pm, every Tuesday
· TA Office Hour: 4:30 – 5:30 pm, every Thursday
· The class is fully seated. NO AUDITING ALLOWED.
Course Description
Machine learning is a sub-field of Artificial Intelligence that gives computers the ability to learn and/or act without being explicitly programmed. Topics include various supervised, unsupervised, and reinforcement learning approaches (including deep learning), optimization procedures, and statistical inference.
Course Goal
Students will consolidate and practice their knowledge and skills through class discussion and exams, and gain in-depth experience with a particular topic through a final project.
Evaluation Metrics
Grading will be based on four take-home assignments (5% each), one mid-term exam (40%), and one final project (40%: proposal 10% + presentation 10% + code review 10% + report 10%). There will be no final exam.
Final Project
Collaboration and teamwork are encouraged, but must be coordinated and approved by the instructor. A team may have no more than two members.
The project proposal, report, and code should all be submitted via email. For late submissions, each additional late day incurs a 10% penalty.
The grading policy is as follows:
90-100: A
80-89: B
70-79: C
60-69: D
<60: F
Project
It is important that you work on a real machine learning project, or a real problem in some relevant domain, so that you gain first-hand experience. The instructor is available to discuss and shape the project. The project should be scoped to span the full semester. This year, we will host a project competition; the scope and details will be announced in class.
By the end of the semester, you should submit your code and data for this project, write a project report of at most 8 pages (plus additional pages containing only references) using the standard CVPR paper template, and prepare a class presentation. The instructor will be happy to help develop promising project ideas into a formal publication during or after the semester, if you wish.
Prerequisites
· Students should have taken the following courses or their equivalents: Data Structures and Algorithms (CSCE 221), Linear Algebra (MATH 304 or MATH 323), Numerical Methods (MATH 417), and (preferably) Artificial Intelligence.
· Coding experience with Python, Matlab, or C/C++ is assumed.
· Previous knowledge of machine learning, computer vision, signal processing, or data mining will be helpful, but is not necessary.
Reading Materials
This course does not follow any textbook closely. Among many recommended readings are:
1. Introduction to Machine Learning, Ethem Alpaydin (2014), MIT Press. [Book home page (3rd edition)] [Book home page (2nd edition)] [Book home page (1st edition)]
2. Pattern Recognition and Machine Learning, Christopher M. Bishop (2006). [A Bayesian view]
3. The Elements of Statistical Learning, Jerome H. Friedman, Robert Tibshirani, and Trevor Hastie (2001), Springer. [Warning: not so elementary, but quite insightful]
4. Sparse Coding and its Applications in Computer Vision, Wang et al. (2015), World Scientific.
5. Convex Optimization, Stephen Boyd and Lieven Vandenberghe (2004), Cambridge University Press. [Their CVX toolbox is a great Matlab-based convex optimization tool for beginners]
6. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, Stephen Boyd et al. (2011). [Dedicated reference for ADMM]
7. Linear Algebra and its Applications, Gilbert Strang (1988). [For those who simply want to keep a concise reference for linear algebra, my best recommendation is The Matrix Cookbook]
8. Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville (2016), MIT Press.
Lecture notes (in PDF format) will be uploaded to the course webpage no more than 24 hours AFTER each class.
Attendance and Make-up Policies
Every student should attend class unless they have an accepted excuse. Please see Student Rule 7 (http://student-rules.tamu.edu/rule07) for details.
Academic Integrity
Aggie Code of Honor: An Aggie does not lie, cheat or steal, or tolerate those who do. See: Honor Council Rules and Procedures.
Americans with Disabilities Act (ADA) Statement
The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact Disability Services, currently located in the Disability Services building at the Student Services at White Creek complex on west campus, or call 979-845-1637. For additional information, visit http://disability.tamu.edu.
Course Schedule
Week 1
01/15 | 1. Introduction (by TA; instructor travel) [Link]
01/17 | No Class (instructor travel)
Week 2
01/22 | 2. Basic ML Theory and Concepts (i)
01/24 | 3. Basic ML Theory and Concepts (ii) [Link]
Week 3 (Due by Week 3 Sunday: Register Project Teams)
01/29 | No Class (instructor travel)
01/31 | No Class (instructor travel)
Week 4
02/05 | 4. Dimensionality Reduction and Regression (i)
02/07 | 5. Dimensionality Reduction and Regression (ii) [Link]
Week 5
02/12 | 6. Linear Classifier (i)
02/14 | 7. Linear Classifier (ii) [Link]
Week 6
02/19 | 8. SVM Classifier and Kernel Methods (i)
02/21 | 9. SVM Classifier and Kernel Methods (ii) [Link]
Week 7
02/26 | 10. Other Popular Classifiers, and Clustering (i)
02/28 | 11. Other Popular Classifiers, and Clustering (ii) [Link]
Week 8 (Due by Week 8 Sunday: Submit Project Proposal)
03/05 | 12. Regularization: Sparsity (i)
03/07 | Midterm Exam
Week 9
03/12 | No Class (spring break)
03/14 | No Class (spring break)
Week 10
03/19 | 13. Regularization: Sparsity (ii) [Link]
03/21 | 14. Regularization: General Forms of Low Dimensionality [Link]
Week 11
03/26 | 15. Multi-Task Learning and Transfer Learning [Link]
03/28 | 16. Deep Learning (i): Basic Components [Link]
Week 12
04/02 | 17. Deep Learning (ii): Representative Models (1)
04/04 | 18. Deep Learning (ii): Representative Models (2) [Link]
Week 13
04/09 | 19. Deep Learning (iii): Optimization and Implementation (1)
04/11 | 20. Deep Learning (iii): Optimization and Implementation (2) [Link]
Week 14
04/16 | 21. Deep Learning (iv): Applications in Computer Vision (1) [Link]
04/18 | 22. Deep Learning (iv): Applications in Computer Vision (2) [Link]
Week 15
04/23 | Final Project Presentations (i)
04/25 | Final Project Presentations (ii)