CSCE 633: Machine Learning (Spring 2018)

 

 

Instructor

Dr. Zhangyang (Atlas) Wang

Email: atlaswang@tamu.edu

Office: 328C HRBB (visits by appointment only, except during office hours)

Webpage: http://www.atlaswang.com

 

TA: Ye Yuan

Email: ye.yuan@tamu.edu

Office: 320 HRBB

 

Time and Location

·       Lecture time: 12:45 - 2:00 pm, every Tuesday and Thursday

·       Lecture location: CHEN 104

·       Instructor Office Hour: 2:00 - 3:00 pm every Thursday

·       TA Office Hour: 2:00 - 3:00 pm every Wednesday

·       Class is fully seated. NO AUDITING ALLOWED.

 

Course Description

Machine learning is a sub-field of Artificial Intelligence that gives computers the ability to learn and/or act without being explicitly programmed. Topics include various supervised, unsupervised and reinforcement learning approaches (including deep learning), optimization procedures, and statistical inference.

 

Course Goal

Students will consolidate and practice their knowledge and skills through class discussion and course presentations, and will gain in-depth experience with a particular topic through a final project.

 

Evaluation Metrics

Grading will be based on three in-class quizzes (10% each), one midterm exam (20%), and one final project (50%: proposal 10% + presentation 15% + code review 10% + report 15%). There will be no final exam.

 

Final project: Collaboration and teamwork are encouraged, but must be coordinated and approved by the instructor. A team may have at most 2 members. Extra credit (beyond the project's 50% weight) will be given to:

-       The one project that receives the Best Project Award, voted on by all class members. (+5%)

-       Projects on interdisciplinary topics and novel application domains. (+2%)

The project proposal, report, and code should all be submitted via email. Late submissions incur a 10% penalty per additional late day.

 

The grading policy is as follows:

90-100: A
80-89: B
70-79: C
60-69: D
<60: F
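Since the weighting above is simple arithmetic, here is a minimal sketch of how a final score and letter grade would be computed under that scheme (the component names and scores are hypothetical illustrations, not an official calculator):

```python
# Illustrative sketch of the course weighting described above.
# Component names and example scores are hypothetical.

WEIGHTS = {
    "quiz1": 0.10, "quiz2": 0.10, "quiz3": 0.10,
    "midterm": 0.20,
    "proposal": 0.10, "presentation": 0.15,
    "code_review": 0.10, "report": 0.15,
}  # weights sum to 1.00

def final_score(scores):
    """Weighted total on a 0-100 scale; each component score is 0-100."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

def letter(score):
    """Map a 0-100 score to the letter-grade scale above."""
    if score >= 90: return "A"
    if score >= 80: return "B"
    if score >= 70: return "C"
    if score >= 60: return "D"
    return "F"

example = {k: 85 for k in WEIGHTS}  # uniform 85s -> weighted total of 85
total = final_score(example)
print(f"{total:.1f} -> {letter(total)}")
```

Note that the project alone (50%) outweighs all quizzes and the midterm combined.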

 

Project

It's important that you work on a real machine learning project, or a real problem in a relevant domain, so that you gain first-hand experience of how computational models meet the high complexity and uncertainty of the real world.

 

You're encouraged to develop your own project ideas, or you can follow a suggested topic (details will be given in the course slides). The instructor is available to discuss and help shape projects. The project should be scoped to span the full semester.

 

By the end of the semester, you should submit your code and data for the project, write a project report of at most 8 pages (plus additional pages containing only references) using the standard CVPR paper template, and prepare a class presentation. The instructor will be happy to help develop promising project ideas into a formal publication during or after the semester, if you wish.

 

Prerequisite

·       Students should have taken the following courses or their equivalents: Data Structures and Algorithms (CSCE 221), Linear Algebra (MATH 304 or MATH 323), and Numerical Methods (MATH 417).

·       Coding experience with MATLAB, C/C++, or Python is assumed.

·       Previous knowledge of machine learning, computer vision, signal processing, or data mining will be helpful, but is not necessary.

 

Reading Materials

This course does not follow any textbook closely. Among many recommended readings are:

1.     Introduction to Machine Learning, Ethem Alpaydin (2014), MIT Press. [Book home page (3rd edition)] [Book home page (2nd edition)] [Book home page (1st edition)]

2.     Pattern Recognition and Machine Learning, Christopher M. Bishop (2006). [A Bayesian view]

3.     The Elements of Statistical Learning, Trevor Hastie, Robert Tibshirani, and Jerome H. Friedman (2001), Springer. [Warning: not so elementary, but quite insightful]

4.     Sparse Coding and its Applications in Computer Vision, Wang et al. (2015), World Scientific.

5.     Convex Optimization, Stephen Boyd and Lieven Vandenberghe (2004), Cambridge University Press. [Their CVX toolbox is a great Matlab-based convex optimization tool for beginners]

6.     Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, Stephen Boyd et al. (2011). [Dedicated reference for ADMM]

7.     Linear Algebra and its Applications, Gilbert Strang (1988). [For those who want to simply keep a concise reference for linear algebra, my best recommendation is The Matrix Cookbook]

8.     Deep Learning, Ian Goodfellow, Yoshua Bengio and Aaron Courville (2016), MIT Press.

 

Lecture Notes (in PDF format) will be uploaded to the course webpage no more than 24 hours AFTER each class.

 

Attendance and Make-up Policies

Every student should attend class unless they have an accepted excuse. Please check Student Rule 7 (http://student-rules.tamu.edu/rule07) for details.

 

Academic Integrity

Aggie Code of Honor: "An Aggie does not lie, cheat or steal, or tolerate those who do." See the Honor Council Rules and Procedures.

 

Americans with Disabilities Act (ADA) Statement

The Americans with Disabilities Act (ADA) is a federal anti-discrimination statute that provides comprehensive civil rights protection for persons with disabilities. Among other things, this legislation requires that all students with disabilities be guaranteed a learning environment that provides for reasonable accommodation of their disabilities. If you believe you have a disability requiring an accommodation, please contact Disability Services, currently located in the Disability Services building at the Student Services at White Creek complex on west campus or call 979-845-1637. For additional information, visit http://disability.tamu.edu.

 

Schedule*

(Further minor changes may occur due to changes in the instructor's schedule, and will be announced separately via email.)

 

Week 1
01/16  Introduction [Link]
01/18  Linear Algebra and Matrix Analysis [Link]

Week 2
01/23  No Class (scheduled travel)
01/25  Vector Space and Optimization

Week 3
01/30  Statistical Learning Theory
02/01  Dimensionality Reduction and Regression

Week 4
02/06  No Class (scheduled travel)
02/08  Classification and Clustering

Week 5  (Due by Week 5 Sunday: Submitting Project Proposal)
02/13  Support Vector Machine and Kernel Machine (i)
02/15  No Class (scheduled travel)

Week 6
02/20  Support Vector Machine and Kernel Machine (ii)
02/22  Sparse Learning (i)

Week 7
02/27  Midterm Exam
03/01  Sparse Learning (ii)

Week 8
03/06  Low-Dimensionality in High-Dimensional Spaces
03/08  Decision Tree, Random Forests and Ensemble

Week 9
03/13  No Class (Spring Break)
03/15  No Class (Spring Break)

Week 10
03/20  Multi-Task Learning and Transfer Learning
03/22  Deep Learning (i): History and Basics

Week 11
03/27  Guest Lecture: Transform Learning (Bihan Wen, UIUC)
03/29  Deep Learning (ii): Regularization Techniques

Week 12
04/03  Deep Learning (iii): Implementation Issues
04/05  Deep Learning (iv): New Trends and Tricks

Week 13
04/10  Deep Learning (v): Applications
04/12  Deep Learning (vi): Deep Reinforcement Learning

Week 14
04/17  Final Project Presentations (i)
04/19  Final Project Presentations (ii)

Week 15
04/24  Final Project Presentations (iii)
04/26  Final Project Presentations (iv)

Week 16
05/01  Final Project Presentations (v)