CS 535: Machine Learning I (Fall 2025)


Course Information


Instructor: Hao Wang
Email: hogue.wang@rutgers.edu
Office: CBIM 008

TAs: Aneesha Fatima and Kunal Pasad
TA Emails: af1302@scarletmail.rutgers.edu and kp1462@scarletmail.rutgers.edu
TA Office: CoRE

Time: Tuesday, 12:10 pm-3:10 pm
Location: PH-115

Office Hours: Wednesday, 3:00-4:00 pm or by appointment
Please use this Zoom link.

TA Office Hours: Friday, 3:00-4:00 pm or by appointment
Please use this Zoom link.


Announcements

  • TBD.


Course Description

    In this course, we will cover the following topics in machine learning:

  • Fundamentals:
      • Probability & random variables. Univariate & multivariate.
      • Maximum likelihood estimation. Risk & empirical risk minimization. (A short worked example follows this list.)
      • Decision theory & information theory for machine learning.
      • Linear algebra for machine learning. Vector spaces. Matrix decomposition. Matrix calculus.
      • Optimization. First- & second-order methods. Stochastic gradient descent. Constrained optimization.
  • Linear discriminant analysis.
  • Logistic regression.
  • Linear regression.
  • Generalized linear models.
  • Exemplar-based methods. k-nearest neighbors. Learning distance metrics. Kernel density estimation.
  • Kernel methods. Gaussian processes. Support vector machines and relevance vector machines.
  • Unsupervised learning and dimensionality reduction. Factor analysis and manifold learning.
  • Clustering. k-means. Mixture models. Spectral clustering.
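
    As a concrete taste of the maximum likelihood estimation item above (the worked example referenced in the list), here is a minimal sketch in Python. It uses NumPy, which is an assumption of this example rather than a course requirement: for i.i.d. samples from a univariate Gaussian, the MLE of the mean is the sample mean and the MLE of the variance is the (biased) sample variance.

        import numpy as np

        # Draw i.i.d. samples from a Gaussian with known parameters (for illustration only).
        rng = np.random.default_rng(0)
        data = rng.normal(loc=2.0, scale=1.5, size=10_000)

        # Maximum likelihood estimates for a univariate Gaussian:
        #   mu_hat     = sample mean
        #   sigma2_hat = (biased) sample variance
        mu_hat = data.mean()
        sigma2_hat = ((data - mu_hat) ** 2).mean()

        print(f"MLE mean: {mu_hat:.3f} (true 2.0)")
        print(f"MLE variance: {sigma2_hat:.3f} (true {1.5 ** 2:.2f})")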


Prerequisites

  • CS 520 (Introduction to AI) or CS 530 (Principles of AI).
  • Students must be familiar with Python programming. The mini-project (and most likely the course project as well) will be based on Python. You can use your own computer, the Rutgers iLab machines, or Google Colab to run the code; a quick environment check is sketched below.
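
    As a quick sanity check of whichever environment you choose (your own machine, iLab, or Colab), a short snippet like the one below can confirm that Python and a few commonly used scientific packages are available. The specific package names are only an assumption about what the mini-project might use, not a requirement:

        import sys

        # Report the interpreter version and check a few commonly used packages.
        print("Python:", sys.version.split()[0])

        for name in ("numpy", "scipy", "sklearn", "matplotlib"):
            try:
                module = __import__(name)
                print(f"{name}: {getattr(module, '__version__', 'unknown')}")
            except ImportError:
                print(f"{name}: not installed")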


Expected Work

  • 20%: Mini-Project
  • 80%: Course Project (Proposal, Presentation, and Report)

Infrastructure Requirements

    Please visit the Rutgers Student Tech Guide page for resources available to all students. If you do not have the appropriate technology for financial reasons, please email the Dean of Students at deanofstudents@echo.rutgers.edu for assistance. If you are facing other financial hardships, please visit the Office of Financial Aid at https://financialaid.rutgers.edu/.


Computing Resources

  • Rutgers iLab Servers: https://report.cs.rutgers.edu/nagiosnotes/iLab-machines.html
  • Google Colab: https://colab.research.google.com


Textbooks and Materials

    There is no required textbook for this course. However, the following books can be useful as references on relevant topics:

  • Pattern Recognition and Machine Learning (PRML), Christopher M. Bishop, Springer, 2006, ISBN: 9780387310732 (link)
  • Machine Learning: A Probabilistic Perspective, Kevin Murphy, MIT Press, 2012/2022 (link)
  • Deep Learning (DL), Goodfellow, Ian and Bengio, Yoshua and Courville, Aaron, MIT Press, 2016, ISBN: 9780262035613 (link)
  • Dive into Deep Learning (link)
  • Hands-On Machine Learning with Scikit-Learn, Keras, and Tensorflow (link)
  • Natural Language Understanding with Distributed Representation (NLP), Kyunghyun Cho, https://arxiv.org/abs/1511.07916

  • If you plan to work on the course project using deep learning, you may also find the tutorial Deep Learning with PyTorch: A 60 Minute Blitz helpful; a tiny autograd example is sketched below.
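
    As a minimal illustration of the kind of thing that tutorial covers (assuming PyTorch is installed; this snippet is only a sketch and not part of any assignment), here is a tiny example of creating a tensor and computing a gradient with autograd:

        import torch

        # A tensor that tracks gradients.
        x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

        # A simple scalar function of x: y = sum(x^2).
        y = (x ** 2).sum()

        # Backpropagation gives dy/dx = 2x.
        y.backward()
        print(x.grad)  # tensor([2., 4., 6.])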

    Tips and More Details on Final Projects

    Below are some tips and details.

    Team Formation: Students can form teams of at most four members.

    Tips for Course Project Presentation: Here are some tips that can make the project presentations smoother and more effective:

  • If the project is based on a published paper, you could introduce some background and details on the paper at the start of the presentation. To do this more effectively, you could look for demo videos on the Internet; showing a demo as part of the presentation can greatly help others understand the paper. You could also look for talks or slides on the paper available online (from YouTube or the authors’ websites) and re-use some of that content if you feel it is helpful.
  • Pause for questions. At a few points during your presentation, pause and ask whether there are any questions.


Tentative Schedule

    Note that this schedule is tentative and is meant to give you an idea of the topics this course will cover; it is subject to change as the course progresses.

    Week | Date   | Topic                                                                                  | Assignment
    --- Machine Learning Basics and Fundamentals ---
    1    | Sep 2  | Course Introduction and Machine Learning Basics                                        | Warmup Exercise Release
    2    | Sep 9  | Probability (Univariate), Probability (Multivariate), Statistics                       |
    3    | Sep 16 | Decision Theory, Information Theory, Linear Algebra & Matrix Calculus; Optimization    |
    --- Linear Models and Their Bayesian Treatments ---
    4    | Sep 23 | Linear Generative Models; Linear Discriminant Analysis                                 | Mini Project Release
    5    | Sep 30 | Linear Discriminant Analysis                                                           |
    6    | Oct 7  | Logistic Regression; Imbalanced Classification and Regression                          |
    7    | Oct 14 | Bayesian Logistic Regression; Vanilla, Ridge, and Lasso Linear Regression              |
    8    | Oct 21 | Probabilistic and Bayesian Linear Regression; Exponential Family and Conjugate Priors  |
    9    | Oct 28 | (Bi)Linear Models: Recommender Systems and Latent Factor Models                        |
    --- Kernel Methods ---
    10   | Nov 4  | Kernel Methods I                                                                       |
    11   | Nov 11 | Kernel Methods II                                                                      |
    12   | Nov 18 | Support Vector Machines (SVM)                                                          |
    13   | Nov 25 | Thanksgiving Recess                                                                    |
    --- Mini Conference ---
    14   | Dec 2  | Final Project Presentation (Mini Conference) I                                         |
    15   | Dec 9  | Final Project Presentation (Mini Conference) II                                        |