Foundations of Machine Learning
CS/Math 350, Spring 2025
This course explores the process of machine learning through the lens of empirical modeling. We will
develop the theory and algorithms that underpin the process of learning interesting things about data.
Algorithms we will develop include: singular value decomposition and eigenfaces, the n-armed
bandit, projections and linear regression, data clustering (k-means, Neural Gas, Kohonen's SOM),
linear neural networks, optimization algorithms, autoencoders and deep networks. The course will
involve some computer programming, so previous programming experience is helpful. May be elected
as Computer Science 350. Prerequisite: Mathematics 240.
There is no required text for this course. Class notes will be provided.
Daily Links
- Week 1:
- Wed: Intro to Machine Learning, some stats
- Fri: Working through the notes from Wed.
- Week 2:
- Mon: Finish up the stats, start linear regression.
- Notes for today on linear regression.
- Today, we finished up matrix-vector operations and talked about variance and correlation. Wed will be linear regression.
- Wed: Continue with the notes passed out last time, finish linear regression. Homework (linked from Friday) due on Canvas by midnight.
- Fri: New notes and homework below. Today we finish up the notes from Monday, then start in
on the new notes from today. The homework below will be due on Wednesday (there may be some more on Monday).
- Week 3 (Feb 3-7)
- Mon: Last Friday, we ended up talking about preprocessing data. Today we'll actually start
the chapter on linear algebra.
- Linear Algebra notes (Part 1)
- HOMEWORK due Wed, 11:59PM on Canvas: Problems 1, 2, 5 on the HW sheet from last Friday, plus problems 1 and 2
from today's handout (listed as pg 23). Some of these questions require computations; you may use a calculator
or Matlab/Octave to help with those, but be sure to write down what you are computing
in your solution (don't just turn in a page of numbers).
- Wed: Continuing with the linear algebra
- Fri: Continuing with linear algebra
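As a small illustration of the data preprocessing mentioned Monday, here is a hedged NumPy sketch of mean-centering columns and scaling them to unit standard deviation (z-scores); the data matrix is made up for illustration:

```python
import numpy as np

# Toy data matrix: rows are observations, columns are features
X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0],
              [4.0, 40.0]])

# Subtract each column's mean, then divide by its standard deviation
mu = X.mean(axis=0)
sigma = X.std(axis=0)
Z = (X - mu) / sigma

print(Z.mean(axis=0))  # each column now has mean ~0
print(Z.std(axis=0))   # each column now has std 1
```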
- Week 4 (Feb 10-14)
- Mon: Today we'll finish the SVD (from Friday's notes).
- Homework for Wed (upload to Canvas): pg 47, #1,2 and pg. 49 #1, 3, 5. (Page numbers refer to last Friday, Week 3 notes).
- Wed: Last time, we discussed the notes through page 49 (linked last Fri). Today we continue.
Topics today:
- Relationship between the SVD and the 4 fundamental subspaces.
- The Reduced SVD
- Programming notes with the SVD
- The Moore-Penrose Pseudoinverse
- Fri: Finish the notes from Wed, start the "best basis".
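Wednesday's topics (the reduced SVD and the Moore-Penrose pseudoinverse) can be sketched in a few lines of NumPy; the matrix below is illustrative, not from the notes:

```python
import numpy as np

# A tall matrix (more rows than columns), full column rank
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Full SVD: A = U @ diag(s) @ Vt, with U 3x3 and Vt 2x2
U, s, Vt = np.linalg.svd(A)

# Reduced SVD: keep only as many columns of U as there are singular values
r = len(s)
U_r = U[:, :r]
A_rebuilt = U_r @ np.diag(s) @ Vt
assert np.allclose(A_rebuilt, A)

# Moore-Penrose pseudoinverse from the SVD: A+ = V @ diag(1/s) @ U_r^T
A_pinv = Vt.T @ np.diag(1.0 / s) @ U_r.T
assert np.allclose(A_pinv, np.linalg.pinv(A))
```

The reduced form is what you usually compute in practice; NumPy's `np.linalg.svd(A, full_matrices=False)` returns it directly.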
- Week 5 (Feb 17-21)
- Week 6 (Feb 24-28)
- Mon: Review for exam.
- Wed: In-class portion of Exam 1.
- Fri: Linear Neural Nets
- Week 7 (Mar 3-7)
- Mon: Friday, we finished up SVD and factor analysis. The notes for today will
be the linear neural networks notes from Friday. Homework:
- Wed: Continuing from Monday
- Fri: K-nearest neighbor classification
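Friday's topic, k-nearest neighbor classification, fits in a short NumPy sketch; the training points and labels below are toy data for illustration:

```python
import numpy as np

# Toy labeled training data: two classes in 2-D
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                    [5.0, 5.0], [5.1, 4.9], [4.9, 5.2]])
y_train = np.array([0, 0, 0, 1, 1, 1])

def knn_predict(x, X_train, y_train, k=3):
    # Distance from the query point to every training point
    d = np.linalg.norm(X_train - x, axis=1)
    # Labels of the k nearest neighbors
    nearest = y_train[np.argsort(d)[:k]]
    # Majority vote among those labels
    return np.bincount(nearest).argmax()

print(knn_predict(np.array([0.3, 0.3]), X_train, y_train))  # → 0
print(knn_predict(np.array([4.8, 5.1]), X_train, y_train))  # → 1
```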
- Week 8 (Mar 10-14)
- Mon: Finish up k-nearest neighbor classification, start data clustering.
- Wed: Continue with data clustering.
- Fri
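For the data clustering discussed this week, here is a minimal NumPy sketch of Lloyd's k-means algorithm; the toy data, seed, and iteration count are illustrative choices, not from the course notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated toy clusters in 2-D
X = np.vstack([rng.normal(0.0, 0.3, size=(20, 2)),
               rng.normal(5.0, 0.3, size=(20, 2))])

def kmeans(X, k, iters=20):
    # Initialize centers as k distinct random data points
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points
        # (keep a center in place if it has no points)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

centers, labels = kmeans(X, k=2)
```

On data this well separated the two centers land near the true cluster means; k-means is sensitive to initialization on harder data, which is one motivation for the Neural Gas and SOM variants on the syllabus.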
Exam Links
- Exam 1 Links:
- Review Links:
- Take-Home Exam Links