Math 350, Fall 2009

This course explores the process of building, analyzing, and interpreting mathematical descriptions of physical processes. Topics may include feature extraction, partial differential equations, neural networks, and statistical models. The course involves some computer programming, so previous programming experience is helpful but not required. Prerequisite: Math 300 (Linear Algebra).

There is no required text for this course. Class notes will be provided.

- Sep 01-03: Introduction, Matlab and some Statistics
- Sep 08-10: Finish up stats and Matlab (See notes above)
- Notes on the stats we've been doing (Added Sep 8)
- Homework due on Friday, Sep 11: Homework set 1 from last week (see link above). This counts as Quiz 1.

- Sep 15-17: Linear Algebra and perhaps some Linear Programming
- Linear Algebra Fundamentals handout.
- Homework (9/15): Look at problems 1-4 on the last page of the previous handout.
- Quiz 2 was passed out on Thursday; it is due next Thursday, after we've covered a bit more material.

- Sep 22-24: Linear Algebra, continued.
- Finish up projections and Matlab coding for projections. HW: Finish Quiz 2.
- Continue with Linear Algebra.

- Sep 29-Oct 1: Exam passed out this week. Due on Monday, Oct 5, at 4 PM.
- Worked through "The Best Basis" chapter. Defined the covariance matrix and worked out how to define the error for optimizing the basis vectors.
- There will be no Matlab portion on this exam (we ran short of time).
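A minimal sketch of the covariance-matrix computation behind the "Best Basis" discussion. The data, variable names, and normalization are illustrative assumptions, not taken from the class notes; the key idea is that the eigenvectors of the covariance matrix, ordered by eigenvalue, give the candidate basis vectors.

```matlab
% Sketch: covariance matrix and a "best basis" from its eigenvectors.
% X is n-by-d, one observation per row (illustrative sample data).
X = randn(100, 3);
m = mean(X, 1);                       % mean of the data
Xc = X - repmat(m, size(X,1), 1);     % mean-subtracted data
C = (Xc' * Xc) / (size(X,1) - 1);     % d-by-d covariance matrix
[V, D] = eig(C);                      % columns of V are candidate basis vectors
[vals, idx] = sort(diag(D), 'descend');
V = V(:, idx);                        % basis ordered by variance captured
```

Projecting the data onto the first few columns of `V` then gives the low-dimensional representation that minimizes the reconstruction error we set up in class.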

- Oct 6-8: On Oct 6, we assigned Exercises 1(e) (write code called Line1.m), 1(f) (See the scan below for the setup), Exercise 3 (write the Median-Median line code as Line3.m), and apply the lines to the data in Exercise 5. On Oct 8, we continued in the notes, and assigned Exercises 1-4 on p. 6. Due: Oct 15.
- (Short week!) Oct 15: Run the experiment on page 11.
- M-file for the associative learning, Hebb's Rule (this is the code from the notes, pages 9 and 10)
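For reference, here is a small sketch of the outer-product form of Hebb's rule for associative learning, in the spirit of the posted M-file. The patterns and variable names below are illustrative, not the code from pages 9-10 of the notes.

```matlab
% Hebb's rule sketch: store pattern pairs (x, y) in a weight matrix W
% by summing outer products y*x'. Patterns are stored one per column.
X = [1 -1  1; -1  1  1]';      % input patterns (illustrative)
Y = [1 -1  1];                 % desired outputs, one per column
W = zeros(size(Y,1), size(X,1));
for k = 1:size(X,2)
    W = W + Y(:,k) * X(:,k)';  % strengthen co-active connections
end
recall = sign(W * X(:,1));     % recall the output paired with pattern 1
```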

- Oct 20-22: The Widrow-Hoff learning rule. Applications to time-series analysis. Gradient descent for minimization.
- Oct 27-29
- M-file: ExampleWriteUp.m
- How to turn in your write-ups (download the previous M-file)
- A function that will plot contours (implot.m). Also available from Matlab's website.
- A script file showing how to plot contours and a trajectory of gradient descent (sampleImplot.m)
- (NOTE: The homework due today has a new due date: Tuesday, Nov 3.)
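To go with the contour-plotting scripts above, here is a sketch of gradient descent that records its trajectory so it can be overlaid on a contour plot. The function, gradient, starting point, and step size are all illustrative choices, not the class example.

```matlab
% Gradient descent sketch on a simple quadratic, recording the path
% so it can be plotted on top of contours (as in sampleImplot.m's style).
f     = @(x) x(1)^2 + 5*x(2)^2;    % example function to minimize
gradf = @(x) [2*x(1); 10*x(2)];    % its gradient
x = [4; 2];                        % starting point
alpha = 0.08;                      % fixed step size
traj = x;                          % trajectory, one point per column
for k = 1:50
    x = x - alpha * gradf(x);      % descent step
    traj = [traj, x];              % append the new point
end
% plot(traj(1,:), traj(2,:), 'o-') draws the path over the contours
```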

- Nov 3-5
- "Chapter 6" notes on the best basis (forgot to post it earlier).
- Function to produce a lagged matrix (lag.m)
- Widrow-Hoff (wid_hoff1.m, p. 16)
- Script file that implements the Example on p. 19 in detail. (scriptP19.m) (Updated Nov 5)
- Exam 2 Review Notes
- Review SOLUTIONS
- Computer Lab, Thursday
- Sunspot data (text file, sunspots.dat)
- Movie data for the last question in the lab.
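Since this week's files build a lagged matrix and then train it with Widrow-Hoff, here is a compact sketch of both steps together. All names and parameter values are illustrative guesses in the spirit of lag.m and wid_hoff1.m, not the posted code.

```matlab
% Sketch: build a lagged matrix from a time series, then fit prediction
% weights with the Widrow-Hoff (LMS) update.
t = (0:99)';
x = sin(0.3 * t);                 % example time series
p = 3;                            % number of lags
N = length(x) - p;
L = zeros(N, p);
for j = 1:p
    L(:, j) = x(p-j+1 : end-j);   % column j holds x delayed by j steps
end
d = x(p+1:end);                   % target: the next value of the series
w = zeros(p, 1);
eta = 0.05;                       % learning rate
for k = 1:N
    e = d(k) - L(k,:) * w;        % prediction error on sample k
    w = w + eta * e * L(k,:)';    % Widrow-Hoff update
end
```

Row k of `L` holds the p most recent values before time p+k, so `L*w` is a one-step-ahead prediction of the series.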

- Nov 10-12: Exam week. In-class exam on Nov 10; we spent Thursday in the lab (see Exam 2 materials below).
- Nov 17-19: Begin discussion of nonlinear regression. First up: The Radial Basis Functions.
- Dec 1-3: Lab materials for this week:
- Dec 8-10

Here are some sample files that show you how to train a set of Radial Basis Functions to do some classification:
- Data for Matlab: IrisData.mat
- M-File: IrisClass1.m This script file explores the relationship between the classification error and the width of the Gaussians (in the transfer function).
- M-File: IrisClass2.m Example of how to analyze a classifier by building a confusion matrix.
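A bare-bones sketch of the ideas in these two files: Gaussian RBF classification (with the width sigma as the parameter to experiment with) followed by a confusion matrix. The toy data, centers, and variable names are illustrative, not the Iris files.

```matlab
% RBF classification sketch with a confusion matrix.
X = [randn(20,2); randn(20,2) + 3];   % two toy classes in the plane
labels = [ones(20,1); 2*ones(20,1)];
C = X;                                 % use the data points as RBF centers
sigma = 1.0;                           % Gaussian width (vary this, as in IrisClass1.m)
n = size(X,1);
Phi = zeros(n, n);
for i = 1:n
    for j = 1:n
        Phi(i,j) = exp(-norm(X(i,:) - C(j,:))^2 / (2*sigma^2));
    end
end
T = [labels==1, labels==2];            % one-hot target matrix
W = Phi \ T;                           % least-squares output weights
[mx, pred] = max(Phi * W, [], 2);      % predicted class per sample
conf = zeros(2, 2);                    % confusion matrix (as in IrisClass2.m)
for k = 1:n
    conf(labels(k), pred(k)) = conf(labels(k), pred(k)) + 1;
end
% row = true class, column = predicted class; off-diagonal entries are errors
```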

- Exam 2 (Take-Home part)
- Items for Question 2:
- Items for Question 3: