Mathematical Modeling
Math 350, Fall 2014

This course explores the process of building, analyzing, and interpreting mathematical descriptions of physical processes. Topics may include feature extraction, partial differential equations, neural networks, and statistical models. The course will involve some computer programming, so previous programming experience is helpful but not required.

There is no required text for this course. Class notes will be provided.

Beginning Material

Text Handouts

Daily Schedule

  • Week 3
    1. Mon, Sep 15
    2. Tue, Sep 16: Genetic Algorithms.
    3. Thu, Sep 18: Finish Genetic Algorithms. Start Optimization if time.
  • Week 4
    1. Monday, Sep 22 (today we finished GAs).
    2. Homework: Solve the Knapsack problem. Here is the fitness function, called testfunction2.m:
      testfunction2.m (Fitness function for the Knapsack problem).
      SOLUTION to Knapsack problem (ExerciseKnap.m)
    3. Tuesday, Sep 23: Start Optimization.
    4. Thursday, Sep 25: Finished the previous notes.
      HOMEWORK DUE ON MONDAY: The knapsack problem, in Matlab.
      HOMEWORK NOT DUE YET: Work on implementing Bisection and Newton in Matlab
      EXAM NOTE: Exam 1 is scheduled for next week; by unanimous proclamation, we moved it to THURSDAY, OCT 2.
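The posted fitness function and solution (testfunction2.m, ExerciseKnap.m) are Matlab files and are not reproduced here. As a rough sketch of the same idea, here is a small Python version of a genetic algorithm for a knapsack instance. The weights, values, capacity, and GA parameters below are made up for illustration; they are not the actual homework data.

```python
import random

# Hypothetical knapsack instance; the actual homework data may differ.
WEIGHTS = [12, 7, 11, 8, 9]
VALUES = [24, 13, 23, 15, 16]
CAPACITY = 26

def fitness(bits):
    """Total value of the selected items; overweight selections score 0."""
    weight = sum(w for w, b in zip(WEIGHTS, bits) if b)
    value = sum(v for v, b in zip(VALUES, bits) if b)
    return value if weight <= CAPACITY else 0

def evolve(pop_size=30, generations=100, mutation_rate=0.05, seed=0):
    """Tiny GA: tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    n = len(WEIGHTS)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def tournament(parents=pop):
            a, b = rng.sample(parents, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n)                      # one-point crossover
            child = [bit ^ (rng.random() < mutation_rate)  # bit-flip mutation
                     for bit in p1[:cut] + p2[cut:]]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)  # remember the best ever seen
    return best

best = evolve()
```

Scoring overweight selections as 0 is the simplest way to handle the capacity constraint; a gentler penalty proportional to the overweight amount is another common choice.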
  • Week 5
    1. Monday, Sep 29: Finish up the optimization.
      Optimization notes, updated to include Gradient Descent
      Solutions to Optimization Exercises
      Matlab file: bisect.m
      Matlab file: MultiNewton.m
      Matlab file: Ex4.m (Solution to Exercise 4)
    2. Tuesday, Sep 30: Review/Catch up day.
      Exam 1 Review Sheet
      Exam 1 Review Solutions
    3. Thursday, Oct 2: EXAM 1.
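The posted routines (bisect.m, MultiNewton.m) and the gradient descent material are Matlab; here is an illustrative Python sketch of the three one-variable methods covered in the optimization notes. The function names, tolerances, and stopping rules are my own choices, not the course code.

```python
def bisect(f, a, b, tol=1e-8, max_iter=100):
    """Bisection for a root of f on [a, b]; assumes f(a), f(b) have opposite signs."""
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = 0.5 * (a + b)
        fm = f(m)
        if fm == 0 or 0.5 * (b - a) < tol:
            return m
        if fa * fm < 0:          # root lies in [a, m]
            b = m
        else:                    # root lies in [m, b]
            a, fa = m, fm
    return 0.5 * (a + b)

def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)
    return x

def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=10000):
    """Fixed-step gradient descent for a function of one variable."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:         # stop when the slope is (nearly) zero
            return x
        x = x - alpha * g
    return x
```

Bisection halves the bracket every step (guaranteed but slow); Newton converges much faster near a root but needs the derivative and a good starting point.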
  • Week 6
    1. Mon, Oct 6
    2. Start on Data Clustering (Updated Oct 8). Homework: 1, 2, 3, 4, 6. Turn in next Monday: 2, 6.
    3. Tue, Oct 7: K-Means Clustering
      1. Matlab Function: edm.m (This is the function that computes the Euclidean Distance Matrix between all the data and the cluster centers).
      2. Matlab Function: kmeansUpdate.m (This performs one step of the K-means algorithm).
      3. Matlab script: ClusterExample01.m (Gives an example of how to use the k-means code).
      4. Matlab script: ClusterExample02.m (Same as the previous script, except it also shows the Voronoi cells).
      5. HOMEWORK: Download the m-files and try to publish ClusterExample01.m.
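The two Matlab functions above (edm.m, kmeansUpdate.m) are not reproduced here, but the structure they describe (a Euclidean distance matrix feeding one k-means update step) can be sketched in Python as follows. This is an illustrative stand-in, not the course code.

```python
import numpy as np

def edm(X, C):
    """Euclidean distance matrix: entry (i, j) is ||X[i] - C[j]||."""
    diff = X[:, None, :] - C[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2))

def kmeans_update(X, C):
    """One k-means step: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    D = edm(X, C)
    labels = D.argmin(axis=1)
    newC = C.copy()
    for j in range(C.shape[0]):
        pts = X[labels == j]
        if len(pts) > 0:          # leave empty clusters where they are
            newC[j] = pts.mean(axis=0)
    return newC, labels
```

Iterating `kmeans_update` until the centers stop moving is the full k-means algorithm; the result can depend on the initial centers, which is exactly the question in the HW6scriptA.m assignment below.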
    4. Thu, Oct 9: Neural Gas
      1. Matlab Function: paramUpdate.m
      2. Matlab Function: initng.m
      3. Matlab Function: NeuralGas01.m
      4. Matlab Data: SixDatasets.mat
      5. Matlab Data: obstacle1.mat
      6. For fun: Video showing "Growing Neural Gas"
      7. Training Examples on Concentric Circles:
        1. TestNG1: Good parameter selection.
        2. TestNG2: Bad in two respects: the value of lambda is too big, as is the learning rate, which shoves all the centers toward the center of mass. Second, the age cutoff is too old, so we're retaining "bad" edges.
        3. TestNG3: Similar to TestNG2; the cutoff age is still too old.
        4. TestNG4: The cutoff age is too "young": we're not retaining connections long enough, so they're deleted too early.
        5. TestNG5: The learning rate is too small. Notice that the centers barely move from where they were initialized.
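The three parameters discussed in the TestNG examples (the neighborhood range lambda, the learning rate, and the edge age cutoff) all act within a single Neural Gas update step. Here is a hedged Python sketch of one step on one sample; the course's paramUpdate.m/NeuralGas01.m files are Matlab and may differ in details.

```python
import numpy as np

def neural_gas_step(x, centers, edges, ages, lam, lr, max_age):
    """One Neural Gas update for a single sample x.

    lam     : neighborhood range (too big pulls every center toward x)
    lr      : learning rate (too big collapses centers, too small freezes them)
    max_age : edge age cutoff (too old keeps bad edges, too young drops good ones)
    """
    edges = edges.copy()
    ages = ages.copy() + 1                 # every edge grows older
    d = np.linalg.norm(centers - x, axis=1)
    order = np.argsort(d)                  # centers sorted by distance to x
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(centers))
    # Move every center toward x, weighted by exp(-rank / lam).
    centers = centers + lr * np.exp(-ranks / lam)[:, None] * (x - centers)
    # Refresh the edge between the two nearest centers.
    i, j = order[0], order[1]
    edges[i, j] = edges[j, i] = 1
    ages[i, j] = ages[j, i] = 0
    # Delete edges older than the cutoff.
    edges[ages > max_age] = 0
    return centers, edges, ages
```

The exponential weight is what makes lambda matter: with a huge lambda every center gets nearly the full learning rate, which is the collapse-toward-the-center-of-mass behavior seen in TestNG2.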
  • Week 7
    1. Thu, Oct 16: Download HW6scriptA.m and tell me what each line of code does. Additionally, comment on what you see: does k-means depend on the initial centers? (Also, be sure to continue working on the Neural Gas script from last week and the project on the obstacle course.) Some extra scripts (added Mon, Oct 20):
  • Week 8
    1. Mon, Oct 20: Finish Neural Gas, start Linear Networks
      Addition to Course Notes: Linear Networks
    2. Tue, Oct 21: Some stats and linear regression.
      Addition to Course Notes (Chapter 3): Statistics and Linear Regression
      Homework Assigned Week 8 (some Neural Gas, some about the Line of Best Fit). Due on Monday, Oct 27. You can upload the Matlab files to CLEO; please make an "Oct27" folder!
    3. Thu, Oct 23: Worked on Linear Neural Networks up through defining the Widrow-Hoff learning algorithm. No new homework assigned.
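The Widrow-Hoff (LMS) rule adjusts the weights after every sample in proportion to the error: w ← w + η(t − y)x, where y = w·x + b is the neuron's output and t is the target. A minimal Python sketch for a single linear neuron follows; the step size, epoch count, and variable names are illustrative choices, not taken from the course notes.

```python
import numpy as np

def widrow_hoff(X, t, eta=0.05, epochs=500, seed=0):
    """Train a single linear neuron y = w.x + b with the Widrow-Hoff (LMS)
    rule: after each sample, w <- w + eta * (target - y) * x."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, target in zip(X, t):
            err = target - (w @ x + b)   # error before the update
            w = w + eta * err * x
            b = b + eta * err            # bias gets the same rule with x = 1
    return w, b
```

On noiseless linear data the rule recovers the generating weights; on noisy data it converges (for small enough η) toward the least-squares solution, which is the connection to the regression material from Tuesday.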
  • Week 9
    1. Mon, Oct 27
      Homework: Go through Section 10.4.1: Derivation of Widrow-Hoff (5 exercises).
    2. Tue, Oct 28
      Today we finished the linear neural nets and the discussion on covariance.
    3. Thu, Oct 30: Finished the discussion of the covariance and stats, started looking at the linear algebra notes. Nothing new for the HW (See Week 9 homework from Tuesday).
    Week 9 Solutions
    Breast Cancer Classifier 1
    Breast Cancer Classifier 2
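The covariance discussion and the classifier scripts above are Matlab files. As an illustration of the two ideas, here is a Python sketch of the sample covariance matrix and a simple nearest-mean classifier of the kind one might try on the breast cancer data; the function names and data are made up, and this is not the posted classifier code.

```python
import numpy as np

def covariance_matrix(X):
    """Sample covariance of row-wise data: C = Xc.T @ Xc / (n - 1)."""
    Xc = X - X.mean(axis=0)                # center each column (variable)
    return Xc.T @ Xc / (X.shape[0] - 1)

def nearest_mean_classifier(Xtrain, ytrain):
    """Return a predictor that assigns x to the class with the closest mean."""
    means = {c: Xtrain[ytrain == c].mean(axis=0) for c in np.unique(ytrain)}
    def predict(x):
        return min(means, key=lambda c: np.linalg.norm(x - means[c]))
    return predict
```

The (n − 1) divisor is the unbiased sample covariance; centering the data first is the same mean-subtraction step used throughout the stats notes.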
  • Week 10
    1. Mon, Nov 03: Went through some linear algebra review, using the updated notes here. Homework: #1, 4, 5 on pg. 55.
    2. Tue, Nov 04: More linear algebra. Looked at projections and subspaces, high and low dimensional representations.
    3. Thu, Nov 06: Finished up the linear algebra.
      Week 10 HW Summary
      Week 10 HW Solutions
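The projection and low-dimensional representation material above can be summarized in one formula: the orthogonal projection of x onto the column space of A is p = A(AᵀA)⁻¹Aᵀx, and the coefficient vector c = (AᵀA)⁻¹Aᵀx gives the low-dimensional coordinates of x in the basis formed by A's columns. A minimal Python sketch, assuming A has full column rank (names are mine, not the notes'):

```python
import numpy as np

def project_onto_columns(A, x):
    """Orthogonal projection of x onto Col(A), for A with full column rank.
    Returns (p, c): p = A @ c is the projection, and c solves the normal
    equations (A.T A) c = A.T x, giving the low-dimensional coordinates
    of x in the basis formed by A's columns."""
    c = np.linalg.solve(A.T @ A, A.T @ x)
    return A @ c, c
```

The residual x − p is orthogonal to every column of A, which is exactly the normal-equations condition Aᵀ(x − Ac) = 0.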
  • Week 11
  • Week 12
  • Week 13
  • Week 14

    Exam Information