close all; clear; clc;

% Data represents housing prices. The first column
% is square footage, the second column is price (fictional).
% We want to find a function that predicts price based
% on the square footage.
X=[1100  199000
   1400  245000
   1425  319000
   1550  240000
   1600  312000
   1700  279000
   1700  310000
   1875  308000
   2350  405000
   2450  324000];

%% Step 0: Preprocess the data
%plot(X(:,1),X(:,2),'k^-')
% The first column of our data is on a completely
% different scale than the second column, and it is
% good practice to keep the scales similar. In this
% case, we'll standardize each column so that its
% mean is 0 and its standard deviation is 1.
mx=mean(X,1);
sx=std(X,1);     % Keep these around in case you need to rescale new data.
Xs=(X-mx)./sx;
figure(1)
plot(Xs(:,1),Xs(:,2),'k^-')

% We'll break up the data for Step 2: x will be our
% inputs, and t will be our target values (or desired
% y-values).
x=Xs(:,1);
t=Xs(:,2);       % Targets

%% Step 1: Randomly initialize the training parameters and other constants
m=randn; b=randn;   % Slope and intercept for our unknown line
MaxIters=60;        % Max number of iterations before we stop
alpha=0.01;         % Step size
tol=1e-6;           % How close to zero grad f should be before we stop

Error=zeros(MaxIters,1);   % Preallocate the error history

%% Main Loop: Steps 2-4
for i=1:MaxIters
    % Step 2: Calculate the error at each point.
    y=m*x+b;                  % y is a vector (model output)
    ErrVec=t-y;               % ErrVec is a vector (targets - model output)
    Error(i)=sum(ErrVec.^2);  % The error function (what we're minimizing)

    % Step 3: Calculate the gradient.
    temp1=ErrVec.*(-x);
    temp2=ErrVec.*(-1);       % These are vectors
    Em=2*sum(temp1);
    Eb=2*sum(temp2);          % These make up the gradient.

    % Step 4: Adjust the parameters using gradient descent.
    m=m-alpha*Em;
    b=b-alpha*Eb;

    % Stop if the gradient is very close to zero.
    fprintf('Squared gradient norm is %f\n',Em^2+Eb^2);
    if (Em^2+Eb^2)<=tol
        fprintf('Solution found in %d steps\n',i)
        break
    end
end
Error=Error(1:i);   % Trim unused entries if we stopped early

%% Closing: We'll visualize our results
figure(2)
plot(Error)

figure(1)
hold on
xline=linspace(min(x),max(x));  % Use a new name so the targets t are not overwritten
z=m*xline+b;
plot(xline,z,'r-');
hold off
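
%% Example: predict the price of a new house.
% The line was fit on standardized data, so a new square footage
% must be scaled with mx and sx before applying the model, and the
% output must be rescaled back to dollars. This is a minimal sketch
% of that round trip; the 2000 sq ft input is a made-up example.
sqft=2000;
xs=(sqft-mx(1))/sx(1);   % Standardize the input using the stored mean/std
zs=m*xs+b;               % Model output, still in standardized units
price=zs*sx(2)+mx(2);    % Rescale the output back to dollars
fprintf('Predicted price for %d sq ft: $%.0f\n',sqft,price)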