clear
echo on
%This script illustrates some elementary properties
%of the normal or Gaussian distribution. The normal
%distribution is defined by the -probability density-:
%the probability of a measurement falling in the range
%[x,x+dx] divided by the width of the interval. It
%is equal to the derivative of the cumulative
%probability. For a Gaussian distribution, the density
%has the form:
%
%p(x) = 1/((2*pi)^.5*sigma)*exp(-(x-mu)^2/(2*sigma^2))
%
%which is characterized by the population mean mu
%and the population standard deviation sigma. Let's plot
%this distribution:
pause
mu=0;
sigma=1;
x=[-3:.01:3];
p = 1/((2*pi)^.5*sigma)*exp(-(x-mu).^2/(2*sigma^2));
figure(1)
plot(x,p)
set(gca,'FontSize',18)
xlabel('x')
ylabel('probability density')
grid on
pause
%As you can see, the probability of a given measurement
%occurring falls off rapidly as we move away from the mean.
%We can look at the cumulative probability, which is either
%obtained by integrating p(x) or (more conveniently) by using
%the function erf, defined as the integral of
%2/sqrt(pi)*exp(-t^2) from zero to x. Thus:
P = 0.5 + erf((x-mu)/sigma/sqrt(2))/2;
pause
figure(1)
plot(x,P)
set(gca,'FontSize',18)
xlabel('x')
ylabel('cumulative probability')
grid on
%From this plot you can see that there is about a 32% probability
%of a measurement lying more than 1 sigma from the mean (above or
%below). That drops to about 5% for 2 sigma, and so on.
pause
%OK, now how do we calculate the mean and the variance of a
%distribution? First let's generate a sample using the
%randn command (this draws from a standard normal distribution):
n=400;
z=randn(n,1);
%Now let's plot up the cumulative distribution:
zs=sort(z);
cumprob=[1:n]/n;
pause
figure(1)
plot(zs,cumprob)
set(gca,'FontSize',18)
xlabel('x')
ylabel('cumulative probability')
%You can see that this looks just like the cumulative probability
%we had before, except it is a lot noisier. Let's compare the
%two:
pause
hold on
plot(x,P,'r')
hold off
legend('experimental distribution','theoretical Gaussian')
%which demonstrates the equivalence.
%We can calculate the mean of the sample (using the variable name
%zbar rather than mean, so we don't shadow the built-in mean function):
zbar=sum(z)/n
%and the variance:
variance = sum((z-zbar).^2)/(n-1)
%Note that the variance estimate is much less accurate than the mean.
%This is because we are trying to estimate a higher-order moment,
%and that in general takes a lot more data!
echo off
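
%As an optional follow-up (a minimal sketch, not part of the original
%demonstration), we can make the last point quantitative by repeating
%the experiment many times for several sample sizes and looking at the
%scatter of the mean and variance estimates. The trial count and the
%sample sizes below are arbitrary illustrative choices, and this block
%re-uses the variable names n and z from above.
ntrials = 1000;
for n = [100 400 1600]
    z = randn(n,ntrials);   %ntrials independent samples of size n (one per column)
    m = mean(z);            %sample mean of each column (1 x ntrials)
    v = var(z);             %unbiased sample variance of each column
    fprintf('n = %4d: scatter of mean estimate = %.3f, scatter of variance estimate = %.3f\n',...
        n, std(m), std(v))
end
%For a unit-sigma Gaussian the scatter of the variance estimate is
%roughly sqrt(2) times that of the mean estimate, so pinning the
%variance down to a given accuracy takes about twice as much data.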