Thursday, 25 October 2012

Least Squares Method (Least Squares Regression) & Differentiation MATLAB Code


%%%%%%%%%%%%%%% LEAST SQUARE REGRESSION %%%%%%%%%%%%%%%%%%%%%%%%%

clear all
x=[10 20 30 40 50 60 70];
y=[25 70 380 550 610 1220 830];
[m1,n1] = size(x);
[m2,n2] = size(y);
if n1 ~= n2
    error('Invalid data')
end
SumX = sum(x);
SumY = sum(y);
SumXY = sum(x.*y);
SumSqrX = sum(x.*x);
WholeSqrX = SumX * SumX;
a1 = (n1*SumXY - SumX*SumY)/(n1*SumSqrX - WholeSqrX);
a0 = (SumY/n1) - a1*(SumX/n1);
Yhati = a0 + a1.*x;                      % fitted values
Sr = sum((y-Yhati).*(y-Yhati));          % sum of squared residuals
St = sum((y-(SumY/n1)).*(y-(SumY/n1)));
r = sqrt((St-Sr)/St);                    % correlation coefficient
r
plot(x,y,'-k',x,Yhati,'-g'), grid on
legend('Original Curve','Curve Fitted');

%%%%%%%%%%%%%%%%%%%% DIFFERENTIATION %%%%%%%%%%%%%%%%%%%%%%%%%


clear all
clc
X=[0 10 20 30 45 60 75];
Y=[10 20 15 16 19 22 35];

n=length(X);
m=length(Y);

if n ~= m
    error('X and Y must have the same length');
end
D = zeros(n,1);

for i = 1:n
    if i == 1
        D(i) = (Y(i+1) - Y(i))/(X(i+1) - X(i));       % Forward formula
    elseif i < n
        D(i) = (Y(i+1) - Y(i-1))/(X(i+1) - X(i-1));   % Central formula
    else
        D(i) = (Y(i) - Y(i-1))/(X(i) - X(i-1));       % Backward formula
    end
end
D

Wednesday, 11 July 2012

Gauss Elimination (Solution System of Linear Equation)


Gaussian Elimination
We list the basic steps of Gaussian Elimination, a method for solving a system of linear equations. Except for certain special cases, Gaussian Elimination is still "state of the art." After outlining the method, we will give some examples. Gaussian elimination is summarized by the following three steps:

1. Write the system as an augmented matrix [A | b].
2. Use forward elimination (row operations) to reduce the matrix to upper-triangular form.
3. Recover the unknowns by back substitution, starting from the last equation.
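The three steps can be sketched in a few lines; here is a minimal Python version (Python used here for brevity, without partial pivoting, and the example system is an illustrative choice):

```python
def gauss_eliminate(A, b):
    """Solve A x = b by forward elimination and back substitution."""
    n = len(b)
    # Step 1: build the augmented matrix [A | b].
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    # Step 2: forward elimination - zero out entries below each pivot.
    for k in range(n - 1):
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]
    # Step 3: back substitution, from the last row upward.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Example system: 2x + y = 5, x + 3y = 10  ->  x = 1, y = 3
solution = gauss_eliminate([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0])
print(solution)
```

A production solver would add partial pivoting (swapping rows to put the largest entry on the pivot) to avoid division by small or zero pivots.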

Secant Method (Open Method)


Secant Method
The secant method requires two initial points x0 and x1 which are both reasonably close to the solution x*. Preferably the signs of y0 = f(x0) and y1 = f(x1) should be different. Once x0 and x1 are determined, the method proceeds by the following formula:

x(i+1) = x(i) - y(i) * (x(i) - x(i-1)) / (y(i) - y(i-1))

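The iteration above can be sketched as follows (a Python sketch; the test function f(x) = x^2 - 2, starting points, and tolerance are illustrative choices):

```python
def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Iterate x(i+1) = x(i) - y(i)*(x(i) - x(i-1))/(y(i) - y(i-1))."""
    y0, y1 = f(x0), f(x1)
    for _ in range(max_iter):
        # Next estimate from the secant line through the last two points.
        x2 = x1 - y1 * (x1 - x0) / (y1 - y0)
        if abs(x2 - x1) < tol:
            return x2
        x0, y0 = x1, y1
        x1, y1 = x2, f(x2)
    return x1

# Example: root of f(x) = x^2 - 2, i.e. sqrt(2) ~ 1.41421356
root = secant(lambda x: x * x - 2.0, 1.0, 2.0)
print(root)
```

Unlike bisection, the two points need not bracket the root, which is why opposite signs are only "preferable" here rather than required.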

Newton Raphson


Newton Raphson Method
In numerical analysis, Newton's method (also known as the Newton–Raphson method), named after Isaac Newton and Joseph Raphson, is a method for finding successively better approximations to the roots (or zeroes) of a real-valued function. The algorithm is first in the class of Householder's methods, succeeded by Halley's method. The method can also be extended to complex functions and to systems of equations. Each step replaces the current estimate x(n) with the x-intercept of the tangent line at x(n): x(n+1) = x(n) - f(x(n))/f'(x(n)).
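The update x(n+1) = x(n) - f(x(n))/f'(x(n)) can be sketched directly (a Python sketch; the test function and its derivative are illustrative choices, and a real implementation would guard against f'(x) = 0):

```python
def newton_raphson(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x(n+1) = x(n) - f(x(n))/f'(x(n)) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    return x

# Example: root of f(x) = x^2 - 2 with f'(x) = 2x, starting from 1.5
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.5)
print(root)
```

Note that, unlike the secant method, Newton-Raphson needs the derivative f' explicitly; in exchange it typically converges quadratically near a simple root.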

Monday, 9 July 2012

Regula Falsi Method


Regula-Falsi Method
The Regula-Falsi Method (sometimes called the False Position Method) is a method used to find a numerical estimate of the solution of an equation. This method attempts to solve an equation of the form f(x) = 0. (This is very common in most numerical analysis applications.) Any equation can be written in this form. The algorithm requires a function f(x) and two points a and b for which f(x) is positive for one of the values and negative for the other. We can write this condition as f(a)×f(b) < 0. The starting criterion is the same as in the bisection method: it takes two bracketing values to get started.
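A minimal Python sketch of the idea, assuming f(a)×f(b) < 0 as described above (the test function, bracket, and tolerance are illustrative choices):

```python
def regula_falsi(f, a, b, tol=1e-10, max_iter=100):
    """Find a root of f in [a, b] by the false-position method."""
    fa, fb = f(a), f(b)
    if fa * fb >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    c = a
    for _ in range(max_iter):
        # x-intercept of the chord through (a, f(a)) and (b, f(b)).
        c = b - fb * (b - a) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            return c
        # Keep the half-bracket that still changes sign.
        if fa * fc < 0:
            b, fb = c, fc
        else:
            a, fa = c, fc
    return c

# Example: root of f(x) = x^2 - 2 on [1, 2]
root = regula_falsi(lambda x: x * x - 2.0, 1.0, 2.0)
print(root)
```

The difference from bisection is the choice of the next point: the chord's x-intercept instead of the interval midpoint, which usually converges faster on smooth functions.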

Bisection Method


Bisection Method
The bisection method in mathematics is a root-finding method which repeatedly bisects an interval and then selects a subinterval in which a root must lie for further processing. It is a very simple and robust method, but it is also relatively slow. Because of this, it is often used to obtain a rough approximation to a solution which is then used as a starting point for more rapidly converging methods. The method is also called the binary search method and is similar to the Binary Search algorithm in computing, where the range of possible solutions is halved each iteration.
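The halving step described above can be sketched like this (a Python sketch; the test function, bracket, and tolerance are illustrative choices):

```python
def bisection(f, a, b, tol=1e-10):
    """Find a root of f in [a, b], assuming f(a) and f(b) differ in sign."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while (b - a) / 2.0 > tol:
        m = (a + b) / 2.0      # bisect the interval
        fm = f(m)
        if fm == 0.0:
            return m
        # Keep the half in which the sign change (and hence a root) lies.
        if fa * fm < 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return (a + b) / 2.0

# Example: root of f(x) = x^2 - 2 on [1, 2]
root = bisection(lambda x: x * x - 2.0, 1.0, 2.0)
print(root)
```

Because the bracket halves each iteration, the error bound after n steps is (b - a)/2^n, which is the "slow but robust" behavior mentioned above.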