The first documented work on regression [1, 2] was published in 1898 by Schuster, and since then several regression models have been proposed. Regression is an active area of research because of the widespread use of regression analysis in scientific, statistical, industrial and commercial applications. Ideally we would want a regression method which gives a perfect fit between the regression curve and the actual values of the data points. However, the existing regression methods suffer from two major drawbacks:
(i) A regression method may be suitable for one type of data and unsuitable for another. For example, ordinary least squares is suitable for linear data, but data in the real world are not necessarily linear; linear regression is therefore not a good choice if the data are not roughly linear.
(ii) Unless there is a perfect match between the actual values of the data and the values given by the regression model, there will always be an error.
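Drawback (i) can be illustrated with a small sketch (a hypothetical example, not from the paper): fitting a straight line by ordinary least squares to purely quadratic data. By symmetry the best-fit line is nearly flat, and the sum of squared residuals cannot be driven to zero no matter how the line is chosen.

```python
import numpy as np

# Illustrative example: ordinary least squares (a straight line)
# applied to clearly nonlinear, noise-free data.
x = np.linspace(-1.0, 1.0, 50)
y = x ** 2                               # quadratic data

slope, intercept = np.polyfit(x, y, 1)   # best-fit line in the least-squares sense
resid = y - (slope * x + intercept)
sse = float(resid @ resid)               # sum of squared residuals stays large
```

Because the data are symmetric about zero, the fitted slope is essentially zero and the line reduces to the mean of `y`, leaving a substantial residual error.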
We present a new method of regression which works for all types of data: linear, polynomial, logarithmic, or erratic random data that shows no particular trend. Our method is an iterative process based on the application of sinusoidal series analysis in nonlinear least squares. Each iteration of this method reduces the sum of the squares of the residuals, and therefore, by successive iteration, we show that every finite set of co-planar points can be expanded as a sinusoidal series in infinitely many ways. In other words, given a set of co-planar points, we can fit infinitely many curves that pass through all these points. By setting a convergence criterion in terms of an acceptable error, we can stop the iteration after a finite number of steps. Thus, in the limiting case, we obtain a function that gives a perfect fit for the data points.
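The iterative idea can be sketched as follows. This is a minimal illustration in the spirit of the described method, not the paper's actual algorithm: at each iteration a single sinusoid `a*sin(wx) + b*cos(wx)` is fitted to the current residuals (the frequency `w` chosen from a grid, the amplitudes by linear least squares), and the fitted term is subtracted. Since the zero sinusoid is always a candidate, the sum of squared residuals can never increase from one iteration to the next.

```python
import numpy as np

def fit_sinusoid(x, r, freqs):
    """Fit a*sin(w*x) + b*cos(w*x) to residuals r; pick the best w from freqs."""
    best = None
    for w in freqs:
        A = np.column_stack([np.sin(w * x), np.cos(w * x)])
        coef, *_ = np.linalg.lstsq(A, r, rcond=None)  # optimal a, b for this w
        rr = r - A @ coef
        sse = float(rr @ rr)
        if best is None or sse < best[0]:
            best = (sse, w, coef)
    return best

def sinusoidal_regression(x, y, n_iter=20):
    """Iteratively subtract fitted sinusoids from the residuals.

    Returns the list of fitted terms and the history of residual
    sums of squares, which is non-increasing by construction.
    """
    freqs = np.linspace(0.1, 10.0, 200)      # assumed frequency grid
    resid = y - y.mean()                     # start from the constant term
    model = [("const", float(y.mean()))]
    history = [float(resid @ resid)]
    for _ in range(n_iter):
        _, w, (a, b) = fit_sinusoid(x, resid, freqs)
        resid = resid - (a * np.sin(w * x) + b * np.cos(w * x))
        model.append((w, a, b))
        history.append(float(resid @ resid))
    return model, history
```

The stopping rule in the text corresponds to terminating the loop once `history[-1]` falls below an acceptable error threshold rather than running a fixed number of iterations.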
The regression method is published on arXiv.