Suppose that we would like to estimate a function f(x) given measurements {(x_i, y_i)}, i = 1, ..., N. From this data, how do we estimate the unknown function y = f(x), given the assumption that our measurements of y contain measurement error (noise)? This is commonly called the regression problem. Clearly, simple interpolation of the noisy data will give a poor solution to the problem.
There are many partial solutions to the regression problem, including fitting a straight line (or a plane in higher dimensions) or perhaps fitting a quadratic function. But these solutions do not address the more general problem, where the unknown function f cannot be approximated by a plane or a quadratic surface. The solutions that I want you to implement are local linear and local quadratic regression in dimensions one and two. A brief write-up of this set of problems, and their solution, from the book on Statistical Learning is attached. Please note that the solution to this portion of the problem is not iterative; at each point x, the solution is given by solving a linear system.
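As a sketch of the non-iterative computation described above, the one-dimensional local linear estimate at a point x0 amounts to solving a small weighted least-squares system. The Gaussian kernel and the fixed bandwidth h below are illustrative choices, not requirements of the assignment:

```python
import numpy as np

def local_linear_fit(x0, x, y, h):
    """Local linear regression estimate at a single point x0.

    Solves min_b  sum_i K((x_i - x0)/h) * (y_i - b0 - b1*(x_i - x0))^2,
    a 2x2 linear system; the fitted value at x0 is b0.
    """
    # Gaussian kernel weights centered at x0 (bandwidth h assumed given)
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    # Design matrix for the local linear model: columns [1, x_i - x0]
    X = np.column_stack([np.ones_like(x), x - x0])
    # Weighted normal equations: (X^T W X) b = X^T W y
    XtW = X.T * w
    b = np.linalg.solve(XtW @ X, XtW @ y)
    return b[0]

# Example: recover a smooth function from noisy samples
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
xg = np.linspace(0.05, 0.95, 50)
fhat = np.array([local_linear_fit(x0, x, y, h=0.05) for x0 in xg])
```

The local quadratic version only changes the design matrix: append a column (x_i - x0)^2 and solve a 3x3 system instead.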
The implementation of an automatic scheme for choosing the scaling parameter (i.e., the width of the kernel function) should be included as part of the solution package. This portion requires the optimization of a one-dimensional function. The user should be given the choice of specifying the scaling parameter or using the automatic scheme.
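One common way to frame the one-dimensional optimization mentioned above is to minimize a leave-one-out cross-validation score over the bandwidth. The sketch below uses a simple log-spaced grid search as the minimizer; the CV criterion, the Gaussian kernel, and the grid bounds are all assumptions for illustration, and a proper 1-D optimizer (e.g., golden-section search) could replace the grid:

```python
import numpy as np

def loocv_score(h, x, y):
    """Leave-one-out CV error of the 1-D local linear fit with bandwidth h:
    predict each y_i from the remaining points and average the squared errors."""
    err = 0.0
    for i in range(x.size):
        xi = np.delete(x, i)
        yi = np.delete(y, i)
        w = np.exp(-0.5 * ((xi - x[i]) / h) ** 2)   # Gaussian kernel weights
        X = np.column_stack([np.ones_like(xi), xi - x[i]])
        XtW = X.T * w
        b = np.linalg.solve(XtW @ X, XtW @ yi)      # local linear system
        err += (y[i] - b[0]) ** 2
    return err / x.size

# Minimize the one-dimensional CV score over a log-spaced bandwidth grid
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 150))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
grid = np.geomspace(0.01, 0.5, 25)
scores = np.array([loocv_score(h, x, y) for h in grid])
h_auto = grid[np.argmin(scores)]
```

Exposing h as an optional argument that defaults to this automatic value would satisfy the requirement that the user may either specify the scaling parameter or defer to the automatic scheme.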
Please see me for more details if you are interested in this problem.