Least Squares Method: Consider a given sample of data {(x1, y1), (x2, y2), …, (xi, yi), …, (xn, yn)}. Let yi be the observed value of a random variable Yi, where Yi = β0 + β1·xi + ϵi. The errors ϵi are independent random variables. If the line y = β0 + β1·x is used to fit the data, the fitted values ŷi are obtained via ŷi = β0 + β1·xi. The residual ei = yi − ŷi = yi − β0 − β1·xi is the vertical deviation of the point (xi, yi) from the fitted line y = β0 + β1·x. The error sum of squares, denoted by SSE, is the sum of the squared residuals: SSE = Σ ei² = Σ (yi − ŷi)².
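The quantities above can be sketched numerically. The following is a minimal illustration (the sample data are hypothetical, invented for this example): it computes the least-squares estimates of β0 and β1 from the usual closed-form formulas, then forms the fitted values ŷi, the residuals ei, and SSE as defined in the text.

```python
import numpy as np

# Hypothetical sample data (xi, yi) -- illustrative values only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form least-squares estimates:
#   beta1_hat = Sxy / Sxx,   beta0_hat = ybar - beta1_hat * xbar
x_bar, y_bar = x.mean(), y.mean()
beta1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
beta0_hat = y_bar - beta1_hat * x_bar

y_hat = beta0_hat + beta1_hat * x   # fitted values  y^i = b0 + b1*xi
e = y - y_hat                       # residuals      ei  = yi - y^i
sse = np.sum(e ** 2)                # error sum of squares  SSE = sum(ei^2)

print(f"beta0 = {beta0_hat:.3f}, beta1 = {beta1_hat:.3f}, SSE = {sse:.4f}")
```

These estimates are exactly the values of β0 and β1 that minimize SSE over all possible lines, which is what "least squares" refers to.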