In regression analysis, least squares, also known as ordinary least squares (OLS), is a method for linear regression that estimates the unknown parameters of a statistical model by minimizing the sum of squared residuals (the differences between the observed and predicted values). The method was developed by Carl Friedrich Gauss at the turn of the 19th century and is available today in most statistical packages. The least-squares approach is optimal in the sense of the Gauss-Markov theorem: among all linear unbiased estimators, the least-squares estimator has the smallest variance when the errors are uncorrelated and have equal variance.
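As an illustration only (not part of the original text), the following sketch fits a straight line by ordinary least squares using NumPy; the data arrays x and y are made-up example values.

```python
import numpy as np

# Made-up example data: noisy points roughly following y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix with an intercept column: each row is [1, x_i]
X = np.column_stack([np.ones_like(x), x])

# Solve the least-squares problem min ||y - X b||^2 for b = (intercept, slope)
coef, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, slope:", coef)
```

The minimizer is the familiar normal-equations solution; `np.linalg.lstsq` computes it in a numerically stable way rather than inverting the matrix explicitly.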
A related method is least mean squares (LMS). It arises when the data are processed one measurement at a time and gradient descent is used to minimize the squared residual of the current sample. LMS minimizes the expected squared residual and requires very few operations per iteration; however, it needs a large number of iterations to converge.
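A minimal sketch of the LMS idea, assuming a linear model updated one sample at a time; the step size mu, the synthetic streaming data, and the "true" weights are illustrative choices, not prescribed by the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target weights and streaming samples (assumed for this sketch)
true_w = np.array([2.0, -1.0])
w = np.zeros(2)          # current estimate of the weights
mu = 0.05                # step size (learning rate)

for _ in range(2000):                        # many cheap iterations
    x = rng.normal(size=2)                   # one input sample
    d = true_w @ x + 0.01 * rng.normal()     # observed (desired) output
    e = d - w @ x                            # residual for this single sample
    w += mu * e * x                          # gradient-descent step on the squared residual

print("estimated weights:", w)
```

Each update costs only a handful of multiplications and additions, which is why LMS is cheap per iteration, but the estimate drifts toward the solution gradually rather than in one batch solve.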
Furthermore, many other types of optimization problems can be expressed in least-squares form, for example by minimizing energy or maximizing entropy.