The attractiveness of different estimators can be judged by looking at their properties, such as unbiasedness, mean square error, consistency, and asymptotic distribution.
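These properties can be checked empirically by simulation. The sketch below is illustrative rather than taken from the text; the `estimator_properties` helper and the normal sampling model are assumptions made here. It estimates the bias, variance, and mean square error of the sample mean, and the shrinking error with growing sample size hints at consistency.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimator_properties(estimator, true_value, n, trials=10_000):
    """Monte Carlo estimates of bias, variance, and MSE for `estimator`
    on samples of size n from a normal model centered at `true_value`
    (an illustrative choice, not fixed by the text)."""
    estimates = np.array([estimator(rng.normal(true_value, 1.0, n))
                          for _ in range(trials)])
    bias = estimates.mean() - true_value
    var = estimates.var()
    mse = np.mean((estimates - true_value) ** 2)  # MSE = bias^2 + variance
    return bias, var, mse

# The sample mean as an estimator of the population mean;
# growing n illustrates consistency (bias and MSE shrink toward 0).
for n in (10, 100, 1000):
    bias, var, mse = estimator_properties(np.mean, true_value=2.0, n=n)
    print(f"n={n:5d}  bias={bias:+.4f}  var={var:.5f}  mse={mse:.5f}")
```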
For example, by minimizing the sum of absolute errors rather than the sum of squared errors.
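A minimal sketch of the difference, assuming the simplest case of fitting a single constant: least squares yields the sample mean, while least absolute error yields the sample median, which is far less sensitive to outliers.

```python
import numpy as np

y = np.array([1.0, 1.2, 0.9, 1.1, 50.0])  # one gross outlier

# argmin_c sum (y_i - c)^2 is the sample mean (least squares).
least_squares_fit = y.mean()

# argmin_c sum |y_i - c| is the sample median (least absolute error).
least_absolute_fit = np.median(y)

print(least_squares_fit)   # 10.84, dragged toward the outlier
print(least_absolute_fit)  # 1.1, unaffected by it
```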
Efficiencies are often defined using the variance or mean square error as the measure of desirability.
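For example, the relative efficiency of two estimators can be approximated by comparing their simulated mean square errors. The sketch below assumes normally distributed data, an illustrative choice: it compares the sample mean and the sample median as estimators of the location, and their ratio approaches the asymptotic efficiency $2/\pi \approx 0.64$ of the median relative to the mean.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 100, 20_000
samples = rng.normal(0.0, 1.0, size=(trials, n))  # true location is 0

mse_mean = np.mean(samples.mean(axis=1) ** 2)
mse_median = np.mean(np.median(samples, axis=1) ** 2)

# Relative efficiency of the median with respect to the mean;
# for normal data this approaches 2/pi ~ 0.64 as n grows.
print(mse_mean / mse_median)
```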
This estimate is always biased; however, it usually has a smaller mean square error.
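A classic instance is variance estimation, sketched below under an assumed normal model: the maximum-likelihood estimator, which divides by $n$, is biased downward, yet its mean square error is smaller than that of the unbiased estimator, which divides by $n - 1$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials, sigma2 = 10, 50_000, 1.0
samples = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))

s2_unbiased = samples.var(axis=1, ddof=1)  # divides by n - 1, unbiased
s2_mle = samples.var(axis=1, ddof=0)       # divides by n, biased downward

print(np.mean((s2_unbiased - sigma2) ** 2))  # larger MSE (~0.22 here)
print(np.mean((s2_mle - sigma2) ** 2))       # smaller MSE (~0.19) despite the bias
```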
This is because the goal of the neural network is to minimize coding cost, not root mean square error.
One measure of closeness is the sum of squared errors, $\sum_{i=1}^{n} (\hat{y}_i - y_i)^2$, where $\hat{y}_i$ are the estimates and $y_i$ the true values.
This is known as the minimum mean square error (MMSE) estimator.
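A minimal sketch of the idea in the jointly Gaussian scalar case, with illustrative prior and noise variances: the MMSE estimator is the conditional mean $E[\theta \mid x]$, which here shrinks the observation toward the prior mean and achieves a lower mean square error than using the observation directly.

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 100_000
sigma_theta, sigma_noise = 2.0, 1.0  # illustrative prior and noise scales

theta = rng.normal(0.0, sigma_theta, trials)       # unknown quantity
x = theta + rng.normal(0.0, sigma_noise, trials)   # noisy observation

# For jointly Gaussian theta and x, E[theta | x] is linear in x:
gain = sigma_theta**2 / (sigma_theta**2 + sigma_noise**2)
theta_mmse = gain * x

print(np.mean((x - theta) ** 2))           # raw observation: MSE ~ 1.0
print(np.mean((theta_mmse - theta) ** 2))  # MMSE estimate: MSE ~ 0.8
```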
However, one can use other estimators of $\sigma^2$ that are proportional to the sample variance $S^2$, and an appropriate choice of the proportionality constant can always give a lower mean square error.
Computing the minimum mean square error then gives the optimal constant; for normally distributed data, this amounts to dividing the sum of squared deviations by $n + 1$ rather than $n - 1$.
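This can be checked by simulation. The sketch below is illustrative, assuming normal data with $n = 10$ and unit variance: it evaluates the mean square error of $\mathrm{SS}/d$ for several divisors $d$, where $\mathrm{SS}$ is the sum of squared deviations from the sample mean, and the minimum falls at $d = n + 1$.

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials, sigma2 = 10, 100_000, 1.0
samples = rng.normal(0.0, 1.0, size=(trials, n))

# Sum of squared deviations from the sample mean for each trial.
ss = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

# MSE of the estimator ss / d for several candidate divisors d;
# for normal data the minimum lands at d = n + 1.
for d in (n - 1, n, n + 1, n + 2):
    mse = np.mean((ss / d - sigma2) ** 2)
    print(f"divisor {d:2d}: MSE = {mse:.4f}")
```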
The most common test, the t-test, divides the abnormal returns by the root mean square error of the regression.
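A minimal event-study sketch of that test, using entirely hypothetical returns: fit the market model $r = \alpha + \beta r_m + \varepsilon$ over an estimation window, then divide the event-day abnormal return by the regression's root mean square error.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical data: 250 estimation-window days of market and stock returns.
r_market = rng.normal(0.0005, 0.01, 250)
r_stock = 0.001 + 1.2 * r_market + rng.normal(0.0, 0.02, 250)

# Market-model regression r_stock = alpha + beta * r_market + eps.
X = np.column_stack([np.ones_like(r_market), r_market])
coef, residuals, *_ = np.linalg.lstsq(X, r_stock, rcond=None)
alpha, beta = coef
rmse = np.sqrt(residuals[0] / (len(r_stock) - 2))  # regression RMSE

# Abnormal return on a (hypothetical) event day, and its t-statistic.
event_market, event_stock = 0.002, 0.06
abnormal = event_stock - (alpha + beta * event_market)
t_stat = abnormal / rmse
print(t_stat)
```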