Abstract: | When using simple exponential smoothing on a given time series, the nature of the relationship between the optimal smoothing constant and the autocorrelation structure of the series remains an unresolved question. Although numerical search routines can easily be used to find optimal values of the smoothing constant, they offer little insight into how the estimated smoothing constant relates to the structure of the underlying time series. We suggest that renewed investigation of the ex-post sum of squares function may prove helpful in this pursuit. Results are presented that illustrate how the optimal smoothing constant depends upon the value used to initialize the smoothing and upon the sample autocorrelation coefficients of the observed series. These results are based on a new formula for the derivative of the ex-post sum of squares function. In particular, the derivative is examined near 0 and 1, where its form simplifies greatly, facilitating analysis at these points. A necessary and sufficient condition is stated for the ex-post sum of squares to have a positive derivative at 0, and the autocorrelation coefficients of the differenced series are shown to affect the sign of the derivative near 1. Based on these results, a general algorithm is presented as an alternative to grid search routines for minimizing the ex-post sum of squares. |
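To make the objective concrete, the following is a minimal sketch (in Python, not taken from the paper) of the ex-post sum of squares for simple exponential smoothing and its minimization by grid search, the baseline numerical approach the abstract contrasts with the derivative-based alternative. The initialization choice `s0 = y[0]`, the inclusion of every one-step-ahead error in the sum, and the toy data are assumptions for illustration; the paper's exact conventions may differ.

```python
import numpy as np

def ex_post_sse(y, alpha, s0):
    """Ex-post sum of squared one-step-ahead errors for simple
    exponential smoothing with smoothing constant alpha and
    initial smoothed value s0 (assumed convention; the paper may
    define the initialization and error terms differently)."""
    s = s0
    sse = 0.0
    for obs in y:
        e = obs - s        # one-step-ahead forecast error
        sse += e * e
        s += alpha * e     # update: s_t = s_{t-1} + alpha * e_t
    return sse

# Arbitrary illustrative data; any observed series could be used.
y = np.array([12.0, 15.0, 14.0, 16.0, 19.0, 18.0, 21.0, 20.0])
s0 = y[0]                  # assumed initialization value

# Grid search over the smoothing constant on [0, 1].
grid = np.linspace(0.0, 1.0, 1001)
sse_values = [ex_post_sse(y, a, s0) for a in grid]
alpha_star = grid[int(np.argmin(sse_values))]
print(f"optimal smoothing constant (grid search): {alpha_star:.3f}")
```

As the abstract notes, a search of this kind locates the minimizing smoothing constant but says nothing about why that value arises from the series' autocorrelation structure or from the choice of `s0`.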