gum() - Generalised Univariate Model

Ivan Svetunkov

2019-10-22

gum() constructs a Generalised Exponential Smoothing model, which is a pure additive state-space model. It is part of the smooth package.

In this vignette we will use data from the Mcomp package, so it is advised to install it.

Let’s load the necessary packages:

require(smooth)
require(Mcomp)

You may note that Mcomp depends on the forecast package, and if you load both forecast and smooth, you will see a message saying that the forecast() function is masked from the environment. There is nothing to be worried about: smooth defines this function for consistency purposes, and it behaves exactly like the original forecast() from the forecast package. It was included in smooth only in order not to add forecast to the dependencies of the package.

Generalised Exponential Smoothing is the next step from CES. It is a state-space model in which all the matrices and vectors are estimated. It is very demanding in terms of sample size, but it is also insanely flexible.
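In its general pure additive form, the model can be sketched as follows (the notation is assumed from the standard state-space exponential smoothing literature, not quoted from the package documentation):

\[
y_t = \mathbf{w}' \mathbf{v}_{t-\mathbf{l}} + \epsilon_t,
\]
\[
\mathbf{v}_t = F \mathbf{v}_{t-\mathbf{l}} + \mathbf{g} \epsilon_t,
\]

where \(\mathbf{v}_t\) is the state vector, \(\mathbf{l}\) is the vector of lags, \(F\) is the transition matrix, \(\mathbf{w}\) is the measurement vector and \(\mathbf{g}\) is the persistence vector. All of these are estimated by gum().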

A simple call by default constructs GUM\((1^1,1^m)\), where \(m\) is the frequency of the data. So for our example with the monthly series N2457, we will have GUM\((1^1,1^{12})\):

gum(M3$N2457$x, h=18, holdout=TRUE)
## Time elapsed: 0.4 seconds
## Model estimated: GUM(1[1],1[12])
## Persistence vector g:
##        [,1]   [,2]
## [1,] 0.2728 0.0048
## Transition matrix F: 
##        [,1]   [,2]
## [1,] 0.8802 0.7284
## [2,] 0.0396 0.8817
## Measurement vector w: 0.3864, 0.0788
## Initial values were optimised.
## 
## Loss function type: MSE; Loss function value: 1738726.8633
## Error standard deviation: 1499.584
## Sample size: 97
## Number of estimated parameters: 22
## Number of degrees of freedom: 75
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 1713.034 1726.710 1769.678 1800.959 
## 
## Forecast errors:
## MPE: 22.5%; sCE: -1760.7%; Bias: 86.2%; MAPE: 38.1%
## MASE: 2.782; sMAE: 113.5%; sMSE: 221.7%; rMAE: 1.188; rRMSE: 1.306

Different orders and lags can also be specified. For example:

gum(M3$N2457$x, h=18, holdout=TRUE, orders=c(2,1), lags=c(1,12))
## Time elapsed: 0.34 seconds
## Model estimated: GUM(2[1],1[12])
## Persistence vector g:
##        [,1]   [,2]    [,3]
## [1,] 0.1406 0.0922 -0.0748
## Transition matrix F: 
##        [,1]   [,2]   [,3]
## [1,] 0.5709 0.2337 0.0948
## [2,] 0.2530 0.6834 0.0637
## [3,] 0.1179 0.1087 0.9585
## Measurement vector w: 0.7454, 0.9889, 0.9921
## Initial values were optimised.
## 
## Loss function type: MSE; Loss function value: 1612418.4291
## Error standard deviation: 1527.873
## Sample size: 97
## Number of estimated parameters: 30
## Number of degrees of freedom: 67
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 1721.719 1749.901 1798.960 1863.422 
## 
## Forecast errors:
## MPE: 24%; sCE: -1779.5%; Bias: 87.4%; MAPE: 37.9%
## MASE: 2.764; sMAE: 112.8%; sMSE: 214.9%; rMAE: 1.181; rRMSE: 1.286
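The orders and lags together determine the size of the state vector: with orders=c(2,1) and lags=c(1,12) there are two components with lag 1 and one component with lag 12, which is why the transition matrix above is 3x3. A quick sanity check in base R (variable names here are illustrative, not part of the package):

```r
orders <- c(2, 1)
lags   <- c(1, 12)

# Total number of state components
nStates <- sum(orders)

# Lag attached to each individual component
stateLags <- rep(lags, times = orders)

nStates    # 3
stateLags  # 1 1 12
```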

The function auto.gum() is also implemented in smooth, but it works slowly, as it needs to check a large number of models:

auto.gum(M3[[2457]], interval=TRUE, silent=FALSE)
## Starting preliminary loop: 1 out of 12, 2 out of 12, ..., 12 out of 12. Done.
## Searching for appropriate lags: We found them!
## Searching for appropriate orders: Orders found.
## Reestimating the model. Done!
## Time elapsed: 27.58 seconds
## Model estimated: GUM(1[1],1[4])
## Persistence vector g:
##        [,1]   [,2]
## [1,] 0.4481 0.2939
## Transition matrix F: 
##        [,1]   [,2]
## [1,] 0.4174 0.4201
## [2,] 0.0745 1.0000
## Measurement vector w: 0.9982, 1
## Initial values were produced using backcasting.
## 
## Loss function type: MSE; Loss function value: 3334888.4992
## Error standard deviation: 1902.114
## Sample size: 115
## Number of estimated parameters: 9
## Number of degrees of freedom: 106
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 2071.650 2073.364 2096.354 2100.422 
## 
## 95% parametric prediction interval was constructed
## 44% of values are in the prediction interval
## Forecast errors:
## MPE: -175.7%; sCE: 2148.6%; Bias: -89.7%; MAPE: 181.5%
## MASE: 3.224; sMAE: 132.2%; sMSE: 237.6%; rMAE: 2.682; rRMSE: 2.276

In addition to the standard arguments that other functions accept, gum() accepts predefined values for the transition matrix and the measurement and persistence vectors. For example, something more familiar can be passed to the function:

transition <- matrix(c(1,0,0,1,1,0,0,0,1),3,3)
measurement <- c(1,1,1)
gum(M3$N2457$x, h=18, holdout=TRUE, orders=c(2,1), lags=c(1,12), transition=transition, measurement=measurement)
## Time elapsed: 0.31 seconds
## Model estimated: GUM(2[1],1[12])
## Persistence vector g:
##        [,1]   [,2]    [,3]
## [1,] 0.1469 0.0031 -0.0248
## Transition matrix F: 
##      [,1] [,2] [,3]
## [1,]    1    1    0
## [2,]    0    1    0
## [3,]    0    0    1
## Measurement vector w: 1, 1, 1
## Initial values were optimised.
## 
## Loss function type: MSE; Loss function value: 1793364.8934
## Error standard deviation: 1483.907
## Sample size: 97
## Number of estimated parameters: 18
## Number of provided parameters: 12
## Number of degrees of freedom: 79
## Information criteria:
##      AIC     AICc      BIC     BICc 
## 1708.036 1716.805 1754.380 1774.439 
## 
## Forecast errors:
## MPE: 17.6%; sCE: -1554.8%; Bias: 78.8%; MAPE: 35.7%
## MASE: 2.58; sMAE: 105.3%; sMSE: 199.8%; rMAE: 1.102; rRMSE: 1.24

The resulting model is equivalent to ETS(A,A,A). However, due to the different initialisation of the optimisers and the different way the number of parameters is calculated, the gum() call above and es(y, "AAA", h=h, holdout=TRUE) will lead to different models.
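To see why this restricted GUM has the ETS(A,A,A) structure, here is a minimal base-R sketch of a single transition step using the fixed transition matrix and measurement vector from the example above. The persistence values and states are illustrative, not the estimated ones, and the seasonal lag of 12 is ignored for brevity (in the actual model the third state is updated with lag 12):

```r
# Fixed transition matrix and measurement vector from the example above
transition  <- matrix(c(1,0,0, 1,1,0, 0,0,1), 3, 3)
measurement <- c(1, 1, 1)
g <- c(0.1, 0.05, 0.01)   # illustrative persistence values

# One step of the pure additive recursion:
# y[t] = w' v[t-l] + e[t];  v[t] = F v[t-l] + g e[t]
v    <- c(100, 2, -5)              # level, trend, seasonal states
yhat <- sum(measurement * v)       # point forecast: 100 + 2 - 5 = 97
e    <- 103 - yhat                 # observation error given y[t] = 103
vNew <- transition %*% v + g * e

# With this F, the update mirrors ETS(A,A,A):
# level:    l + b + g1*e
# trend:    b     + g2*e
# seasonal: s     + g3*e
```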