SDSC6012 - Assignment 1
#assignment #sdsc6012
Question 1
Trend Component Extraction (Moving Average Method)
The trend component is extracted using the centered moving average method:
$$\text{Trend}_t = \frac{1}{k} \sum_{i=t-m}^{t+m} x_i$$
Where:
$k$ is the window size (here set to 12, corresponding to the annual cycle)
$m = \lfloor k/2 \rfloor$ (half-window width for the centered moving average)
Boundary handling: when $t < m$ or $t > n - m$, calculate the mean using the available data
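A minimal sketch of this step, assuming the observations sit in a NumPy array `x` (the function name is hypothetical); the divisor is the number of points actually inside the (possibly truncated) window, which is how the boundary rule above is read:

```python
import numpy as np

def moving_average_trend(x, k=12):
    """Centered moving-average trend. Near the boundaries the mean is taken
    over whatever observations actually fall inside the window."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = k // 2                                     # half-window width
    trend = np.empty(n)
    for t in range(n):
        lo, hi = max(0, t - m), min(n, t + m + 1)  # truncate the window at the edges
        trend[t] = x[lo:hi].mean()
    return trend
```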
Seasonal Component Extraction (Periodic Average Method)
Calculate the detrended series: $d_t = x_t - \text{Trend}_t$
For each periodic position $j$ ($j = 0, 1, \ldots, 11$), compute the average:
$$s_j = \frac{1}{N_j} \sum_{k=0}^{N_j-1} d_{j+12k}$$
where $N_j$ is the number of occurrences of periodic position $j$
Zero-mean adjustment:
$$\text{Seasonal}_j = s_j - \bar{s}, \quad \bar{s} = \frac{1}{12}\sum_{j=0}^{11} s_j$$
Construct the complete seasonal series: $\text{Seasonal}_t = \text{Seasonal}_{t \bmod 12}$
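A matching sketch of the periodic-average step, assuming the detrended series is held in a NumPy array `d` (the helper name is hypothetical):

```python
import numpy as np

def seasonal_component(d, period=12):
    """Periodic averages of the detrended series, shifted to have zero mean,
    then tiled back to the full length of the series."""
    d = np.asarray(d, dtype=float)
    s = np.array([d[j::period].mean() for j in range(period)])  # s_j for j = 0..period-1
    s -= s.mean()                                               # zero-mean adjustment
    return s[np.arange(len(d)) % period]                        # Seasonal_t = Seasonal_{t mod period}
```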
Residual Calculation
$$\varepsilon_t = x_t - \text{Trend}_t - \text{Seasonal}_t$$
Time Series Equation
$$x_t = \text{Trend}_t + \text{Seasonal}_{t \bmod 12} + \varepsilon_t$$
where $\varepsilon_t$ is random noise with mean 0.
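Combining the two hypothetical helpers sketched above on a small synthetic series gives the residual and confirms that the three components add back to $x_t$ exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(120)                                  # e.g. 10 years of monthly observations
x = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)

trend = moving_average_trend(x, k=12)
seasonal = seasonal_component(x - trend, period=12)
resid = x - trend - seasonal                        # epsilon_t

# the three components add back to x_t exactly, by construction
assert np.allclose(trend + seasonal + resid, x)
```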
Question 2
Consider the time series
$$x_t = \beta_1 + \beta_2 t + w_t$$
where $\beta_1$ and $\beta_2$ are known constants and $w_t$ is a white noise process with variance $\sigma_w^2$.
(a) Determine whether $x_t$ is stationary.
(b) Show that the process $y_t = x_t - x_{t-1}$ is stationary.
(c) Show that the mean of the moving average
$$v_t = \frac{1}{2q+1} \sum_{j=-q}^{q} x_{t-j}$$
is $\beta_1 + \beta_2 t$, and give a simplified expression for the autocovariance function.
(a) Determining the Stationarity of $x_t$
$x_t$ is not a stationary process.
Mean function: $E[x_t] = \beta_1 + \beta_2 t$ (varies with time)
Autocovariance function:
$$\gamma_x(h) = \begin{cases} \sigma_w^2 & h = 0 \\ 0 & h \neq 0 \end{cases}$$
Although the autocovariance depends only on the time difference $h$, the mean is not constant, so the stationarity condition is not satisfied.
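For completeness, both facts follow because $\beta_1 + \beta_2 t$ is deterministic, so only the white noise contributes to the covariance:
$$E[x_t] = \beta_1 + \beta_2 t + E[w_t] = \beta_1 + \beta_2 t, \qquad \operatorname{Cov}(x_t, x_{t+h}) = \operatorname{Cov}(w_t, w_{t+h}) = \begin{cases} \sigma_w^2 & h = 0 \\ 0 & h \neq 0 \end{cases}$$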
(b) Proving $y_t = x_t - x_{t-1}$ is Stationary
Substituting $x_t = \beta_1 + \beta_2 t + w_t$ gives $y_t = \beta_2 + w_t - w_{t-1}$, which is a stationary process.
Mean:
$E[y_t] = \beta_2$ (constant)
Autocovariance Function:
$$\gamma_y(h) = \begin{cases} 2\sigma_w^2 & h = 0 \\ -\sigma_w^2 & |h| = 1 \\ 0 & |h| > 1 \end{cases}$$
The autocovariance depends only on the time difference $h$ and the mean is constant, so the stationarity condition is satisfied.
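The case values come from expanding the covariance of the differenced noise terms:
$$\gamma_y(h) = \operatorname{Cov}(w_t - w_{t-1},\, w_{t+h} - w_{t+h-1}) = 2\gamma_w(h) - \gamma_w(h-1) - \gamma_w(h+1),$$
where $\gamma_w(0) = \sigma_w^2$ and $\gamma_w(h) = 0$ otherwise, giving $2\sigma_w^2$ at $h = 0$, $-\sigma_w^2$ at $|h| = 1$, and $0$ elsewhere.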
(c) Mean and Autocovariance of the Moving Average $v_t$
$$v_t = \frac{1}{2q+1} \sum_{j=-q}^{q} x_{t-j}$$
Mean:
$$\sum_{j=-q}^{q} E[x_{t-j}] = \sum_{j=-q}^{q} \bigl(\beta_1 + \beta_2(t-j)\bigr) = (2q+1)(\beta_1 + \beta_2 t) \quad \text{(since } \textstyle\sum_{j=-q}^{q} j = 0\text{)},$$
so $E[v_t] = \beta_1 + \beta_2 t$.
Autocovariance Function:
Number of non-zero terms: $N(h) = 2q+1-|h|$ (when $|h| \leq 2q$)
Each term's covariance is $\sigma_w^2$, and the denominator is $(2q+1)^2$
$$\gamma_v(h) = \begin{cases} \frac{2q+1-|h|}{(2q+1)^2}\,\sigma_w^2 & |h| \leq 2q \\ 0 & |h| > 2q \end{cases}$$
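The count $2q+1-|h|$ can be read off from the double sum over the noise terms: only pairs with matching indices contribute $\sigma_w^2$, and for $|h| \le 2q$ exactly $2q+1-|h|$ of the $(2q+1)^2$ pairs match:
$$\gamma_v(h) = \frac{1}{(2q+1)^2} \sum_{j=-q}^{q} \sum_{k=-q}^{q} \operatorname{Cov}\bigl(w_{t-j},\, w_{t+h-k}\bigr) = \frac{2q+1-|h|}{(2q+1)^2}\,\sigma_w^2, \qquad |h| \le 2q.$$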