Lecture Advanced Econometrics (Part II) - Chapter 9: Autocorrelation

Lecture "Advanced Econometrics (Part II) - Chapter 9: Autocorrelation" presentation of content: Properties of ols estimator under autocorrelation, disturnabce rprocess, estimation under autocorrelation.


Chapter 9: AUTOCORRELATION

Autocorrelation means non-zero correlation between the errors at different observations:

$$E(\varepsilon_t \varepsilon_s) \neq 0, \qquad t \neq s$$

This violates assumption (4), $E(\varepsilon\varepsilon') = \sigma_\varepsilon^2 I$, because the off-diagonal elements are no longer zero.

Example:

$$\log Q_t = \beta_1 + \beta_2 \log K_t + \beta_3 \log L_t + \varepsilon_t, \qquad t = 1, 2, \dots, T$$

In a recession, $Q$ falls by more than the inputs, so $\varepsilon_t < 0$; in a boom, $Q$ rises by more than the inputs, so $\varepsilon_t > 0$.

Autocorrelation, also called serial correlation, can exist in any research study in which the order of the observations has some meaning; it occurs most frequently in time-series data.

• Pure serial correlation is caused by the underlying distribution of the error term of the true specification of an equation.
• Impure serial correlation is caused by a specification error such as an omitted variable or an incorrect functional form.
• Here we study pure serial correlation.
I. PROPERTIES OF THE OLS ESTIMATOR UNDER AUTOCORRELATION:

1. $\hat{\beta}_{OLS}$ is still unbiased.
2. $\hat{\beta}_{OLS}$ is still consistent.
3. $\hat{\beta}_{OLS}$ is no longer best (efficient); it is less efficient than $\hat{\beta}_{GLS}$ (it has larger variances).
4. $VarCov(\hat{\beta}_{OLS}) \neq \sigma_\varepsilon^2 (X'X)^{-1}$: the standard errors of the $\hat{\beta}_j$'s are biased (downward) and inconsistent because they are based on an incorrect formula.
5. The t-statistics, $R^2$, and the overall F-statistic are biased upward. (A small simulation illustrating points 1 and 4 follows this list.)
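Properties 1 and 4 can be illustrated numerically. The sketch below is not part of the original notes (Python with numpy; all parameter values are arbitrary illustrative choices): it regresses $y$ on a slowly moving regressor with AR(1) errors ($\rho = 0.8$) and compares the average conventional OLS standard error of the slope with the true sampling standard deviation across replications. The estimate stays centred on the true value, but the conventional standard error understates the variability.

```python
import numpy as np

# Illustrative Monte Carlo (assumed setup, not from the notes):
# OLS with AR(1) errors -- beta_hat stays unbiased, conventional SEs are too small.
rng = np.random.default_rng(0)
T, rho, sigma_u = 100, 0.8, 1.0
beta = np.array([1.0, 0.5])
x = np.linspace(0.0, 10.0, T)                 # slowly moving regressor
X = np.column_stack([np.ones(T), x])

b2, se2 = [], []
for _ in range(2000):
    u = rng.normal(0.0, sigma_u, T)
    eps = np.zeros(T)
    eps[0] = u[0] / np.sqrt(1.0 - rho**2)     # draw from the stationary distribution
    for t in range(1, T):
        eps[t] = rho * eps[t - 1] + u[t]      # AR(1) disturbance
    y = X @ beta + eps

    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y                     # OLS estimates
    e = y - X @ b
    s2 = e @ e / (T - X.shape[1])             # conventional sigma^2 estimate
    se = np.sqrt(np.diag(s2 * XtX_inv))       # conventional (incorrect) standard errors
    b2.append(b[1]); se2.append(se[1])

print("mean beta2_hat       :", np.mean(b2))   # close to 0.5 (unbiased)
print("true sd of beta2_hat :", np.std(b2))
print("avg conventional SE  :", np.mean(se2))  # smaller than the true sd (biased down)
```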
II. DISTURBANCE PROCESS:

For testing or treatment we need to make a more explicit assumption about the type of autocorrelation. The most common is the first-order autoregressive process, AR(1):

$$\varepsilon_t = \rho \varepsilon_{t-1} + u_t$$

where $u_t$ satisfies all classical assumptions:

$$E(u_t) = 0, \qquad E(u_t^2) = \sigma_u^2, \qquad E(u_t u_s) = 0 \ (t \neq s)$$

$\rho$ is the coefficient of autocorrelation; $|\rho| < 1$ ensures stationarity of $\varepsilon_t$.

• Covariance stationarity of $\varepsilon_t$: the mean, variance, and all autocovariances of $\varepsilon_t$ are constant. The autocovariances are

$$Cov[\varepsilon_t, \varepsilon_{t-s} \mid X] = Cov[\varepsilon_{t+s}, \varepsilon_t \mid X] = \sigma^2 \Omega_{t,t-s} = \gamma_s = \gamma_{-s}$$

so $Cov[\varepsilon_t, \varepsilon_{t-s} \mid X]$ does not depend on $t$, only on $s$.

Substituting repeatedly,

$$\varepsilon_t = \rho \varepsilon_{t-1} + u_t = \rho[\rho \varepsilon_{t-2} + u_{t-1}] + u_t = \rho^2 \varepsilon_{t-2} + \rho u_{t-1} + u_t$$
$$= \rho^2[\rho \varepsilon_{t-3} + u_{t-2}] + \rho u_{t-1} + u_t = \rho^3 \varepsilon_{t-3} + \rho^2 u_{t-2} + \rho u_{t-1} + u_t$$
$$\vdots$$
$$\varepsilon_t = \rho^n \varepsilon_{t-n} + \sum_{j=0}^{n-1} \rho^j u_{t-j}, \qquad |\rho| < 1$$

Letting $n \to \infty$,

$$\varepsilon_t = \sum_{j=0}^{\infty} \rho^j u_{t-j}$$

so $\varepsilon_t$ is unrelated to the future values $u_{t+1}, u_{t+2}, \dots$. This is called an infinite moving-average process.

Moments of $\varepsilon_t$:

$$E(\varepsilon_t) = E\Big[\sum_{j=0}^{\infty} \rho^j u_{t-j}\Big] = \sum_{j=0}^{\infty} \rho^j E(u_{t-j}) = 0$$

$$Var(\varepsilon_t) = E(\varepsilon_t^2) = E[(\rho\varepsilon_{t-1} + u_t)^2] = \rho^2 \underbrace{E(\varepsilon_{t-1}^2)}_{\sigma_\varepsilon^2} + \underbrace{E(u_t^2)}_{\sigma_u^2} + 2\rho \underbrace{E(\varepsilon_{t-1} u_t)}_{0}$$

$$\Rightarrow \sigma_\varepsilon^2 = \rho^2 \sigma_\varepsilon^2 + \sigma_u^2 \quad\Rightarrow\quad \sigma_\varepsilon^2 = \frac{\sigma_u^2}{1-\rho^2}$$
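As a quick numerical check of these moments (a sketch with arbitrary illustrative values of $\rho$ and $\sigma_u$, not part of the original notes), a long simulated AR(1) series has sample mean near zero and sample variance near $\sigma_u^2/(1-\rho^2)$:

```python
import numpy as np

# Check E(eps_t) = 0 and Var(eps_t) = sigma_u^2 / (1 - rho^2) by simulation.
rng = np.random.default_rng(1)
rho, sigma_u, T = 0.6, 2.0, 200_000            # arbitrary illustrative values

u = rng.normal(0.0, sigma_u, T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + u[t]

print("sample mean        :", eps.mean())                     # ~ 0
print("sample variance    :", eps.var())                      # ~ 6.25
print("sigma_u^2/(1-rho^2):", sigma_u**2 / (1.0 - rho**2))     # = 6.25
```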
Autocovariances:

$$E(\varepsilon_t \varepsilon_{t-1}) = E[(\rho\varepsilon_{t-1} + u_t)\varepsilon_{t-1}] = \rho \underbrace{E(\varepsilon_{t-1}^2)}_{\sigma_\varepsilon^2} + \underbrace{E(u_t \varepsilon_{t-1})}_{0} = \rho\sigma_\varepsilon^2$$

$$Cov(\varepsilon_t, \varepsilon_{t-2}) = E(\varepsilon_t \varepsilon_{t-2}) = \rho E[\varepsilon_{t-1}\varepsilon_{t-2}] + \underbrace{E(u_t \varepsilon_{t-2})}_{0} = \rho^2 \sigma_\varepsilon^2$$

and in general

$$E(\varepsilon_t \varepsilon_{t-s}) = \rho^s \sigma_\varepsilon^2$$

Autocorrelations:

$$Corr(\varepsilon_t, \varepsilon_{t-1}) = \frac{Cov(\varepsilon_t, \varepsilon_{t-1})}{\sqrt{Var(\varepsilon_t)Var(\varepsilon_{t-1})}} = \frac{\rho\sigma_\varepsilon^2}{\sigma_\varepsilon^2} = \rho$$

$$\Rightarrow Corr(\varepsilon_t, \varepsilon_{t-2}) = \rho^2, \qquad Corr(\varepsilon_t, \varepsilon_{t-s}) = \rho^s$$

We can use this to construct the matrix $\Omega$ in $E(\varepsilon\varepsilon') = \sigma_\varepsilon^2 \Omega$:

$$\Sigma = \sigma_\varepsilon^2 \Omega = \frac{\sigma_u^2}{1-\rho^2}
\begin{bmatrix}
1 & \rho & \rho^2 & \cdots & \rho^{T-1} \\
\rho & 1 & \rho & \cdots & \rho^{T-2} \\
\rho^2 & \rho & 1 & \cdots & \rho^{T-3} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
\rho^{T-1} & \rho^{T-2} & \rho^{T-3} & \cdots & 1
\end{bmatrix}$$

In summary:

$$\sigma_\varepsilon^2 = \frac{\sigma_u^2}{1-\rho^2}, \qquad Cov(\varepsilon_t, \varepsilon_{t-s}) = \rho^s\sigma_\varepsilon^2, \qquad Corr(\varepsilon_t, \varepsilon_{t-s}) = \rho^s, \qquad s = 1, 2, \dots, T-1$$
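The covariance matrix $\Sigma$ is straightforward to build numerically from $\rho$ and $\sigma_u$. A minimal sketch (illustrative only; the function name is my own, not from the notes):

```python
import numpy as np

def ar1_sigma(rho: float, sigma_u: float, T: int) -> np.ndarray:
    """Sigma = sigma_eps^2 * Omega for AR(1) errors: the (t, s) element is
    sigma_u^2 / (1 - rho^2) * rho^|t - s|."""
    sigma_eps2 = sigma_u**2 / (1.0 - rho**2)
    idx = np.arange(T)
    return sigma_eps2 * rho ** np.abs(idx[:, None] - idx[None, :])

# Example: a 5 x 5 covariance matrix with rho = 0.5, sigma_u = 1
print(ar1_sigma(rho=0.5, sigma_u=1.0, T=5))
```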
III. ESTIMATION UNDER AUTOCORRELATION:

1. Estimation with known ρ:

We can find the GLS estimator:

$$\hat{\beta}_{GLS} = (X'\Omega^{-1}X)^{-1} X'\Omega^{-1}Y$$

Find a matrix $H$ such that $H'H = \Omega^{-1}$. Then the transformed model

$$HY = HX\beta + H\varepsilon$$

meets all classical assumptions. Choose

$$H = \frac{1}{\sqrt{1-\rho^2}}
\begin{bmatrix}
\sqrt{1-\rho^2} & 0 & 0 & \cdots & 0 \\
-\rho & 1 & 0 & \cdots & 0 \\
0 & -\rho & 1 & \cdots & 0 \\
\vdots & & \ddots & \ddots & \vdots \\
0 & 0 & \cdots & -\rho & 1
\end{bmatrix}$$

Then (dropping the common scalar factor $1/\sqrt{1-\rho^2}$, which does not affect the estimator):

$$HY = \begin{bmatrix} \sqrt{1-\rho^2}\, Y_1 \\ Y_2 - \rho Y_1 \\ Y_3 - \rho Y_2 \\ \vdots \\ Y_T - \rho Y_{T-1} \end{bmatrix},
\qquad
HX = \begin{bmatrix}
\sqrt{1-\rho^2}\, X_{11} & \sqrt{1-\rho^2}\, X_{12} & \cdots & \sqrt{1-\rho^2}\, X_{1k} \\
X_{21} - \rho X_{11} & X_{22} - \rho X_{12} & \cdots & X_{2k} - \rho X_{1k} \\
X_{31} - \rho X_{21} & X_{32} - \rho X_{22} & \cdots & X_{3k} - \rho X_{2k} \\
\vdots & \vdots & & \vdots \\
X_{T1} - \rho X_{T-1,1} & X_{T2} - \rho X_{T-1,2} & \cdots & X_{Tk} - \rho X_{T-1,k}
\end{bmatrix}$$

(The first column of $X$, $(X_{11}, X_{21}, \dots, X_{T1})'$, can be a column of ones for the intercept.)

$$H\varepsilon = \begin{bmatrix} \sqrt{1-\rho^2}\,\varepsilon_1 \\ \varepsilon_2 - \rho\varepsilon_1 \\ \varepsilon_3 - \rho\varepsilon_2 \\ \vdots \\ \varepsilon_T - \rho\varepsilon_{T-1} \end{bmatrix}$$
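A minimal numerical sketch of this transformation (assuming $\rho$ is known; the data below are simulated purely for illustration and are not from the notes): running OLS on $(HY, HX)$ reproduces $\hat{\beta}_{GLS}$ from the explicit formula, since $H'H = \Omega^{-1}$.

```python
import numpy as np

def prais_winsten_H(rho: float, T: int) -> np.ndarray:
    """Transformation matrix H with H'H = Omega^{-1} for AR(1) errors."""
    H = np.eye(T)
    H[0, 0] = np.sqrt(1.0 - rho**2)
    for t in range(1, T):
        H[t, t - 1] = -rho
    return H / np.sqrt(1.0 - rho**2)   # scalar factor; it does not affect beta_GLS

# Simulated example data (illustrative only)
rng = np.random.default_rng(2)
T, rho = 50, 0.7
X = np.column_stack([np.ones(T), rng.normal(size=T)])
u = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + u[t]
y = X @ np.array([1.0, 2.0]) + eps

# GLS from the explicit formula (Omega has (t, s) element rho^|t-s|) ...
idx = np.arange(T)
Omega_inv = np.linalg.inv(rho ** np.abs(idx[:, None] - idx[None, :]))
beta_gls = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)

# ... and as OLS on the transformed data HY = HX beta + H eps
H = prais_winsten_H(rho, T)
Xs, ys = H @ X, H @ y
beta_transformed = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)

print(beta_gls)
print(beta_transformed)   # the two coincide up to rounding
```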
So the transformed model is:

$$\text{(i)} \quad \sqrt{1-\rho^2}\, Y_1 = \sum_{j=1}^{k} \beta_j \big(\sqrt{1-\rho^2}\, X_{1j}\big) + \sqrt{1-\rho^2}\,\varepsilon_1$$

$$\text{(ii)} \quad \underbrace{Y_t - \rho Y_{t-1}}_{Y_t^*} = \sum_{j=1}^{k} \beta_j \underbrace{(X_{tj} - \rho X_{t-1,j})}_{X_{tj}^*} + \underbrace{(\varepsilon_t - \rho\varepsilon_{t-1})}_{u_t}, \qquad t = 2, 3, \dots, T$$

This is also called the "autoregressive transformation", "quasi-differencing", or the "rho-transformation".

Note: $\hat{\beta}_{GLS} = (X'\Omega^{-1}X)^{-1} X'\Omega^{-1}Y$.

2. Estimation with unknown ρ:

Using the Cochrane–Orcutt procedure:

(1) Estimate $Y = X\beta + \varepsilon$ by OLS and save the residuals $e_t$.

(2) Use the $e_t$'s to estimate $\rho$ from the regression $e_t = \rho e_{t-1} + u_t$:

$$\hat{\rho} = \frac{\sum_{t=2}^{T} e_t e_{t-1}}{\sum_{t=2}^{T} e_{t-1}^2}$$

(3) Transform the model as in (ii) by quasi-differencing the data and estimate (ii) by OLS. Stopping here gives the (two-step) Cochrane–Orcutt estimator.

(4) Use the $\hat{\beta}_j$ from step 3 to compute new $e_t$'s algebraically from $Y = X\beta + \varepsilon$ again:

$$e_t = Y_t - \sum_{j=1}^{k} \hat{\beta}_j X_{tj}$$

(5) Repeat steps 2–4 until convergence (the $\hat{\rho}$'s at two successive steps differ by less than 0.001). A sketch of this iteration in code follows.
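The sketch below is an illustration of the iterative procedure under the assumptions above, not a definitive implementation; the data-generating values are arbitrary. Classic Cochrane–Orcutt drops the first observation when quasi-differencing.

```python
import numpy as np

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

def cochrane_orcutt(y, X, tol=1e-3, max_iter=100):
    """Iterative Cochrane-Orcutt for AR(1) errors (first observation dropped)."""
    beta = ols(X, y)                                  # step (1): OLS on the original data
    rho_old = 0.0
    for _ in range(max_iter):
        e = y - X @ beta                              # steps (1)/(4): residuals from original data
        rho = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])    # step (2): rho_hat
        ys = y[1:] - rho * y[:-1]                     # step (3): quasi-difference
        Xs = X[1:] - rho * X[:-1]
        beta = ols(Xs, ys)
        if abs(rho - rho_old) < tol:                  # step (5): convergence on rho_hat
            break
        rho_old = rho
    return beta, rho

# Illustrative data with AR(1) errors
rng = np.random.default_rng(3)
T, rho_true = 200, 0.6
X = np.column_stack([np.ones(T), rng.normal(size=T)])
u = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho_true * eps[t - 1] + u[t]
y = X @ np.array([1.0, 2.0]) + eps

beta_hat, rho_hat = cochrane_orcutt(y, X)
print("beta_hat:", beta_hat, "rho_hat:", rho_hat)
```

Note that the constant column is quasi-differenced along with the other regressors (it becomes a column of $1-\hat\rho$), so the fitted $\hat\beta_1$ is still the original intercept and the residuals in step (4) can be computed from the untransformed data.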
Exercise: For the AR(2) process

$$\varepsilon_t = \rho_1 \varepsilon_{t-1} + \rho_2 \varepsilon_{t-2} + u_t$$

where $u_t$ meets the classical assumptions:

• Define the quasi-differencing that eliminates the autocorrelation.
• Spell out the iterative Cochrane–Orcutt procedure for this model.

Cochrane–Orcutt procedure for the AR(2) model:

Quasi-differencing (for $t = 3, 4, \dots, T$):

$$Y_t - \rho_1 Y_{t-1} - \rho_2 Y_{t-2} = \sum_{j=1}^{k} \beta_j (X_{tj} - \rho_1 X_{t-1,j} - \rho_2 X_{t-2,j}) + \underbrace{(\varepsilon_t - \rho_1\varepsilon_{t-1} - \rho_2\varepsilon_{t-2})}_{u_t}$$

Procedure:

(1) Estimate $Y = X\beta + \varepsilon$ by OLS and save the residuals $e_t$.

(2) Estimate $e_t = \rho_1 e_{t-1} + \rho_2 e_{t-2} + u_t$ by OLS to get $\hat{\rho}_1, \hat{\rho}_2$.

(3) Use $\hat{\rho}_1, \hat{\rho}_2$ to quasi-difference the data and estimate $\beta$ by OLS. Stopping here gives the two-step Cochrane–Orcutt estimator.

(4) Use the $\hat{\beta}_j$ from step 3 to compute new $e_t$'s algebraically from $Y = X\hat{\beta} + e$ again.

(5) Repeat steps 2–4 until convergence.
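A compact sketch of the AR(2) quasi-differencing and of the residual regression in step (2) (illustrative helper functions, not from the notes):

```python
import numpy as np

def ar2_quasi_difference(y, X, rho1, rho2):
    """Quasi-difference for AR(2) errors: drops the first two observations and
    returns y*_t = y_t - rho1*y_{t-1} - rho2*y_{t-2} (same for each column of X)."""
    ys = y[2:] - rho1 * y[1:-1] - rho2 * y[:-2]
    Xs = X[2:] - rho1 * X[1:-1] - rho2 * X[:-2]
    return ys, Xs

def estimate_ar2_rhos(e):
    """Step (2): OLS of e_t on e_{t-1} and e_{t-2} (no intercept) -> (rho1_hat, rho2_hat)."""
    E = np.column_stack([e[1:-1], e[:-2]])
    return np.linalg.solve(E.T @ E, E.T @ e[2:])
```

The iteration then proceeds exactly as in the AR(1) case, with the single $\hat\rho$ update replaced by these two helpers.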