
Lecture: Chapter 5: Inference & Prediction


The lecture "Chapter 5: Inference & prediction" focuses on the basic topics of Wald tests; the least squares discrepancy; the restricted least squares estimator; ... Hopefully this document is a useful source of information for your study and research.


Text content: Lecture, Chapter 5: Inference & Prediction

Advanced Econometrics

Chapter 5: INFERENCE & PREDICTION

I. WALD TESTS:

• Nested models: if we can obtain one model from another by imposing restrictions on the parameters, we say that the two models are nested.

• Non-nested models: if neither model can be obtained as a restriction on the parameters of the other, the two models are non-nested.

Example: a Wald test is for choosing between nested models, such as
$$Y_i = \beta_1 + \beta_2 X_{i2} + \beta_3 X_{i3} + \varepsilon_i \quad\text{versus}\quad Y_i = \beta_1 + \beta_2 X_{i2} + \varepsilon_i .$$
An example of non-nested models (where such a test of parameter restrictions does not apply) is
$$Y_i = \beta_1 + \beta_2 X_i + \beta_3 Z_i + \varepsilon_i \quad\text{versus}\quad Y_i = \alpha_1 + \alpha_2 H_i + \alpha_3 M_i + \varepsilon_i .$$

We'll be concerned with (several) possible restrictions on $\beta$ in the usual model
$$Y = X\beta + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I),$$
with $X$ non-stochastic and $\mathrm{rank}(X) = k$.

The general form of $r$ restrictions is
$$\underset{r\times k}{R}\;\underset{k\times 1}{\beta} = \underset{r\times 1}{q},$$
where $R$ and $q$ are known and non-random, and we assume $\mathrm{rank}(R) = r$ (with $r < k$).
For example, with $k = 4$, the restrictions
$$\beta_1 = 0, \qquad \beta_2 + \beta_3 + \beta_4 = 1, \qquad \beta_3 = \beta_4$$
can be written as $R\beta = q$ with
$$R = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 1 & -1 \end{bmatrix}, \qquad q = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},$$
and $r = 3$ is the number of restrictions.

II. LEAST SQUARES DISCREPANCY:

• Suppose we just estimate the model by OLS and obtain
$$\hat\beta_u = (X'X)^{-1}X'Y \qquad (u:\ \text{unrestricted}).$$
Let $m = R\hat\beta_u - q$.

• Consider the sampling distribution of the $r \times 1$ vector $m$ (a linear function of $\hat\beta_u$):
$$E(m) = E(R\hat\beta_u - q) = R\,E(\hat\beta_u) - q = R\beta - q \;\Rightarrow\; E(m) = 0 \ \text{ if } R\beta = q,$$
$$\mathrm{VarCov}(m) = \mathrm{VarCov}(R\hat\beta_u - q) = R\,\mathrm{VarCov}(\hat\beta_u)\,R' = \sigma_\varepsilon^2\, R(X'X)^{-1}R'.$$
So $m \sim N\!\big(0,\ \sigma_\varepsilon^2 R(X'X)^{-1}R'\big)$ if $R\beta = q$.

• From the theorem used to construct Hausman's test: if $m \sim N(0, \Sigma)$ then $m'\{\mathrm{VarCov}(m)\}^{-1}m \sim \chi^2_{[r]}$. So
$$W = [R\hat\beta_u - q]'\,\big[\sigma_\varepsilon^2 R(X'X)^{-1}R'\big]^{-1}\,[R\hat\beta_u - q] \;\sim\; \chi^2_{[r]}$$
under $H_0: R\beta = q$, if $\sigma_\varepsilon^2$ is known.

• Usually we don't know $\sigma_\varepsilon^2$; we can replace it by any consistent estimator $\hat\sigma_\varepsilon^2$ with $\mathrm{plim}\,\hat\sigma_\varepsilon^2 = \sigma_\varepsilon^2$ and still get the same asymptotic $\chi^2_{[r]}$ test distribution.

• Reject $H_0$ if $W >$ critical value (this is an asymptotic test).
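As an illustration, here is a minimal numpy/scipy sketch of this Wald statistic, using simulated data and the three example restrictions above. The simulated design and variable names are illustrative assumptions, not part of the lecture notes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated data whose true coefficients satisfy the restrictions
# beta1 = 0, beta2 + beta3 + beta4 = 1, beta3 = beta4.
n, k = 200, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta_true = np.array([0.0, 0.4, 0.3, 0.3])
y = X @ beta_true + rng.normal(size=n)

# Restrictions R beta = q (as in the example above)
R = np.array([[1., 0., 0., 0.],
              [0., 1., 1., 1.],
              [0., 0., 1., -1.]])
q = np.array([0., 1., 0.])
r = R.shape[0]

# Unrestricted OLS
XtX_inv = np.linalg.inv(X.T @ X)
b_u = XtX_inv @ X.T @ y
e_u = y - X @ b_u
sigma2_hat = e_u @ e_u / (n - k)        # consistent estimator of sigma_eps^2

# Least squares discrepancy m and Wald statistic W
m = R @ b_u - q
W = m @ np.linalg.solve(sigma2_hat * (R @ XtX_inv @ R.T), m)
print("W =", W)
print("chi2(r) 5% critical value =", stats.chi2.ppf(0.95, r))
```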
F-statistic: we can modify the Wald test statistic slightly and get an exact test (not asymptotic) in finite samples. Note that
$$\frac{ESS_u}{\sigma_\varepsilon^2} = \frac{e_u'e_u}{\sigma_\varepsilon^2} = \frac{\varepsilon' M \varepsilon}{\sigma_\varepsilon^2} = \left(\frac{\varepsilon}{\sigma_\varepsilon}\right)' M \left(\frac{\varepsilon}{\sigma_\varepsilon}\right) \;\sim\; \chi^2_{[n-k]},$$
since $\mathrm{rank}(M) = \mathrm{trace}(M) = n - k$. We have
$$F^{\,r}_{n-k} = \frac{[R\hat\beta_u - q]'\,\big[\sigma_\varepsilon^2 R(X'X)^{-1}R'\big]^{-1}\,[R\hat\beta_u - q]\big/ r}{ESS_u \big/ \big(\sigma_\varepsilon^2 (n-k)\big)} = \frac{\chi^2_{[r]}/r}{\chi^2_{[n-k]}/(n-k)} \;\sim\; F(r,\ n-k).$$

• Calculate $F$ and reject $H_0$ in favour of $R\beta \neq q$ if $F > F_{\text{critical}}$.

• We can also prove that
$$[R\hat\beta_u - q]'\,\big[R(X'X)^{-1}R'\big]^{-1}\,[R\hat\beta_u - q] = ESS_R - ESS_U = e_R'e_R - e_u'e_u .$$
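A small sketch of this exact F test written as a reusable function; the function name `wald_F` and its interface are my own hypothetical choices, not from the notes.

```python
import numpy as np
from scipy import stats

def wald_F(y, X, R, q):
    """Exact finite-sample F test of H0: R beta = q in y = X beta + eps."""
    n, k = X.shape
    r = R.shape[0]
    XtX_inv = np.linalg.inv(X.T @ X)
    b_u = XtX_inv @ X.T @ y
    e_u = y - X @ b_u
    m = R @ b_u - q
    # Numerator: [R b_u - q]'[R(X'X)^{-1}R']^{-1}[R b_u - q] / r  (sigma^2 cancels)
    num = m @ np.linalg.solve(R @ XtX_inv @ R.T, m) / r
    den = (e_u @ e_u) / (n - k)
    F = num / den
    p_value = 1 - stats.f.cdf(F, r, n - k)
    return F, p_value
```

Because $\sigma_\varepsilon^2$ cancels between numerator and denominator, no variance estimate is needed; note also that $r\,F$ equals the feasible Wald statistic obtained by plugging $\hat\sigma_\varepsilon^2 = e_u'e_u/(n-k)$ into $W$.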
III. THE RESTRICTED LEAST SQUARES ESTIMATOR:

If we test the validity of certain linear restrictions on $\beta$ and we can't reject them, how might we incorporate them into the estimator?

Problem: minimize $e'e$ subject to $R\hat\beta_R = q$ (R: restriction). The Lagrangian is
$$\wp = [Y - X\hat\beta_R]'[Y - X\hat\beta_R] + 2\lambda'[R\hat\beta_R - q] = Y'Y - 2\hat\beta_R'X'Y + \hat\beta_R'X'X\hat\beta_R + 2[R\hat\beta_R - q]'\lambda .$$
The first-order conditions are
$$\frac{\partial \wp}{\partial \hat\beta_R} = -2X'Y + 2X'X\hat\beta_R + 2R'\lambda = 0 \qquad (i)$$
$$\frac{\partial \wp}{\partial \lambda} = 2[R\hat\beta_R - q] = 0 \qquad (ii)$$
Premultiplying (i) by $R(X'X)^{-1}$:
$$-\underbrace{R(X'X)^{-1}X'Y}_{R\hat\beta_u} + R\underbrace{(X'X)^{-1}X'X}_{I}\hat\beta_R + R(X'X)^{-1}R'\lambda = 0
\;\Rightarrow\; R\hat\beta_u - R\hat\beta_R = R(X'X)^{-1}R'\lambda ,$$
and by (ii) $R\hat\beta_R = q$, so
$$\lambda = \big[R(X'X)^{-1}R'\big]^{-1}[R\hat\beta_u - q].$$
Substituting back into (i) and premultiplying by $(X'X)^{-1}$:
$$\hat\beta_R = \hat\beta_u - (X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}[R\hat\beta_u - q].$$

Theorem:
- The RLS estimator $\hat\beta_R$ is unbiased if $R\beta = q$; otherwise it is biased.
- The covariance matrix of $\hat\beta_R$ is
$$\mathrm{VarCov}(\hat\beta_R) = \sigma_\varepsilon^2 (X'X)^{-1}\big\{I - R'\big[R(X'X)^{-1}R'\big]^{-1}R(X'X)^{-1}\big\}.$$
Proof: (Exercise)

• If the restrictions are valid ($R\beta = q$), the RLS estimator $\hat\beta_R$ is more efficient than the OLS estimator (it has smaller variance).

• If the restrictions are false, then $\hat\beta_R$ is not only biased, it is also inconsistent → it's good to know how to construct a (uniformly most powerful) test of $H_0: R\beta = q$.
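A sketch of the restricted least squares formula and its covariance matrix as just derived; the helper name `restricted_ols` is illustrative, and $\sigma_\varepsilon^2$ is estimated here from the unrestricted residuals (an implementation choice, not stated in the notes).

```python
import numpy as np

def restricted_ols(y, X, R, q):
    """Restricted least squares:
       b_R = b_u - (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (R b_u - q)."""
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    b_u = XtX_inv @ X.T @ y
    RVR_inv = np.linalg.inv(R @ XtX_inv @ R.T)      # [R(X'X)^{-1}R']^{-1}
    b_R = b_u - XtX_inv @ R.T @ RVR_inv @ (R @ b_u - q)

    # VarCov(b_R) = sigma^2 (X'X)^{-1} { I - R'[R(X'X)^{-1}R']^{-1} R (X'X)^{-1} },
    # with sigma^2 estimated from the unrestricted residuals.
    e_u = y - X @ b_u
    sigma2_hat = e_u @ e_u / (n - k)
    V_R = sigma2_hat * XtX_inv @ (np.eye(k) - R.T @ RVR_inv @ R @ XtX_inv)
    return b_R, V_R
```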
Back to the Wald test:
$$(A):\quad F^{\,r}_{n-k} = \frac{[R\hat\beta_u - q]'\,\big[R(X'X)^{-1}R'\big]^{-1}\,[R\hat\beta_u - q]\big/ r}{e_u'e_u \big/ (n-k)},$$
$$(B):\quad \hat\beta_R - \hat\beta_u = -(X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}[R\hat\beta_u - q].$$
The restricted residuals are
$$e_R = Y - X\hat\beta_R = \underbrace{Y - X\hat\beta_u}_{e_u} + X\hat\beta_u - X\hat\beta_R = e_u + X(\hat\beta_u - \hat\beta_R),$$
so
$$e_R'e_R = e_u'e_u + (\hat\beta_R - \hat\beta_u)'X'X(\hat\beta_R - \hat\beta_u) \qquad (\text{since } e_u'X = 0).$$
Using (B):
$$e_R'e_R - e_u'e_u = [R\hat\beta_u - q]'\big[R(X'X)^{-1}R'\big]^{-1}R(X'X)^{-1}(X'X)(X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}[R\hat\beta_u - q]$$
$$= [R\hat\beta_u - q]'\big[R(X'X)^{-1}R'\big]^{-1}[R\hat\beta_u - q].$$
Putting this into (A):
$$F^{\,r}_{n-k} = \frac{(e_R'e_R - e_u'e_u)/r}{e_u'e_u/(n-k)} \;\sim\; F(r,\ n-k)$$
(same as the original F-statistic). Also,
$$\frac{e_R'e_R - e_u'e_u}{\sigma_\varepsilon^2} \;\sim\; \chi^2_{[r]} \quad\Rightarrow\quad r\,F^{\,r}_{n-k} \;\overset{a}{\sim}\; \chi^2_{[r]},$$
the same as the original $\chi^2_{[r]}$ statistic.
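A quick numeric check (under the same illustrative simulated design used earlier) that $e_R'e_R - e_u'e_u$ equals the quadratic form in $R\hat\beta_u - q$, so the two ways of computing the F numerator agree.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = X @ np.array([0.0, 0.4, 0.3, 0.3]) + rng.normal(size=n)

R = np.array([[1., 0., 0., 0.],
              [0., 1., 1., 1.],
              [0., 0., 1., -1.]])
q = np.array([0., 1., 0.])
r = R.shape[0]

XtX_inv = np.linalg.inv(X.T @ X)
b_u = XtX_inv @ X.T @ y
b_R = b_u - XtX_inv @ R.T @ np.linalg.solve(R @ XtX_inv @ R.T, R @ b_u - q)

e_u = y - X @ b_u
e_R = y - X @ b_R
m = R @ b_u - q

lhs = e_R @ e_R - e_u @ e_u                      # ESS_R - ESS_u
rhs = m @ np.linalg.solve(R @ XtX_inv @ R.T, m)  # quadratic form in (R b_u - q)
print(np.isclose(lhs, rhs))                      # True: both F numerators coincide

F = (lhs / r) / (e_u @ e_u / (n - k))
print("F =", F)
```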
IV. NON-LINEAR RESTRICTIONS:

All of the discussion and results so far relate to linear restrictions $R\beta = q$. What about non-linear restrictions
$$\underset{r\times 1}{C(\beta)} = q,$$
where $C$ is a (vector-valued) function? Recall the Taylor series expansion
$$f(x) = f(x_0) + (x - x_0)f'(x_0) + \frac{1}{2!}(x - x_0)^2 f''(x_0) + \dots$$
In our vector case,
$$C(\hat\beta) = C(\beta) + \left[\frac{\partial C(\beta)}{\partial \beta}\right]'(\hat\beta - \beta) + \dots$$
where $\hat\beta$ is some (consistent) estimator of $\beta$. Hence
$$V\big[C(\hat\beta)\big] = \left[\frac{\partial C(\beta)}{\partial \beta}\right]' V(\hat\beta - \beta) \left[\frac{\partial C(\beta)}{\partial \beta}\right] = \left[\frac{\partial C(\beta)}{\partial \beta}\right]' V(\hat\beta) \left[\frac{\partial C(\beta)}{\partial \beta}\right],$$
so we can form a Wald test statistic
$$W = (\hat c - q)'\{\text{est. asy. } \mathrm{cov}(\hat c)\}^{-1}(\hat c - q) = (\hat c - q)'\left\{\left[\frac{\partial C(\beta)}{\partial \beta}\right]'_{\hat\beta} \hat V(\hat\beta) \left[\frac{\partial C(\beta)}{\partial \beta}\right]_{\hat\beta}\right\}^{-1}(\hat c - q),$$
where $\hat c = C(\hat\beta)$, and $W \xrightarrow{d} \chi^2_{[r]}$ provided a consistent estimator of $\beta$ and a consistent $\hat V(\hat\beta)$ are used.

Warning: the value of $W$ is not invariant to the way the non-linear restrictions are written, e.g. $\beta_1/\beta_2 = \beta_3$ versus $\beta_1 = \beta_2\beta_3$, so it is possible to get conflicting results → the Wald test has this weakness when we have non-linear restrictions. Of course, in the non-linear case there is no exact, finite-sample test → the F-test does not apply.
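A sketch of this delta-method Wald statistic for a single non-linear restriction, with the same hypothesis written in the two forms mentioned in the warning; the data-generating values and helper names are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 3))                 # columns correspond to beta1, beta2, beta3
beta = np.array([0.6, 2.0, 0.3])            # satisfies beta1 / beta2 = beta3
y = X @ beta + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
e = y - X @ b
V_b = (e @ e / (n - X.shape[1])) * XtX_inv  # estimated VarCov of the OLS estimator

def wald_nonlinear(c, grad, b, V_b):
    """Delta-method Wald statistic W = c(b)' [G V(b) G']^{-1} c(b), G = dc/db' at b."""
    cb = np.atleast_1d(c(b))
    G = np.atleast_2d(grad(b))
    W = cb @ np.linalg.solve(G @ V_b @ G.T, cb)
    return W, 1 - stats.chi2.cdf(W, df=len(cb))

# The same hypothesis written two ways: the Wald values generally differ in finite samples.
W1, p1 = wald_nonlinear(lambda b: b[0] / b[1] - b[2],
                        lambda b: np.array([1 / b[1], -b[0] / b[1] ** 2, -1.0]), b, V_b)
W2, p2 = wald_nonlinear(lambda b: b[0] - b[1] * b[2],
                        lambda b: np.array([1.0, -b[2], -b[1]]), b, V_b)
print(W1, W2)
```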