
E-book: Applied Time Series Analysis with R

Wayne A. Woodward (Southern Methodist University, Dallas, Texas, USA), Henry L. Gray (Southern Methodist University, Dallas, Texas, USA), Alan C. Elliott (University of Texas Southwestern Medical Center at Dallas, USA)
  • Pages: 634
  • Publication date: 17-Feb-2017
  • Publisher: CRC Press Inc
  • Language: English
  • ISBN-13: 9781498734318
  • Format: EPUB+DRM
  • Price: 57,60 €*
  • * This is the final price, i.e., no further discounts are applied.
  • This e-book is intended for personal use only. E-books cannot be returned, and payments for purchased e-books are not refunded.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed
  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you need to install free software to unlock and read it. To read this e-book you must create an Adobe ID. The e-book can be read and downloaded on up to 6 devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you need to install this free app: PocketBook Reader (iOS / Android).

    To download and read this e-book on a PC or Mac, you need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

Virtually any random process developing chronologically can be viewed as a time series. In economics, closing stock prices, the cost of money, the jobless rate, and retail sales are just a few of many examples. Developed from course notes and extensively classroom-tested, Applied Time Series Analysis with R, Second Edition includes examples from a variety of fields, develops theory, and provides an R-based software package to aid in addressing time series problems in a broad spectrum of disciplines. The material is organized in an optimal format for graduate students in statistics, as well as in the natural and social sciences, to learn to use and understand the tools of applied time series analysis.

Features

  • Gives readers the ability to solve significant real-world problems
  • Addresses many types of nonstationary time series and cutting-edge methodologies
  • Promotes understanding of the data and the associated models rather than treating them as the output of a "black box"
  • Provides the R package tswge, available on CRAN, which contains functions and over 100 real and simulated data sets to accompany the book; extensive help on using the tswge functions is provided in appendices and on an associated website (see the getting-started sketch after this list)
  • Over 150 exercises and extensive support for instructors
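
Since the book's examples lean on tswge, here is a minimal getting-started sketch, assuming the current CRAN release of the package (only base-R calls are used):

    # Install tswge from CRAN (one-time step), then load it.
    install.packages("tswge")
    library(tswge)

    # List the data sets bundled with the package; the book draws on
    # over 100 real and simulated series distributed this way.
    data(package = "tswge")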

The second edition includes additional real-data examples and uses R-based code that helps students easily analyze data, generate realizations from models, and explore their characteristics. It also adds discussion of new advances in the analysis of long-memory data and data with time-varying frequencies (TVF).
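
As an illustration of that workflow, the sketch below generates a realization from a stationary AR(2) model and plots its sample characteristics. The function names gen.arma.wge and plotts.sample.wge and their arguments are taken from the tswge documentation as recalled here, so verify them against the package help pages:

    library(tswge)

    # Generate a realization of length n = 200 from the AR(2) model
    # (1 - 1.6B + 0.9B^2) X_t = a_t; sn fixes the seed for reproducibility.
    x <- gen.arma.wge(n = 200, phi = c(1.6, -0.9), sn = 42)

    # Plot the realization together with its sample autocorrelations and
    # spectral density estimate to explore the model's characteristics.
    plotts.sample.wge(x)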

Reviews

"What an extraordinary range of topics this book covers, all very insightfully. I like [the authors'] innovations very much, including the AR factor table." David Findley, Senior Mathematical Statistician, US Census Bureau (retired)

" impressive coverage of the scope of time series analysis in both frequency and time domain I commend the authors for having included a number of topics on nonstationary processes (e.g., time-varying spectrum, wavelets), ...an excellent textbook Hernando Ombao, Journal of the American Statistical Association

". . . the book is a good introductory or reference text for practitioners or those new to time series analysis. The chapters are easy to read, and the distinction between applied and theoretical examples throughout helps to cement knowledge for these two distinct groups." Rebecca Killick, Mathematics & Statistics Department, Lancaster University

" . . . this book has much to recommend it for that audience. Coverage is quite thorough and up to date. There is an emphasis on the selection and evaluation of models which is very welcome, and not always found in statistics textbooks directed at non-statisticians." Robert W. Hayden, Mathematical Association of America

"I find the structure of the book very convincing: First, the more basic models are spelled out, second, the forecasting purpose is dealt with, third, estimation and related inferential issues are covered, before an extension (to the multivariate case and more demanding models) is tackled. Each chapter concludes with an exercise section, typically containing theoretical problems as well as applied problems, where the latter build on R; moreover, R commands are explained in separate sections. Further, the book contains over 100 examples." Uwe Hassler, Stat Papers What an extraordinary range of topics this book covers, all very insightfully. I like [ the authors] innovations very much, including the AR factor table. David Findley, Senior Mathematical Statistician, US Census Bureau (retired)

" impressive coverage of the scope of time series analysis in both frequency and time domain I commend the authors for having included a number of topics on nonstationary processes (e.g., time-varying spectrum, wavelets), ...an excellent textbook Hernando Ombao, Journal of the American Statistical Association

". . . the book is a good introductory or reference text for practitioners or those new to time series analysis. The chapters are easy to read, and the distinction between applied and theoretical examples throughout helps to cement knowledge for these two distinct groups." Rebecca Killick, Mathematics & Statistics Department, Lancaster University

" . . . this book has much to recommend it for that audience. Coverage is quite thorough and up to date. There is an emphasis on the selection and evaluation of models which is very welcome, and not always found in statistics textbooks directed at non-statisticians." Robert W. Hayden, Mathematical Association of America

"I find the structure of the book very convincing: First, the more basic models are spelled out, second, the forecasting purpose is dealt with, third, estimation and related inferential issues are covered, before an extension (to the multivariate case and more demanding models) is tackled. Each chapter concludes with an exercise section, typically containing theoretical problems as well as applied problems, where the latter build on R; moreover, R commands are explained in separate sections. Further, the book contains over 100 examples." Uwe Hassler, Stat Papers

Table of Contents

Preface for Second Edition
Acknowledgments
1 Stationary Time Series
1.1 Time Series
1.2 Stationary Time Series
1.3 Autocovariance and Autocorrelation Functions for Stationary Time Series
1.4 Estimation of the Mean, Autocovariance, and Autocorrelation for Stationary Time Series
1.4.1 Estimation of μ
1.4.1.1 Ergodicity of X̄
1.4.1.2 Variance of X̄
1.4.2 Estimation of γk
1.4.3 Estimation of ρk
1.5 Power Spectrum
1.6 Estimating the Power Spectrum and Spectral Density for Discrete Time Series
1.7 Time Series Examples
1.7.1 Simulated Data
1.7.2 Real Data
Appendix 1A Fourier Series
Appendix 1B R Commands
Exercises
2 Linear Filters
2.1 Introduction to Linear Filters
2.1.1 Relationship between the Spectra of the Input and Output of a Linear Filter
2.2 Stationary General Linear Processes
2.2.1 Spectrum and Spectral Density for a General Linear Process
2.3 Wold Decomposition Theorem
2.4 Filtering Applications
2.4.1 Butterworth Filters
Appendix 2A Theorem Proofs
Appendix 2B R Commands
Exercises
3 ARMA Time Series Models
3.1 MA Processes
3.1.1 MA(1) Model
3.1.2 MA(2) Model
3.2 AR Processes
3.2.1 Inverting the Operator
3.2.2 AR(1) Model
3.2.3 AR(p) Model for p ≥ 1
3.2.4 Autocorrelations of an AR(p) Model
3.2.5 Linear Difference Equations
3.2.6 Spectral Density of an AR(p) Model
3.2.7 AR(2) Model
3.2.7.1 Autocorrelations of an AR(2) Model
3.2.7.2 Spectral Density of an AR(2)
3.2.7.3 Stationary/Causal Region of an AR(2)
3.2.7.4 ψ-Weights of an AR(2) Model
3.2.8 Summary of AR(1) and AR(2) Behavior
3.2.9 AR(p) Model
3.2.10 AR(1) and AR(2) Building Blocks of an AR(p) Model
3.2.11 Factor Tables
3.2.12 Invertibility/Infinite-Order AR Processes
3.2.13 Two Reasons for Imposing Invertibility
3.3 ARMA Processes
3.3.1 Stationarity and Invertibility Conditions for an ARMA(p,q) Model
3.3.2 Spectral Density of an ARMA(p,q) Model
3.3.3 Factor Tables and ARMA(p,q) Models
3.3.4 Autocorrelations of an ARMA(p,q) Model
3.3.5 ψ-Weights of an ARMA(p,q)
3.3.6 Approximating ARMA(p,q) Processes Using High-Order AR(p) Models
3.4 Visualizing AR Components
3.5 Seasonal ARMA(p,q) × (Ps,Qs)s Models
3.6 Generating Realizations from ARMA(p,q) Processes
3.6.1 MA(q) Model
3.6.2 AR(2) Model
3.6.3 General Procedure
3.7 Transformations
3.7.1 Memoryless Transformations
3.7.2 AR Transformations
Appendix 3A Proofs of Theorems
Appendix 3B R Commands
Exercises
4 Other Stationary Time Series Models
4.1 Stationary Harmonic Models
4.1.1 Pure Harmonic Models
4.1.2 Harmonic Signal-Plus-Noise Models
4.1.3 ARMA Approximation to the Harmonic Signal-Plus-Noise Model
4.2 ARCH and GARCH Processes
4.2.1 ARCH Processes
4.2.1.1 The ARCH(1) Model
4.2.1.2 The ARCH(q0) Model
4.2.2 The GARCH(p0,q0) Process
4.2.3 AR Processes with ARCH or GARCH Noise
Appendix 4A R Commands
Exercises
5 Nonstationary Time Series Models
5.1 Deterministic Signal-Plus-Noise Models
5.1.1 Trend-Component Models
5.1.2 Harmonic Component Models
5.2 ARIMA(p,d,q) and ARUMA(p,d,q) Processes
5.2.1 Extended Autocorrelations of an ARUMA(p,d,q) Process
5.2.2 Cyclical Models
5.3 Multiplicative Seasonal ARUMA(p,d,q) × (Ps,Ds,Qs)s Process
5.3.1 Factor Tables for Seasonal Models of the Form of Equation 5.17 with s = 4 and s = 12
5.4 Random Walk Models
5.4.1 Random Walk
5.4.2 Random Walk with Drift
5.5 G-Stationary Models for Data with Time-Varying Frequencies
Appendix 5A R Commands
Exercises
6 Forecasting
6.1 Mean-Square Prediction Background
6.2 Box-Jenkins Forecasting for ARMA(p,q) Models
6.2.1 General Linear Process Form of the Best Forecast Equation
6.3 Properties of the Best Forecast X̂t0(ℓ)
6.4 π-Weight Form of the Forecast Function
6.5 Forecasting Based on the Difference Equation
6.5.1 Difference Equation Form of the Best Forecast Equation
6.5.2 Basic Difference Equation Form for Calculating Forecasts from an ARMA(p,q) Model
6.6 Eventual Forecast Function
6.7 Assessing Forecast Performance
6.7.1 Probability Limits for Forecasts
6.7.2 Forecasting the Last k Values
6.8 Forecasts Using ARUMA(p,d,q) Models
6.9 Forecasts Using Multiplicative Seasonal ARUMA Models
6.10 Forecasts Based on Signal-Plus-Noise Models
Appendix 6A Proof of Projection Theorem
Appendix 6B Basic Forecasting Routines
Exercises
7 Parameter Estimation
7.1 Introduction
7.2 Preliminary Estimates
7.2.1 Preliminary Estimates for AR(p) Models
7.2.1.1 Yule-Walker Estimates
7.2.1.2 Least Squares Estimation
7.2.1.3 Burg Estimates
7.2.2 Preliminary Estimates for MA(q) Models
7.2.2.1 MM Estimation for an MA(q)
7.2.2.2 MA(q) Estimation Using the Innovations Algorithm
7.2.3 Preliminary Estimates for ARMA(p,q) Models
7.2.3.1 Extended Yule-Walker Estimates of the AR Parameters
7.2.3.2 Tsay-Tiao Estimates of the AR Parameters
7.2.3.3 Estimating the MA Parameters
7.3 ML Estimation of ARMA(p,q) Parameters
7.3.1 Conditional and Unconditional ML Estimation
7.3.2 ML Estimation Using the Innovations Algorithm
7.4 Backcasting and Estimating σa²
7.5 Asymptotic Properties of Estimators
7.5.1 AR Case
7.5.1.1 Confidence Intervals: AR Case
7.5.2 ARMA(p,q) Case
7.5.2.1 Confidence Intervals for ARMA(p,q) Parameters
7.5.3 Asymptotic Comparisons of Estimators for an MA(1)
7.6 Estimation Examples Using Data
7.7 ARMA Spectral Estimation
7.8 ARUMA Spectral Estimation
Appendix
Exercises
8 Model Identification
8.1 Preliminary Check for White Noise
8.2 Model Identification for Stationary ARMA Models
8.2.1 Model Identification Based on AIC and Related Measures
8.3 Model Identification for Nonstationary ARUMA(p,d,q) Models
8.3.1 Including a Nonstationary Factor in the Model
8.3.2 Identifying Nonstationary Component(s) in a Model
8.3.3 Decision Between a Stationary or a Nonstationary Model
8.3.4 Deriving a Final ARUMA Model
8.3.5 More on the Identification of Nonstationary Components
8.3.5.1 Including a Factor (1 − B)^d in the Model
8.3.5.2 Testing for a Unit Root
8.3.5.3 Including a Seasonal Factor (1 − B^s) in the Model
Appendix 8A Model Identification Based on Pattern Recognition
Appendix 8B Model Identification Functions in tswge
Exercises
9 Model Building
9.1 Residual Analysis
9.1.1 Check Sample Autocorrelations of Residuals versus 95% Limit Lines
9.1.2 Ljung-Box Test
9.1.3 Other Tests for Randomness
9.1.4 Testing Residuals for Normality
9.2 Stationarity versus Nonstationarity
9.3 Signal-Plus-Noise versus Purely Autocorrelation-Driven Models
9.3.1 Cochrane-Orcutt and Other Methods
9.3.2 A Bootstrapping Approach
9.3.3 Other Methods for Trend Testing
9.4 Checking Realization Characteristics
9.5 Comprehensive Analysis of Time Series Data: A Summary
Appendix 9A R Commands
Exercises
10 Vector-Valued (Multivariate) Time Series
10.1 Multivariate Time Series Basics
10.2 Stationary Multivariate Time Series
10.2.1 Estimating the Mean and Covariance for Stationary Multivariate Processes
10.2.1.1 Estimating μ
10.2.1.2 Estimating Γ(k)
10.3 Multivariate (Vector) ARMA Processes
10.3.1 Forecasting Using VAR(p) Models
10.3.2 Spectrum of a VAR(p) Model
10.3.3 Estimating the Coefficients of a VAR(p) Model
10.3.3.1 Yule-Walker Estimation
10.3.3.2 Least Squares and Conditional ML Estimation
10.3.3.3 Burg-Type Estimation
10.3.4 Calculating the Residuals and Estimating Γa
10.3.5 VAR(p) Spectral Density Estimation
10.3.6 Fitting a VAR(p) Model to Data
10.3.6.1 Model Selection
10.3.6.2 Estimating the Parameters
10.3.6.3 Testing the Residuals for White Noise
10.4 Nonstationary VARMA Processes
10.5 Testing for Association between Time Series
10.5.1 Testing for Independence of Two Stationary Time Series
10.5.2 Testing for Cointegration between Nonstationary Time Series
10.6 State-Space Models
10.6.1 State Equation
10.6.2 Observation Equation
10.6.3 Goals of State-Space Modeling
10.6.4 Kalman Filter
10.6.4.1 Prediction (Forecasting)
10.6.4.2 Filtering
10.6.4.3 Smoothing Using the Kalman Filter
10.6.4.4 h-Step Ahead Predictions
10.6.5 Kalman Filter and Missing Data
10.6.6 Parameter Estimation
10.6.7 Using State-Space Methods to Find Additive Components of a Univariate AR Realization
10.6.7.1 Revised State-Space Model
10.6.7.2 Ψj Real
10.6.7.3 Ψj Complex
Appendix 10A Derivation of State-Space Results
Appendix 10B Basic Kalman Filtering Routines
Exercises
11 Long-Memory Processes
11.1 Long Memory
11.2 Fractional Difference and FARMA Processes
11.3 Gegenbauer and GARMA Processes
11.3.1 Gegenbauer Polynomials
11.3.2 Gegenbauer Process
11.3.3 GARMA Process
11.4 k-Factor Gegenbauer and GARMA Processes
11.4.1 Calculating Autocovariances
11.4.2 Generating Realizations
11.5 Parameter Estimation and Model Identification
11.6 Forecasting Based on the k-Factor GARMA Model
11.7 Testing for Long Memory
11.7.1 Testing for Long Memory in the Fractional and FARMA Setting
11.7.2 Testing for Long Memory in the Gegenbauer Setting
11.8 Modeling Atmospheric CO2 Data Using Long-Memory Models
Appendix 11A R Commands
Exercises
12 Wavelets
12.1 Shortcomings of Traditional Spectral Analysis for TVF Data
12.2 Window-Based Methods that Localize the "Spectrum" in Time
12.2.1 Gabor Spectrogram
12.2.2 Wigner-Ville Spectrum
12.3 Wavelet Analysis
12.3.1 Fourier Series Background
12.3.2 Wavelet Analysis Introduction
12.3.3 Fundamental Wavelet Approximation Result
12.3.4 Discrete Wavelet Transform for Data Sets of Finite Length
12.3.5 Pyramid Algorithm
12.3.6 Multiresolution Analysis
12.3.7 Wavelet Shrinkage
12.3.8 Scalogram: Time-Scale Plot
12.3.9 Wavelet Packets
12.3.10 Two-Dimensional Wavelets
12.4 Concluding Remarks on Wavelets
Appendix 12A Mathematical Preliminaries for This Chapter
Appendix 12B Mathematical Preliminaries
Exercises
13 G-Stationary Processes
13.1 Generalized-Stationary Processes
13.1.1 General Strategy for Analyzing G-Stationary Processes
13.2 M-Stationary Processes
13.2.1 Continuous M-Stationary Process
13.2.2 Discrete M-Stationary Process
13.2.3 Discrete Euler(p) Model
13.2.4 Time Transformation and Sampling
13.3 G(λ)-Stationary Processes
13.3.1 Continuous G(p; λ) Model
13.3.2 Sampling the Continuous G(λ)-Stationary Processes
13.3.2.1 Equally Spaced Sampling from G(p; λ) Processes
13.3.3 Analyzing TVF Data Using the G(p; λ) Model
13.3.3.1 G(p; λ) Spectral Density
13.4 Linear Chirp Processes
13.4.1 Models for Generalized Linear Chirps
13.5 G-Filtering
13.6 Concluding Remarks
Appendix 13A G-Stationary Basics
Appendix 13B R Commands
Exercises
References
Index

Wayne A. Woodward is a professor and chair of the Department of Statistical Science at Southern Methodist University in Dallas, Texas. Henry L. Gray is a C.F. Frensley Professor Emeritus in the Department of Statistical Science at Southern Methodist University in Dallas, Texas. Alan C. Elliott is a biostatistician in the Department of Statistical Science at Southern Methodist University in Dallas, Texas.