
E-book: Introduction to Linear Regression Analysis

4.15/5 (110 ratings by Goodreads)
Elizabeth A. Peck (The Coca-Cola Company), G. Geoffrey Vining (Virginia Polytechnic Institute and State University), Douglas C. Montgomery (Arizona State University)
  • Format: EPUB+DRM
  • Price: €148.71*
  • * This is the final price, i.e., no additional discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned, and payments for purchased e-books are not refunded.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you need to install free software to unlock and read it. To read this e-book you must create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

Praise for the Fourth Edition: "As with previous editions, the authors have produced a leading textbook on regression." (Journal of the American Statistical Association)

A comprehensive and up-to-date introduction to the fundamentals of regression analysis

Introduction to Linear Regression Analysis, Fifth Edition continues to present both the conventional and less common uses of linear regression in today's cutting-edge scientific research. The authors blend theory and application to equip readers with an understanding of the basic principles needed to apply regression model-building techniques in various fields of study, including engineering, management, and the health sciences.

Following a general introduction to regression modeling, including typical applications, the book outlines a host of technical tools, such as basic inference procedures, introductory aspects of model adequacy checking, and polynomial regression models and their variations. It then discusses how transformations and weighted least squares can be used to resolve problems of model inadequacy, as well as how to deal with influential observations. The Fifth Edition features numerous newly added topics, including:

  • A chapter on regression analysis of time series data that presents the Durbin-Watson test and other techniques for detecting autocorrelation, as well as parameter estimation in time series regression models (a brief R sketch follows this list)
  • Regression models with random effects, in addition to a discussion on subsampling and the importance of the mixed model
  • Tests on individual regression coefficients and subsets of coefficients
  • Examples of current uses of simple linear regression models and the use of multiple regression models for understanding patient satisfaction data
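
To make the first item above concrete, here is a minimal R sketch of detecting residual autocorrelation with the Durbin-Watson test. It assumes the lmtest package is installed; the sales data frame and its columns are invented for illustration and do not come from the book.

    # Fit a small time-ordered regression, then test its residuals for
    # first-order autocorrelation (H0: no autocorrelation).
    library(lmtest)

    sales <- data.frame(
      revenue = c(20.9, 21.4, 21.9, 21.5, 22.3, 22.8, 23.4, 24.0),
      adverts = c(5.1, 5.3, 5.5, 5.4, 5.8, 6.0, 6.2, 6.5)
    )

    fit <- lm(revenue ~ adverts, data = sales)
    dwtest(fit)   # prints the Durbin-Watson statistic and its p-value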

In addition to Minitab, SAS, and S-PLUS, the authors have incorporated JMP and the freely available R software to illustrate the discussed techniques and procedures in this new edition. Numerous exercises have been added throughout, allowing readers to test their understanding of the material.
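
As a flavor of that R usage, the following minimal sketch, on simulated data, fits a simple linear regression by ordinary least squares and then refits it by weighted least squares, the remedy for nonconstant variance mentioned above; only base R is required, and all names are hypothetical.

    # Simulate data whose error standard deviation grows with x,
    # so that ordinary least squares is inefficient.
    set.seed(1)
    x <- 1:10
    d <- data.frame(x = x, y = 2 + 0.5 * x + rnorm(10, sd = 0.2 * x))

    # Ordinary least squares: y = beta0 + beta1 * x + error
    fit_ols <- lm(y ~ x, data = d)
    summary(fit_ols)    # coefficient estimates, t tests, R-squared

    # Weighted least squares: weight each observation by 1 / x^2,
    # the reciprocal of its (assumed) error variance
    fit_wls <- lm(y ~ x, data = d, weights = 1 / x^2)
    summary(fit_wls)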

Introduction to Linear Regression Analysis, Fifth Edition is an excellent book for statistics and engineering courses on regression at the upper-undergraduate and graduate levels. The book also serves as a valuable, robust resource for professionals in the fields of engineering, life and biological sciences, and the social sciences.

Reviews

The book can be used for statistics and engineering courses on regression at the upper-undergraduate and graduate levels. It also serves as a resource for professionals in the fields of engineering, life and biological sciences, and the social sciences. (Zentralblatt MATH, 1 October 2013)

Contents

Preface
1 Introduction
1.1 Regression and Model Building
1.2 Data Collection
1.3 Uses of Regression
1.4 Role of the Computer
2 Simple Linear Regression
2.1 Simple Linear Regression Model
2.2 Least-Squares Estimation of the Parameters
2.2.1 Estimation of β₀ and β₁
2.2.2 Properties of the Least-Squares Estimators and the Fitted Regression Model
2.2.3 Estimation of σ²
2.2.4 Alternate Form of the Model
2.3 Hypothesis Testing on the Slope and Intercept
2.3.1 Use of t Tests
2.3.2 Testing Significance of Regression
2.3.3 Analysis of Variance
2.4 Interval Estimation in Simple Linear Regression
2.4.1 Confidence Intervals on β₀, β₁, and σ²
2.4.2 Interval Estimation of the Mean Response
2.5 Prediction of New Observations
2.6 Coefficient of Determination
2.7 A Service Industry Application of Regression
2.8 Using SAS® and R for Simple Linear Regression
2.9 Some Considerations in the Use of Regression
2.10 Regression Through the Origin
2.11 Estimation by Maximum Likelihood
2.12 Case Where the Regressor x Is Random
2.12.1 x and y Jointly Distributed
2.12.2 x and y Jointly Normally Distributed: Correlation Model
Problems
3 Multiple Linear Regression
3.1 Multiple Regression Models
3.2 Estimation of the Model Parameters
3.2.1 Least-Squares Estimation of the Regression Coefficients
3.2.2 Geometrical Interpretation of Least Squares
3.2.3 Properties of the Least-Squares Estimators
3.2.4 Estimation of σ²
3.2.5 Inadequacy of Scatter Diagrams in Multiple Regression
3.2.6 Maximum-Likelihood Estimation
3.3 Hypothesis Testing in Multiple Linear Regression
3.3.1 Test for Significance of Regression
3.3.2 Tests on Individual Regression Coefficients and Subsets of Coefficients
3.3.3 Special Case of Orthogonal Columns in X
3.3.4 Testing the General Linear Hypothesis
3.4 Confidence Intervals in Multiple Regression
3.4.1 Confidence Intervals on the Regression Coefficients
3.4.2 CI Estimation of the Mean Response
3.4.3 Simultaneous Confidence Intervals on Regression Coefficients
3.5 Prediction of New Observations
3.6 A Multiple Regression Model for the Patient Satisfaction Data
3.7 Using SAS and R for Basic Multiple Linear Regression
3.8 Hidden Extrapolation in Multiple Regression
3.9 Standardized Regression Coefficients
3.10 Multicollinearity
3.11 Why Do Regression Coefficients Have the Wrong Sign?
Problems
4 Model Adequacy Checking
4.1 Introduction
4.2 Residual Analysis
4.2.1 Definition of Residuals
4.2.2 Methods for Scaling Residuals
4.2.3 Residual Plots
4.2.4 Partial Regression and Partial Residual Plots
4.2.5 Using Minitab®, SAS, and R for Residual Analysis
4.2.6 Other Residual Plotting and Analysis Methods
4.3 PRESS Statistic
4.4 Detection and Treatment of Outliers
4.5 Lack of Fit of the Regression Model
4.5.1 Formal Test for Lack of Fit
4.5.2 Estimation of Pure Error from Near Neighbors
Problems
5 Transformations and Weighting to Correct Model Inadequacies
5.1 Introduction
5.2 Variance-Stabilizing Transformations
5.3 Transformations to Linearize the Model
5.4 Analytical Methods for Selecting a Transformation
5.4.1 Transformations on y: The Box-Cox Method
5.4.2 Transformations on the Regressor Variables
5.5 Generalized and Weighted Least Squares
5.5.1 Generalized Least Squares
5.5.2 Weighted Least Squares
5.5.3 Some Practical Issues
5.6 Regression Models with Random Effects
5.6.1 Subsampling
5.6.2 The General Situation for a Regression Model with a Single Random Effect
5.6.3 The Importance of the Mixed Model in Regression
Problems
6 Diagnostics for Leverage and Influence
6.1 Importance of Detecting Influential Observations
6.2 Leverage
6.3 Measures of Influence: Cook's D
6.4 Measures of Influence: DFFITS and DFBETAS
6.5 A Measure of Model Performance
6.6 Detecting Groups of Influential Observations
6.7 Treatment of Influential Observations
Problems
7 Polynomial Regression Models
7.1 Introduction
7.2 Polynomial Models in One Variable
7.2.1 Basic Principles
7.2.2 Piecewise Polynomial Fitting (Splines)
7.2.3 Polynomial and Trigonometric Terms
7.3 Nonparametric Regression
7.3.1 Kernel Regression
7.3.2 Locally Weighted Regression (Loess)
7.3.3 Final Cautions
7.4 Polynomial Models in Two or More Variables
7.5 Orthogonal Polynomials
Problems
8 Indicator Variables
8.1 General Concept of Indicator Variables
8.2 Comments on the Use of Indicator Variables
8.2.1 Indicator Variables versus Regression on Allocated Codes
8.2.2 Indicator Variables as a Substitute for a Quantitative Regressor
8.3 Regression Approach to Analysis of Variance
Problems
9 Multicollinearity
9.1 Introduction
9.2 Sources of Multicollinearity
9.3 Effects of Multicollinearity
9.4 Multicollinearity Diagnostics
9.4.1 Examination of the Correlation Matrix
9.4.2 Variance Inflation Factors
9.4.3 Eigensystem Analysis of X'X
9.4.4 Other Diagnostics
9.4.5 SAS and R Code for Generating Multicollinearity Diagnostics
9.5 Methods for Dealing with Multicollinearity
9.5.1 Collecting Additional Data
9.5.2 Model Respecification
9.5.3 Ridge Regression
9.5.4 Principal-Component Regression
9.5.5 Comparison and Evaluation of Biased Estimators
9.6 Using SAS to Perform Ridge and Principal-Component Regression
Problems
10 Variable Selection and Model Building
10.1 Introduction
10.1.1 Model-Building Problem
10.1.2 Consequences of Model Misspecification
10.1.3 Criteria for Evaluating Subset Regression Models
10.2 Computational Techniques for Variable Selection
10.2.1 All Possible Regressions
10.2.2 Stepwise Regression Methods
10.3 Strategy for Variable Selection and Model Building
10.4 Case Study: Gorman and Toman Asphalt Data Using SAS
Problems
11 Validation of Regression Models
11.1 Introduction
11.2 Validation Techniques
11.2.1 Analysis of Model Coefficients and Predicted Values
11.2.2 Collecting Fresh Data: Confirmation Runs
11.2.3 Data Splitting
11.3 Data from Planned Experiments
Problems
12 Introduction to Nonlinear Regression
12.1 Linear and Nonlinear Regression Models
12.1.1 Linear Regression Models
12.1.2 Nonlinear Regression Models
12.2 Origins of Nonlinear Models
12.3 Nonlinear Least Squares
12.4 Transformation to a Linear Model
12.5 Parameter Estimation in a Nonlinear System
12.5.1 Linearization
12.5.2 Other Parameter Estimation Methods
12.5.3 Starting Values
12.6 Statistical Inference in Nonlinear Regression
12.7 Examples of Nonlinear Regression Models
12.8 Using SAS and R
Problems
13 Generalized Linear Models
13.1 Introduction
13.2 Logistic Regression Models
13.2.1 Models with a Binary Response Variable
13.2.2 Estimating the Parameters in a Logistic Regression Model
13.2.3 Interpretation of the Parameters in a Logistic Regression Model
13.2.4 Statistical Inference on Model Parameters
13.2.5 Diagnostic Checking in Logistic Regression
13.2.6 Other Models for Binary Response Data
13.2.7 More Than Two Categorical Outcomes
13.3 Poisson Regression
13.4 The Generalized Linear Model
13.4.1 Link Functions and Linear Predictors
13.4.2 Parameter Estimation and Inference in the GLM
13.4.3 Prediction and Estimation with the GLM
13.4.4 Residual Analysis in the GLM
13.4.5 Using R to Perform GLM Analysis
13.4.6 Overdispersion
Problems
14 Regression Analysis of Time Series Data
14.1 Introduction to Regression Models for Time Series Data
14.2 Detecting Autocorrelation: The Durbin-Watson Test
14.3 Estimating the Parameters in Time Series Regression Models
Problems
15 Other Topics in the Use of Regression Analysis
15.1 Robust Regression
15.1.1 Need for Robust Regression
15.1.2 M-Estimators
15.1.3 Properties of Robust Estimators
15.2 Effect of Measurement Errors in the Regressors
15.2.1 Simple Linear Regression
15.2.2 The Berkson Model
15.3 Inverse Estimation: The Calibration Problem
15.4 Bootstrapping in Regression
15.4.1 Bootstrap Sampling in Regression
15.4.2 Bootstrap Confidence Intervals
15.5 Classification and Regression Trees (CART)
15.6 Neural Networks
15.7 Designed Experiments for Regression
Problems
Appendix A Statistical Tables
Appendix B Data Sets for Exercises
Appendix C Supplemental Technical Material
C.1 Background on Basic Test Statistics
C.2 Background from the Theory of Linear Models
C.3 Important Results on SS_R and SS_Res
C.4 Gauss-Markov Theorem, Var(ε) = σ²I
C.5 Computational Aspects of Multiple Regression
C.6 Result on the Inverse of a Matrix
C.7 Development of the PRESS Statistic
C.8 Development of S²(i)
C.9 Outlier Test Based on R-Student
C.10 Independence of Residuals and Fitted Values
C.11 Gauss-Markov Theorem, Var(ε) = V
C.12 Bias in MS_Res When the Model Is Underspecified
C.13 Computation of Influence Diagnostics
C.14 Generalized Linear Models
Appendix D Introduction to SAS
D.1 Basic Data Entry
D.2 Creating Permanent SAS Data Sets
D.3 Importing Data from an Excel File
D.4 Output Command
D.5 Log File
D.6 Adding Variables to an Existing SAS Data Set
Appendix E Introduction to R to Perform Linear Regression Analysis
E.1 Basic Background on R
E.2 Basic Data Entry
E.3 Brief Comments on Other Functionality in R
E.4 R Commander
References
Index
DOUGLAS C. MONTGOMERY, PhD, is Regents Professor of Industrial Engineering and Statistics at Arizona State University. Dr. Montgomery is a Fellow of the American Statistical Association, the American Society for Quality, the Royal Statistical Society, and the Institute of Industrial Engineers, and has more than thirty years of academic and consulting experience. He has devoted his research to engineering statistics, specifically the design and analysis of experiments, statistical methods for process monitoring and optimization, and the analysis of time-oriented data. Dr. Montgomery is the coauthor of Generalized Linear Models: With Applications in Engineering and the Sciences, Second Edition and Introduction to Time Series Analysis and Forecasting, both published by Wiley.

ELIZABETH A. PECK, PhD, is Logistics Modeling Specialist at The Coca-Cola Company in Atlanta, Georgia.

G. GEOFFREY VINING, PhD, is Professor in the Department of Statistics at Virginia Polytechnic Institute and State University. He has published extensively in his areas of research interest, which include experimental design and analysis for quality improvement, response surface methodology, and statistical process control. A Fellow of the American Statistical Association and the American Society for Quality, Dr. Vining is the coauthor of Generalized Linear Models: With Applications in Engineering and the Sciences, Second Edition (Wiley).