Item no.: 5667A-9783030291662  Manufacturer no.: 9783030291662  EAN/GTIN: 9783030291662
Now in its third edition, this companion volume to Ronald Christensen's text on linear model theory uses three fundamental concepts from standard linear model theory (best linear prediction, projections, and Mahalanobis distance) to extend standard linear modeling into the realms of statistical learning and dependent data. This new edition features a wealth of new and revised content. For statistical learning, it delves into nonparametric regression, penalized estimation (regularization), reproducing kernel Hilbert spaces, the kernel trick, and support vector machines. For dependent data, it uses linear model theory to examine general linear models, linear mixed models, time series, spatial data, (generalized) multivariate linear models, discrimination, and dimension reduction. While numerous references to the companion text are made throughout, this volume can be used on its own given a solid background in linear models. Accompanying R code for the analyses is available online.

Further information:
Author: Ronald Christensen
Publisher: Springer International Publishing
Language: English
Further search terms: ANOVA, Excel, factor analysis, STATISTICA, time series, data analysis, mathematical statistics, heteroscedasticity, mixed models, multivariate models