Cover
Hands-On Ensemble Learning with R
Why subscribe?
PacktPub.com
Contributors
About the author
About the reviewer
Packt is Searching for Authors Like You
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Chapter 1. Introduction to Ensemble Techniques
Datasets
Hypothyroid
Waveform
German Credit
Iris
Pima Indians Diabetes
US Crime
Overseas visitors
Primary Biliary Cirrhosis
Multishapes
Board Stiffness
Statistical/machine learning models
Logistic regression model
Neural networks
Naïve Bayes classifier
Decision tree
Support vector machines
The right model dilemma!
An ensemble purview
Complementary statistical tests
Permutation test
Chi-square and McNemar tests
ROC test
Summary
Chapter 2. Bootstrapping
Technical requirements
The jackknife technique
The jackknife method for mean and variance
Pseudovalues method for survival data
Bootstrap – a statistical method
The standard error of correlation coefficient
The parametric bootstrap
Eigenvalues
The boot package
Bootstrap and testing hypotheses
Bootstrapping regression models
Bootstrapping survival models*
Bootstrapping time series models*
Chapter 3. Bagging
Classification trees and pruning
Bagging
k-NN classifier
Analyzing waveform data
k-NN bagging
Chapter 4. Random Forests
Random Forests
Variable importance
Proximity plots
Random Forest nuances
Comparisons with bagging
Missing data imputation
Clustering with Random Forest
Chapter 5. The Bare Bones Boosting Algorithms
The general boosting algorithm
Adaptive boosting
Gradient boosting
Using the adabag and gbm packages
Comparing bagging, random forests, and boosting
Chapter 6. Boosting Refinements
Why does boosting work?
The gbm package
Boosting for count data
Boosting for survival data
The xgboost package
The h2o package
Chapter 7. The General Ensemble Technique
Why does ensembling work?
Ensembling by voting
Majority voting
Weighted voting
Ensembling by averaging