The Linear Additive Tree (LINAD) alternates between fitting a linear model and performing splits, at each step fitting the gradient of the loss. The linear coefficients along each path from root to leaf are summed, yielding a decision tree with a linear model in each terminal node. The result is a single, fully interpretable model whose accuracy rivals ensembles of traditional decision trees (CART), such as random forest and gradient boosting.
At the same time, random forest or gradient boosting built with LINAD base learners can outperform CART ensembles while using considerably fewer trees (a new vignette on this is upcoming).
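To make the coefficient summing concrete, here is a minimal sketch in plain R (node names and coefficient values are made up for illustration; this is not the rtemis internals): the model at a leaf is the sum of the coefficient vectors fitted at each node along its path from the root.
# Hypothetical coefficients fitted at each node on one root-to-leaf path
# (columns: intercept, x1, x2); values are illustrative only
path_coefs <- rbind(
  root = c(0.10, 0.50, -0.20),
  node2 = c(0.05, -0.10, 0.30),
  leaf4 = c(0.02, 0.25, 0.00)
)
# The leaf's linear model is the elementwise sum along the path
leaf_model <- colSums(path_coefs)
# Predict for a case with features x1 = 0.7, x2 = 1.2
x <- c(1, 0.7, 1.2) # leading 1 multiplies the intercept
sum(leaf_model * x)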
06-30-24 10:57:48 Input contains more than one columns; will stratify on last [resample]
.:Resampling Parameters
n.resamples: 10
resampler: strat.sub
stratify.var: y
train.p: 0.75
strat.n.bins: 4
06-30-24 10:57:48 Created 10 stratified subsamples [resample]
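A split like the one logged above can be produced with resample(); the following is a sketch, assuming the full dataset is a data.frame dat whose last column is the outcome y (the object names and the indexing of the returned list are assumptions):
library(rtemis)
# Stratified subsampling: 10 resamples, 75% training, 4 stratification bins
res <- resample(dat, n.resamples = 10, resampler = "strat.sub",
                train.p = .75, strat.n.bins = 4)
# Use the first resample's training indices for a single train/test split
dat_train <- dat[res[[1]], ]
dat_test <- dat[-res[[1]], ]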
We will train an elastic net (GLMNET), CART, random forest (RF), gradient boosting (GBM), and a Linear Additive Tree (LINAD).
GLMNET is tuned for alpha and lambda (sketched with glmnet below).
CART is tuned by cost-complexity pruning.
GBM is tuned for the number of trees.
LINAD is tuned for the number of leaf nodes.
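The alpha/lambda tuning that s_GLMNET performs internally corresponds roughly to cross-validating glmnet over a grid of alpha values and keeping lambda.1se, as in this sketch (the construction of x and y assumes dat_train holds 12 feature columns followed by the outcome y):
library(glmnet)
x <- as.matrix(dat_train[, 1:12])
y <- dat_train$y
alphas <- seq(0, 1, by = .2) # same alpha grid as the log below
cvs <- lapply(alphas, function(a) cv.glmnet(x, y, alpha = a, nfolds = 5))
# CV error at lambda.1se for each alpha; keep the best (alpha, lambda) pair
errs <- sapply(cvs, function(cv) cv$cvm[cv$lambda == cv$lambda.1se])
best <- which.min(errs)
fit <- glmnet(x, y, alpha = alphas[best], lambda = cvs[[best]]$lambda.1se)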
19.2.1 GLMNET
mod_glmnet <- s_GLMNET(dat_train, dat_test)
06-30-24 10:57:48 Hello, egenn [s_GLMNET]
.:Regression Input Summary
Training features: 374 x 12
Training outcome: 374 x 1
Testing features: 126 x 12
Testing outcome: 126 x 1
06-30-24 10:57:48 Running grid search... [gridSearchLearn]
.:Resampling Parameters
n.resamples: 5
resampler: kfold
stratify.var: y
strat.n.bins: 4
06-30-24 10:57:48 Created 5 independent folds [resample]
.:Search parameters
grid.params:
alpha: 0, 0.2, 0.4, 0.6, 0.8, 1...
fixed.params:
.gs: TRUE
which.cv.lambda: lambda.1se
06-30-24 10:57:48 Tuning Elastic Net by exhaustive grid search. [gridSearchLearn]
06-30-24 10:57:48 5 inner resamples; 30 models total; running on 8 workers (aarch64-apple-darwin20) [gridSearchLearn]
06-30-24 10:57:49 Extracting best lambda from GLMNET models... [gridSearchLearn]
.:Best parameters to minimize MSE
best.tune:
lambda: 0.464712037007993
alpha: 0.8
06-30-24 10:57:49 Completed in 0.01 minutes (Real: 0.76; User: 0.10; System: 0.08) [gridSearchLearn]
.:Parameters
alpha: 0.8
lambda: 0.464712037007993
06-30-24 10:57:49 Training elastic net model... [s_GLMNET]
.:GLMNET Regression Training Summary
MSE = 2.85 (40.15%)
RMSE = 1.69 (22.64%)
MAE = 1.29 (24.44%)
r = 0.68 (p = 1.8e-52)
R sq = 0.40
.:GLMNET Regression Testing Summary
MSE = 2.63 (39.79%)
RMSE = 1.62 (22.41%)
MAE = 1.23 (23.05%)
r = 0.67 (p = 5.8e-18)
R sq = 0.40
06-30-24 10:57:49 Completed in 0.01 minutes (Real: 0.83; User: 0.16; System: 0.09) [s_GLMNET]
mod_glmnet$plotVarImp()
As expected, GLMNET captures only the linear features.
19.2.2 CART
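A call consistent with the tuning grid logged below would be the following (the prune.cp grid is read from the log; the remaining tree controls may be s_CART defaults):
mod_cart <- s_CART(dat_train, dat_test,
                   prune.cp = c(0, .001, .01, .1))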
06-30-24 10:57:49 Hello, egenn [s_CART]
.:Regression Input Summary
Training features: 374 x 12
Training outcome: 374 x 1
Testing features: 126 x 12
Testing outcome: 126 x 1
06-30-24 10:57:49 Running grid search... [gridSearchLearn]
.:Resampling Parameters
n.resamples: 5
resampler: kfold
stratify.var: y
strat.n.bins: 4
06-30-24 10:57:49 Created 5 independent folds [resample]
.:Search parameters
grid.params:
maxdepth: 20
minsplit: 2
minbucket: 1
cp: 0
prune.cp: 0, 0.001, 0.01, 0.1
fixed.params:
method: anova
model: TRUE
maxcompete: 0
maxsurrogate: 0
usesurrogate: 2
surrogatestyle: 0
xval: 0
cost: 1, 1, 1, 1, 1, 1...
ifw: TRUE
ifw.type: 2
upsample: FALSE
downsample: FALSE
resample.seed: NULL
06-30-24 10:57:49 Tuning Classification and Regression Trees by exhaustive grid search. [gridSearchLearn]
06-30-24 10:57:49 5 inner resamples; 20 models total; running on 8 workers (aarch64-apple-darwin20) [gridSearchLearn]
.:Best parameters to minimize MSE
best.tune:
maxdepth: 20
minsplit: 2
minbucket: 1
cp: 0
prune.cp: 0.01
06-30-24 10:57:49 Completed in 3.4e-03 minutes (Real: 0.20; User: 0.05; System: 0.06) [gridSearchLearn]
06-30-24 10:57:49 Training CART... [s_CART]
.:CART Regression Training Summary
MSE = 1.26 (73.48%)
RMSE = 1.12 (48.51%)
MAE = 0.89 (47.88%)
r = 0.86 (p = 2.9e-109)
R sq = 0.73
.:CART Regression Testing Summary
MSE = 2.26 (48.33%)
RMSE = 1.50 (28.12%)
MAE = 1.18 (26.43%)
r = 0.70 (p = 4.2e-20)
R sq = 0.49
06-30-24 10:57:49 Completed in 3.7e-03 minutes (Real: 0.22; User: 0.07; System: 0.07) [s_CART]
mod_cart$plotVarImp()
dplot3_cart(mod_cart)
06-30-24 10:57:49 Object is rtemis rpart model [dplot3_cart]
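The cost-complexity pruning selected by the grid search (prune.cp = 0.01) can be reproduced directly with rpart; a sketch using the control parameters from the log, assuming the outcome column is named y:
library(rpart)
# Grow a full tree, then prune at the selected complexity parameter
fit <- rpart(y ~ ., data = dat_train, method = "anova",
             control = rpart.control(cp = 0, minsplit = 2,
                                     minbucket = 1, maxdepth = 20))
pruned <- prune(fit, cp = .01)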
19.2.3 RF
mod_rf <- s_Ranger(dat_train, dat_test, mtry = 12)
06-30-24 10:57:50 Hello, egenn [s_Ranger]
.:Regression Input Summary
Training features: 374 x 12
Training outcome: 374 x 1
Testing features: 126 x 12
Testing outcome: 126 x 1
.:Parameters
n.trees: 1000
mtry: 12
06-30-24 10:57:50 Training Random Forest (ranger) Regression with 1000 trees... [s_Ranger]
.:Ranger Regression Training Summary
MSE = 0.23 (95.17%)
RMSE = 0.48 (78.02%)
MAE = 0.37 (78.54%)
r = 0.98 (p = 2.4e-275)
R sq = 0.95
.:Ranger Regression Testing Summary
MSE = 1.45 (66.75%)
RMSE = 1.21 (42.34%)
MAE = 0.98 (39.02%)
r = 0.84 (p = 2.6e-34)
R sq = 0.67
06-30-24 10:57:50 Completed in 2.6e-03 minutes (Real: 0.16; User: 0.90; System: 0.03) [s_Ranger]
mod_rf$plotVarImp()
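Note that with 12 features, mtry = 12 means every split considers all features, so this is effectively bagged trees rather than a random forest proper. For reference, a corresponding direct ranger call might look like this (a sketch, assuming the outcome column is named y):
library(ranger)
fit <- ranger(y ~ ., data = dat_train, num.trees = 1000, mtry = 12)
pred <- predict(fit, data = dat_test)$predictions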
19.2.4 GBM
mod_gbm <- s_GBM(dat_train, dat_test)
06-30-24 10:57:50 Hello, egenn [s_GBM]
.:Regression Input Summary
Training features: 374 x 12
Training outcome: 374 x 1
Testing features: 126 x 12
Testing outcome: 126 x 1
06-30-24 10:57:50 Distribution set to gaussian [s_GBM]
06-30-24 10:57:50 Running Gradient Boosting Regression with a gaussian loss function [s_GBM]
06-30-24 10:57:50 Running grid search... [gridSearchLearn]
.:Resampling Parameters
n.resamples: 5
resampler: kfold
stratify.var: y
strat.n.bins: 4
06-30-24 10:57:50 Created 5 independent folds [resample]
.:Search parameters
grid.params:
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
fixed.params:
n.trees: 2000
max.trees: 5000
gbm.select.smooth: FALSE
n.new.trees: 500
min.trees: 50
failsafe.trees: 500
ifw: TRUE
ifw.type: 2
upsample: FALSE
downsample: FALSE
resample.seed: NULL
relInf: FALSE
plot.tune.error: FALSE
.gs: TRUE
06-30-24 10:57:50 Tuning Gradient Boosting Machine by exhaustive grid search. [gridSearchLearn]
06-30-24 10:57:50 5 inner resamples; 5 models total; running on 8 workers (aarch64-apple-darwin20) [gridSearchLearn]
06-30-24 10:57:50 Running grid line #1 of 5... [...future.FUN]
06-30-24 10:57:50 Hello, egenn [s_GBM]
.:Regression Input Summary
Training features: 298 x 12
Training outcome: 298 x 1
Testing features: 76 x 12
Testing outcome: 76 x 1
06-30-24 10:57:50 Distribution set to gaussian [s_GBM]
06-30-24 10:57:50 Running Gradient Boosting Regression with a gaussian loss function [s_GBM]
.:Parameters
n.trees: 2000
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
weights: NULL
.:GBM Regression Training Summary
MSE = 0.47 (89.77%)
RMSE = 0.69 (68.01%)
MAE = 0.54 (68.39%)
r = 0.95 (p = 2.2e-150)
R sq = 0.90
06-30-24 10:57:50 Using predict for Regression with type = link [s_GBM]
.:GBM Regression Testing Summary
MSE = 1.48 (72.14%)
RMSE = 1.22 (47.22%)
MAE = 0.93 (47.43%)
r = 0.85 (p = 2.6e-22)
R sq = 0.72
06-30-24 10:57:50 Completed in 3.1e-03 minutes (Real: 0.18; User: 0.15; System: 0.03) [s_GBM]
06-30-24 10:57:50 Running grid line #2 of 5... [...future.FUN]
06-30-24 10:57:50 Hello, egenn [s_GBM]
.:Regression Input Summary
Training features: 301 x 12
Training outcome: 301 x 1
Testing features: 73 x 12
Testing outcome: 73 x 1
06-30-24 10:57:50 Distribution set to gaussian [s_GBM]
06-30-24 10:57:50 Running Gradient Boosting Regression with a gaussian loss function [s_GBM]
.:Parameters
n.trees: 2000
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
weights: NULL
.:GBM Regression Training Summary
MSE = 0.62 (87.24%)
RMSE = 0.79 (64.28%)
MAE = 0.61 (64.72%)
r = 0.94 (p = 4.1e-141)
R sq = 0.87
06-30-24 10:57:50 Using predict for Regression with type = link [s_GBM]
.:GBM Regression Testing Summary
MSE = 1.31 (68.97%)
RMSE = 1.14 (44.29%)
MAE = 0.91 (45.45%)
r = 0.85 (p = 1.2e-21)
R sq = 0.69
06-30-24 10:57:50 Completed in 3e-03 minutes (Real: 0.18; User: 0.15; System: 0.03) [s_GBM]
06-30-24 10:57:50 Running grid line #3 of 5... [...future.FUN]
06-30-24 10:57:50 Hello, egenn [s_GBM]
.:Regression Input Summary
Training features: 298 x 12
Training outcome: 298 x 1
Testing features: 76 x 12
Testing outcome: 76 x 1
06-30-24 10:57:50 Distribution set to gaussian [s_GBM]
06-30-24 10:57:50 Running Gradient Boosting Regression with a gaussian loss function [s_GBM]
.:Parameters
n.trees: 2000
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
weights: NULL
.:GBM Regression Training Summary
MSE = 0.49 (89.45%)
RMSE = 0.70 (67.51%)
MAE = 0.54 (67.56%)
r = 0.95 (p = 1.6e-149)
R sq = 0.89
06-30-24 10:57:50 Using predict for Regression with type = link [s_GBM]
.:GBM Regression Testing Summary
MSE = 1.38 (74.25%)
RMSE = 1.17 (49.26%)
MAE = 0.92 (49.36%)
r = 0.87 (p = 1.8e-24)
R sq = 0.74
06-30-24 10:57:50 Completed in 3e-03 minutes (Real: 0.18; User: 0.15; System: 0.03) [s_GBM]
06-30-24 10:57:50 Running grid line #4 of 5... [...future.FUN]
06-30-24 10:57:50 Hello, egenn [s_GBM]
.:Regression Input Summary
Training features: 301 x 12
Training outcome: 301 x 1
Testing features: 73 x 12
Testing outcome: 73 x 1
06-30-24 10:57:50 Distribution set to gaussian [s_GBM]
06-30-24 10:57:50 Running Gradient Boosting Regression with a gaussian loss function [s_GBM]
.:Parameters
n.trees: 2000
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
weights: NULL
.:GBM Regression Training Summary
MSE = 0.66 (86.75%)
RMSE = 0.81 (63.61%)
MAE = 0.64 (63.57%)
r = 0.94 (p = 1.2e-138)
R sq = 0.87
06-30-24 10:57:50 Using predict for Regression with type = link [s_GBM]
.:GBM Regression Testing Summary
MSE = 1.01 (73.89%)
RMSE = 1.00 (48.91%)
MAE = 0.78 (49.73%)
r = 0.86 (p = 1.1e-22)
R sq = 0.74
06-30-24 10:57:50 Completed in 3e-03 minutes (Real: 0.18; User: 0.15; System: 0.03) [s_GBM]
06-30-24 10:57:50 Running grid line #5 of 5... [...future.FUN]
06-30-24 10:57:50 Hello, egenn [s_GBM]
.:Regression Input Summary
Training features: 298 x 12
Training outcome: 298 x 1
Testing features: 76 x 12
Testing outcome: 76 x 1
06-30-24 10:57:50 Distribution set to gaussian [s_GBM]
06-30-24 10:57:50 Running Gradient Boosting Regression with a gaussian loss function [s_GBM]
.:Parameters
n.trees: 2000
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
weights: NULL
.:GBM Regression Training Summary
MSE = 0.71 (85.02%)
RMSE = 0.84 (61.29%)
MAE = 0.66 (61.17%)
r = 0.93 (p = 2.3e-130)
R sq = 0.85
06-30-24 10:57:50 Using predict for Regression with type = link [s_GBM]
.:GBM Regression Testing Summary
MSE = 1.07 (78.25%)
RMSE = 1.04 (53.37%)
MAE = 0.80 (53.94%)
r = 0.89 (p = 1.3e-26)
R sq = 0.78
06-30-24 10:57:50 Completed in 2.6e-03 minutes (Real: 0.16; User: 0.14; System: 0.01) [s_GBM]
.:Best parameters to minimize MSE
best.tune:
n.trees: 990
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
06-30-24 10:57:50 Completed in 4.8e-03 minutes (Real: 0.29; User: 0.04; System: 0.04) [gridSearchLearn]
.:Parameters
n.trees: 990
interaction.depth: 2
shrinkage: 0.01
bag.fraction: 0.9
n.minobsinnode: 5
weights: NULL
06-30-24 10:57:50 Training GBM on full training set... [s_GBM]
.:GBM Regression Training Summary
MSE = 0.62 (86.94%)
RMSE = 0.79 (63.86%)
MAE = 0.61 (64.06%)
r = 0.94 (p = 2.8e-170)
R sq = 0.87
06-30-24 10:57:50 Calculating relative influence of variables... [s_GBM]
06-30-24 10:57:50 Using predict for Regression with type = link [s_GBM]
.:GBM Regression Testing Summary
MSE = 1.27 (70.99%)
RMSE = 1.13 (46.14%)
MAE = 0.90 (44.15%)
r = 0.86 (p = 1.1e-37)
R sq = 0.71
06-30-24 10:57:50 Completed in 0.01 minutes (Real: 0.42; User: 0.17; System: 0.05) [s_GBM]
mod_gbm$plotVarImp()
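The number-of-trees tuning can also be done directly with the gbm package using its built-in cross-validation; a sketch with the hyperparameters from the log, assuming the outcome column is named y:
library(gbm)
fit <- gbm(y ~ ., data = dat_train, distribution = "gaussian",
           n.trees = 2000, interaction.depth = 2, shrinkage = .01,
           bag.fraction = .9, n.minobsinnode = 5, cv.folds = 5)
# Best number of trees by CV error (the run above selected 990)
best_n <- gbm.perf(fit, method = "cv", plot.it = FALSE)
pred <- predict(fit, dat_test, n.trees = best_n)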
19.2.5 LINAD
mod_linad <- s_LINAD(dat_train, dat_test)
06-30-24 10:57:50 Hello, egenn [s_LINAD]
.:Regression Input Summary
Training features: 374 x 12
Training outcome: 374 x 1
Testing features: 126 x 12
Testing outcome: 126 x 1
.:Parameters
max.leaves: 20
learning.rate: 0.5
gamma: 0.5
lin.type: glmnet
nvmax: 3
alpha: 1
lambda: 0.05
minobsinnode.lin: 10
part.minsplit: 2
part.minbucket: 1
part.cp: 0
06-30-24 10:57:50 Training first Linear Model... [linadleaves]
06-30-24 10:57:50 Working on node id #1... [linadleaves]
06-30-24 10:57:50 Working on node id #2... [linadleaves]
06-30-24 10:57:50 Working on node id #3... [linadleaves]
06-30-24 10:57:50 Working on node id #4... [linadleaves]
06-30-24 10:57:50 Working on node id #5... [linadleaves]
06-30-24 10:57:50 Working on node id #10... [linadleaves]
06-30-24 10:57:50 Working on node id #11... [linadleaves]
06-30-24 10:57:50 Working on node id #20... [linadleaves]
06-30-24 10:57:50 Working on node id #21... [linadleaves]
06-30-24 10:57:50 Working on node id #8... [linadleaves]
06-30-24 10:57:50 Working on node id #9... [linadleaves]
06-30-24 10:57:50 Working on node id #22... [linadleaves]
06-30-24 10:57:50 Working on node id #23... [linadleaves]
06-30-24 10:57:50 Working on node id #40... [linadleaves]
06-30-24 10:57:50 Working on node id #41... [linadleaves]
06-30-24 10:57:50 Working on node id #16... [linadleaves]
06-30-24 10:57:50 Working on node id #17... [linadleaves]
06-30-24 10:57:50 Working on node id #44... [linadleaves]
06-30-24 10:57:50 Working on node id #45... [linadleaves]
06-30-24 10:57:50 Working on node id #42... [linadleaves]
06-30-24 10:57:50 Working on node id #43... [linadleaves]
06-30-24 10:57:50 Working on node id #80... [linadleaves]
06-30-24 10:57:50 Working on node id #81... [linadleaves]
06-30-24 10:57:50 Working on node id #32... [linadleaves]
06-30-24 10:57:50 Working on node id #33... [linadleaves]
06-30-24 10:57:50 Working on node id #84... [linadleaves]
06-30-24 10:57:50 Working on node id #85... [linadleaves]
06-30-24 10:57:50 Working on node id #160... [linadleaves]
06-30-24 10:57:50 Working on node id #161... [linadleaves]
06-30-24 10:57:50 Working on node id #64... [linadleaves]
06-30-24 10:57:50 Working on node id #65... [linadleaves]
06-30-24 10:57:50 Working on node id #168... [linadleaves]
06-30-24 10:57:50 Working on node id #169... [linadleaves]
06-30-24 10:57:51 Working on node id #320... [linadleaves]
06-30-24 10:57:51 Working on node id #321... [linadleaves]
06-30-24 10:57:51 Working on node id #128... [linadleaves]
06-30-24 10:57:51 Working on node id #129... [linadleaves]
06-30-24 10:57:51 Selected 10 leaves of 20 total [selectleaves]
06-30-24 10:57:50 Training first Linear Model... [linadleaves]
06-30-24 10:57:50 Working on node id #1... [linadleaves]
06-30-24 10:57:50 Working on node id #2... [linadleaves]
06-30-24 10:57:50 Working on node id #3... [linadleaves]
06-30-24 10:57:50 Working on node id #4... [linadleaves]
06-30-24 10:57:50 Working on node id #5... [linadleaves]
06-30-24 10:57:50 Working on node id #10... [linadleaves]
06-30-24 10:57:50 Working on node id #11... [linadleaves]
06-30-24 10:57:50 Working on node id #20... [linadleaves]
06-30-24 10:57:50 Working on node id #21... [linadleaves]
06-30-24 10:57:50 Working on node id #42... [linadleaves]
06-30-24 10:57:50 Working on node id #43... [linadleaves]
06-30-24 10:57:50 Working on node id #8... [linadleaves]
06-30-24 10:57:50 Working on node id #9... [linadleaves]
06-30-24 10:57:50 Working on node id #40... [linadleaves]
06-30-24 10:57:50 Working on node id #41... [linadleaves]
06-30-24 10:57:50 Working on node id #16... [linadleaves]
06-30-24 10:57:50 Working on node id #17... [linadleaves]
06-30-24 10:57:50 Working on node id #86... [linadleaves]
06-30-24 10:57:50 Working on node id #87... [linadleaves]
06-30-24 10:57:50 Working on node id #84... [linadleaves]
06-30-24 10:57:50 Working on node id #85... [linadleaves]
06-30-24 10:57:50 Working on node id #80... [linadleaves]
06-30-24 10:57:50 Working on node id #81... [linadleaves]
06-30-24 10:57:50 Working on node id #32... [linadleaves]
06-30-24 10:57:50 Working on node id #33... [linadleaves]
06-30-24 10:57:50 Working on node id #172... [linadleaves]
06-30-24 10:57:50 Working on node id #173... [linadleaves]
06-30-24 10:57:50 Working on node id #168... [linadleaves]
06-30-24 10:57:50 Working on node id #169... [linadleaves]
06-30-24 10:57:50 Working on node id #160... [linadleaves]
06-30-24 10:57:50 Working on node id #161... [linadleaves]
06-30-24 10:57:51 Working on node id #336... [linadleaves]
06-30-24 10:57:51 Working on node id #337... [linadleaves]
06-30-24 10:57:51 Working on node id #64... [linadleaves]
06-30-24 10:57:51 Working on node id #65... [linadleaves]
06-30-24 10:57:51 Working on node id #344... [linadleaves]
06-30-24 10:57:51 Working on node id #345... [linadleaves]
06-30-24 10:57:51 Selected 3 leaves of 20 total [selectleaves]
06-30-24 10:57:50 Training first Linear Model... [linadleaves]
06-30-24 10:57:50 Working on node id #1... [linadleaves]
06-30-24 10:57:50 Working on node id #2... [linadleaves]
06-30-24 10:57:50 Working on node id #3... [linadleaves]
06-30-24 10:57:50 Working on node id #4... [linadleaves]
06-30-24 10:57:50 Working on node id #5... [linadleaves]
06-30-24 10:57:50 Working on node id #10... [linadleaves]
06-30-24 10:57:50 Working on node id #11... [linadleaves]
06-30-24 10:57:50 Working on node id #20... [linadleaves]
06-30-24 10:57:50 Working on node id #21... [linadleaves]
06-30-24 10:57:50 Working on node id #22... [linadleaves]
06-30-24 10:57:50 Working on node id #23... [linadleaves]
06-30-24 10:57:50 Working on node id #8... [linadleaves]
06-30-24 10:57:50 Working on node id #9... [linadleaves]
06-30-24 10:57:50 Working on node id #16... [linadleaves]
06-30-24 10:57:50 Working on node id #17... [linadleaves]
06-30-24 10:57:50 Working on node id #40... [linadleaves]
06-30-24 10:57:50 Working on node id #41... [linadleaves]
06-30-24 10:57:50 Working on node id #32... [linadleaves]
06-30-24 10:57:50 Working on node id #33... [linadleaves]
06-30-24 10:57:50 Working on node id #42... [linadleaves]
06-30-24 10:57:50 Working on node id #43... [linadleaves]
06-30-24 10:57:50 Working on node id #80... [linadleaves]
06-30-24 10:57:50 Working on node id #81... [linadleaves]
06-30-24 10:57:50 Working on node id #64... [linadleaves]
06-30-24 10:57:50 Working on node id #65... [linadleaves]
06-30-24 10:57:50 Working on node id #160... [linadleaves]
06-30-24 10:57:50 Working on node id #161... [linadleaves]
06-30-24 10:57:50 Working on node id #84... [linadleaves]
06-30-24 10:57:50 Working on node id #85... [linadleaves]
06-30-24 10:57:50 Working on node id #128... [linadleaves]
06-30-24 10:57:50 Working on node id #129... [linadleaves]
06-30-24 10:57:51 Working on node id #168... [linadleaves]
06-30-24 10:57:51 Working on node id #169... [linadleaves]
06-30-24 10:57:51 Working on node id #320... [linadleaves]
06-30-24 10:57:51 Working on node id #321... [linadleaves]
06-30-24 10:57:51 Working on node id #256... [linadleaves]
06-30-24 10:57:51 Working on node id #257... [linadleaves]
06-30-24 10:57:51 Selected 10 leaves of 20 total [selectleaves]
06-30-24 10:57:50 Training first Linear Model... [linadleaves]
06-30-24 10:57:50 Working on node id #1... [linadleaves]
06-30-24 10:57:50 Working on node id #2... [linadleaves]
06-30-24 10:57:50 Working on node id #3... [linadleaves]
06-30-24 10:57:50 Working on node id #4... [linadleaves]
06-30-24 10:57:50 Working on node id #5... [linadleaves]
06-30-24 10:57:50 Working on node id #10... [linadleaves]
06-30-24 10:57:50 Working on node id #11... [linadleaves]
06-30-24 10:57:50 Working on node id #20... [linadleaves]
06-30-24 10:57:50 Working on node id #21... [linadleaves]
06-30-24 10:57:50 Working on node id #42... [linadleaves]
06-30-24 10:57:50 Working on node id #43... [linadleaves]
06-30-24 10:57:50 Working on node id #8... [linadleaves]
06-30-24 10:57:50 Working on node id #9... [linadleaves]
06-30-24 10:57:50 Working on node id #40... [linadleaves]
06-30-24 10:57:50 Working on node id #41... [linadleaves]
06-30-24 10:57:50 Working on node id #16... [linadleaves]
06-30-24 10:57:50 Working on node id #17... [linadleaves]
06-30-24 10:57:50 Working on node id #86... [linadleaves]
06-30-24 10:57:50 Working on node id #87... [linadleaves]
06-30-24 10:57:50 Working on node id #80... [linadleaves]
06-30-24 10:57:50 Working on node id #81... [linadleaves]
06-30-24 10:57:50 Working on node id #84... [linadleaves]
06-30-24 10:57:50 Working on node id #85... [linadleaves]
06-30-24 10:57:50 Working on node id #172... [linadleaves]
06-30-24 10:57:50 Working on node id #173... [linadleaves]
06-30-24 10:57:50 Working on node id #32... [linadleaves]
06-30-24 10:57:50 Working on node id #33... [linadleaves]
06-30-24 10:57:50 Working on node id #160... [linadleaves]
06-30-24 10:57:50 Working on node id #161... [linadleaves]
06-30-24 10:57:50 Working on node id #168... [linadleaves]
06-30-24 10:57:51 Working on node id #169... [linadleaves]
06-30-24 10:57:51 Working on node id #344... [linadleaves]
06-30-24 10:57:51 Working on node id #345... [linadleaves]
06-30-24 10:57:51 Working on node id #64... [linadleaves]
06-30-24 10:57:51 Working on node id #65... [linadleaves]
06-30-24 10:57:51 Working on node id #320... [linadleaves]
06-30-24 10:57:51 Working on node id #321... [linadleaves]
06-30-24 10:57:51 Selected 11 leaves of 20 total [selectleaves]
06-30-24 10:57:50 Training first Linear Model... [linadleaves]
06-30-24 10:57:50 Working on node id #1... [linadleaves]
06-30-24 10:57:50 Working on node id #2... [linadleaves]
06-30-24 10:57:50 Working on node id #3... [linadleaves]
06-30-24 10:57:50 Working on node id #6... [linadleaves]
06-30-24 10:57:50 Working on node id #7... [linadleaves]
06-30-24 10:57:50 Working on node id #12... [linadleaves]
06-30-24 10:57:50 Working on node id #13... [linadleaves]
06-30-24 10:57:50 Working on node id #26... [linadleaves]
06-30-24 10:57:50 Working on node id #27... [linadleaves]
06-30-24 10:57:50 Working on node id #52... [linadleaves]
06-30-24 10:57:50 Working on node id #53... [linadleaves]
06-30-24 10:57:50 Working on node id #4... [linadleaves]
06-30-24 10:57:50 Working on node id #5... [linadleaves]
06-30-24 10:57:50 Working on node id #8... [linadleaves]
06-30-24 10:57:50 Working on node id #9... [linadleaves]
06-30-24 10:57:50 Working on node id #16... [linadleaves]
06-30-24 10:57:50 Working on node id #17... [linadleaves]
06-30-24 10:57:50 Working on node id #32... [linadleaves]
06-30-24 10:57:50 Working on node id #33... [linadleaves]
06-30-24 10:57:50 Working on node id #64... [linadleaves]
06-30-24 10:57:50 Working on node id #65... [linadleaves]
06-30-24 10:57:50 Working on node id #128... [linadleaves]
06-30-24 10:57:50 Working on node id #129... [linadleaves]
06-30-24 10:57:50 Working on node id #256... [linadleaves]
06-30-24 10:57:50 Working on node id #257... [linadleaves]
06-30-24 10:57:50 Working on node id #512... [linadleaves]
06-30-24 10:57:50 Working on node id #513... [linadleaves]
06-30-24 10:57:50 Working on node id #1024... [linadleaves]
06-30-24 10:57:50 Working on node id #1025... [linadleaves]
06-30-24 10:57:50 Working on node id #2048... [linadleaves]
06-30-24 10:57:50 Working on node id #2049... [linadleaves]
06-30-24 10:57:50 Working on node id #4096... [linadleaves]
06-30-24 10:57:51 Working on node id #4097... [linadleaves]
06-30-24 10:57:51 Working on node id #8192... [linadleaves]
06-30-24 10:57:51 Working on node id #8193... [linadleaves]
06-30-24 10:57:51 Working on node id #16384... [linadleaves]
06-30-24 10:57:51 Working on node id #16385... [linadleaves]
06-30-24 10:57:51 Selected 4 leaves of 20 total [selectleaves]
06-30-24 10:57:51 Training LINAD on full training set... [s_LINAD]
06-30-24 10:57:51 Training Linear Additive Tree Regression (max leaves = 8)... [linadleaves]
06-30-24 10:57:51 Training first Linear Model... [linadleaves]
06-30-24 10:57:51 Working on node id #1... [linadleaves]
06-30-24 10:57:51 Working on node id #2... [linadleaves]
06-30-24 10:57:51 Working on node id #3... [linadleaves]
06-30-24 10:57:51 Working on node id #6... [linadleaves]
06-30-24 10:57:51 Working on node id #7... [linadleaves]
06-30-24 10:57:51 Working on node id #12... [linadleaves]
06-30-24 10:57:51 Working on node id #13... [linadleaves]
06-30-24 10:57:51 Working on node id #26... [linadleaves]
06-30-24 10:57:51 Working on node id #27... [linadleaves]
06-30-24 10:57:51 Working on node id #52... [linadleaves]
06-30-24 10:57:51 Working on node id #53... [linadleaves]
06-30-24 10:57:51 Working on node id #4... [linadleaves]
06-30-24 10:57:51 Working on node id #5... [linadleaves]
06-30-24 10:57:51 Reached 8 leaves. [linadleaves]
.:Regression Training Summary
MSE = 0.82 (82.81%)
RMSE = 0.91 (58.54%)
MAE = 0.72 (57.82%)
r = 0.91 (p = 1.9e-146)
R sq = 0.83
.:Regression Testing Summary
MSE = 1.24 (71.67%)
RMSE = 1.11 (46.78%)
MAE = 0.90 (44.00%)
r = 0.85 (p = 4.4e-37)
R sq = 0.72
06-30-24 10:57:51 Completed in 0.02 minutes (Real: 1.01; User: 0.17; System: 0.06) [s_LINAD]
19.2.5.1 Plot LINAD
We can plot a LINAD using dplot3_linad, mirroring the dplot3_cart call above:
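dplot3_linad(mod_linad)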
This outputs an interactive decision tree.
Hovering over a node shows the linear coefficients at that node.
Notice how the sign and magnitude of the coefficients for the nonlinear feature change from one leaf node to another: LINAD has partitioned the case space into subgroups, each well described by its own linear model.
On this dataset, LINAD's testing R sq (0.72) matches GBM (0.71) and exceeds RF (0.67), CART (0.49), and GLMNET (0.40), while remaining a single interpretable model.