# CHAID v ranger v xgboost – a comparison -- July 27, 2018

In an earlier post, I focused on an in-depth look at CHAID (Chi-square
automatic interaction detection). Quoting myself, I said “As the name
implies it is fundamentally based on the venerable Chi-square test – and
while not the most powerful (in terms of detecting the smallest possible
differences) or the fastest, it really is easy to manage and more
importantly to tell the story after using it”. In this post I’ll spend a
little time comparing CHAID with a random forest algorithm in the
`ranger` library and with a gradient boosting algorithm via the
`xgboost` library. I’ll use the exact same data set for all three so we
can draw some easy comparisons about their speed and their accuracy.