Multiple response sets often form the basis for further detailed questions about opinions on selected products.
The decision tree is a popular and effective algorithm, used primarily for classification but also well suited to predicting quantitative outcomes.
Regression trees are a versatile data analysis technique, commonly used in tasks such as post-stratification, forecasting, and segmentation.
How can we estimate the potential value of consumer shopping, predict the behavior of a visitor to our website, or reduce the risk of credit losses?
In this post, we continue our discussion of visualising nested hierarchical data. In my previous post, I showed how hierarchical data can be presented in a two-dimensional space...
In the previous PS blog “Trees that grow out of tables”, Janusz Wachnicki described how a good understanding of the humble crosstab can help us utilise classification trees more...
When we step beyond descriptive statistics and the distributions of individual variables, we usually turn our attention to relationships between variables.