# Blog

### Parametric versus non-parametric tests: which one should you choose for your analysis?

Statistical analysis is an integral part of scientific research and of working with data. Drawing valid conclusions depends on using the appropriate statistical tests, and analysts often face the question of which test to use in a given situation. This is important because the wr…
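A common decision rule is to check each sample for normality and then pick a parametric or non-parametric two-sample test accordingly. A minimal sketch, assuming SciPy is available (the significance threshold and the equal-variance t-test form are simplifying choices for illustration):

```python
from scipy import stats

def compare_groups(a, b, alpha=0.05):
    """Return the name of the chosen test and its p-value."""
    # Shapiro-Wilk normality check on both samples
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    if normal:
        # Parametric branch: Student's t-test
        result = stats.ttest_ind(a, b)
        return "t-test", result.pvalue
    # Non-parametric fallback: Mann-Whitney U test
    result = stats.mannwhitneyu(a, b)
    return "mann-whitney", result.pvalue
```

In practice the choice also depends on sample size, measurement scale, and the hypothesis being tested, so this rule is a starting point rather than a recipe.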

### Meta-analysis as an analytical tool

In today's scientific and research world, analysts are often confronted with the problem of analysing large amounts of data coming from different studies. In such situations, meta-analysis becomes an indispensable tool. It allows the results of many studies to be assessed collectively and more prec…
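The simplest pooling scheme used in meta-analysis is the fixed-effect model, which averages per-study effect estimates with inverse-variance weights. A minimal sketch (illustrative only; real analyses must also consider heterogeneity and random-effects models):

```python
import math

def fixed_effect_pool(effects, variances):
    """Pool study effects with inverse-variance weights (fixed-effect model)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # standard error of pooled effect
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)   # approximate 95% CI
    return pooled, se, ci
```

Note how more precise studies (smaller variance) pull the pooled estimate toward their own result.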

### General linear models and generalised linear models - differences and similarities

In data analysis, the use of general linear models is common due to their simplicity and ease of interpretation of the results obtained. However, there are times when the analyst encounters situations where the assumptions of classical linear models are difficult or impossible to meet. This may be …
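The key difference is the link function: a generalised linear model connects the mean of the response to the linear predictor through, for example, the logit link, so it can handle binary outcomes that violate classical assumptions. A hand-rolled sketch of a logistic GLM fitted by gradient ascent (in practice one would use a library such as statsmodels; the data and learning rate here are illustrative):

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit y ~ sigmoid(b0 + b1*x) by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))  # logit link inverted
            g0 += (y - p)           # gradient w.r.t. intercept
            g1 += (y - p) * x       # gradient w.r.t. slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1
```

Replacing the sigmoid with the identity function (and least squares for the likelihood) recovers the classical linear model, which is exactly the "special case" relationship between the two families.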

### Bayesian inference

Bayesian inference is a method of statistical inference. It is named after Thomas Bayes, the 18th-century British mathematician and Presbyterian minister whose work laid the foundations of Bayesian probability theory. It is a method of data analysis that allows the probability of certain events to be determined not only…
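The core of the method is Bayes' theorem: a prior probability is updated with observed evidence to give a posterior. A minimal sketch with the classic diagnostic-test example (the prevalence, sensitivity, and specificity values are hypothetical):

```python
def posterior(prior, sensitivity, specificity):
    """P(condition | positive test) via Bayes' theorem."""
    # Total probability of a positive result: true positives + false positives
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_pos

# Hypothetical numbers: 1% prevalence, 95% sensitivity, 90% specificity
p = posterior(0.01, 0.95, 0.90)
```

Even with a fairly accurate test, the posterior here stays below 10%, because the low prior (rare condition) dominates the update.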

### Data gaps in quantitative data analysis - what are they and how to deal with them?

Missing data in the context of data analysis refers to situations where there are no values for certain variables or observations in a dataset. In other words, they are places where a number, text, or some other form of data was expected, but for various reasons was not there. Missing data can take…
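Two of the simplest ways to handle such gaps are listwise deletion (drop incomplete records) and mean imputation (fill gaps with the observed mean). A minimal sketch, using `None` to stand for a missing value:

```python
def mean_impute(values):
    """Replace None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    m = sum(observed) / len(observed)
    return [m if v is None else v for v in values]

def listwise_delete(rows):
    """Keep only the rows that have no missing fields."""
    return [r for r in rows if None not in r]
```

Both strategies have well-known drawbacks (deletion loses information, mean imputation understates variance), which is why the mechanism behind the missingness matters when choosing an approach.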

### Population pyramid

When looking for the best way to visualise the data you have, you will come across an impressively wide range of different types of charts - from simple, basic ones such as a scatter plot to very advanced ones such as a Sankey diagram. Some, however, are designed with a specific type of data in min…

### The three sigma rule

The three sigma rule is an important tool in statistics and quality management. In the context of data analysis, it allows the identification of outlier points that are significantly different from the rest of the data. The use of the three sigma rule in quality control also allows anomalies to be …
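Applied to outlier detection, the rule flags any point lying more than three standard deviations from the mean. A minimal sketch using only the standard library:

```python
import statistics

def three_sigma_outliers(data):
    """Flag points farther than three standard deviations from the mean."""
    mean = statistics.mean(data)
    sd = statistics.stdev(data)
    return [x for x in data if abs(x - mean) > 3 * sd]
```

One caveat worth noting: extreme outliers inflate the standard deviation itself, so in heavily contaminated data a robust variant (e.g. based on the median) may be preferable.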

### Segmentation: from grouping to classification

Segmentation is a key process in data analysis, dividing a data set into relatively homogeneous groups based on specific criteria. The purpose of segmentation is to identify hidden patterns, differences and similarities between objects in a dataset, enabling more precise and relevant analyses. Two …
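The grouping side of segmentation is typically unsupervised clustering: no labels are given, and the algorithm discovers the segments itself. A minimal one-dimensional k-means sketch (a library such as scikit-learn would be used in practice; the initialisation here is a deliberately simple choice):

```python
def kmeans_1d(points, k=2, iters=20):
    """Minimal 1-D k-means: group points into k segments by proximity."""
    # Naive initialisation: evenly spaced values from the sorted data
    centers = sorted(points)[::max(1, len(points) // k)][:k]
    clusters = [[] for _ in centers]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        # Assignment step: each point joins its nearest centre
        for p in points:
            i = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        # Update step: each centre moves to its cluster's mean
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters
```

Classification is the mirror image: the segments (labels) are known in advance, and a model learns to assign new objects to them.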

### Recoding quantitative variables into qualitative ones – techniques and their practical application

When analysing data, we work with both quantitative information (such as salary, age, or the number of products ordered) and qualitative information (e.g. gender, education, or level of satisfaction with a service). In order to make it easier to work with the data or to adapt it to a specific stati…
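The most common recoding technique is binning: mapping a numeric value to a category via cut-points. A minimal sketch with right-open intervals (the age cut-points and labels below are hypothetical examples, not a recommendation):

```python
def recode(value, bins, labels):
    """Map a numeric value to a category using right-open interval bins."""
    for upper, label in zip(bins, labels):
        if value < upper:
            return label
    return labels[-1]  # everything at or above the last cut-point

# Hypothetical cut-points: under 30 / 30-49 / 50 and over
def age_group(age):
    return recode(age, [30, 50], ["young", "middle", "senior"])
```

Whether to use equal-width bins, quantiles, or domain-driven cut-points like these depends on the analysis, which is exactly the kind of trade-off this topic covers.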
