Q test

The Q test (also known as Dixon's Q test) is a method used to identify, and potentially remove, outliers: data points that are unusually high or low compared with the rest of a small data set. It computes the ratio Q = gap / range, where the gap is the absolute difference between the suspected outlier and its nearest neighbor, and the range is the difference between the largest and smallest values in the set. If Q exceeds a critical value, which depends on the number of data points and the desired confidence level, the point is likely an outlier. Rejecting such points helps ensure that the analysis reflects typical data rather than anomalies that might skew the results.
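
As a concrete illustration, the sketch below computes the Q statistic for the most extreme value in a small sample and compares it with a tabulated critical value. The function name q_test, the illustrative data, and the hard-coded critical values are assumptions made for demonstration; the values shown are the commonly tabulated 95%-confidence critical values for n = 3 to 10, and they should be verified against a published table for the confidence level you actually need.

```python
# Minimal sketch of the Q test (Dixon's Q test) for one suspected outlier.
# Critical values below are the commonly tabulated 95%-confidence values for
# n = 3..10; verify against a published table before relying on them.

Q_CRIT_95 = {3: 0.970, 4: 0.829, 5: 0.710, 6: 0.625,
             7: 0.568, 8: 0.526, 9: 0.512, 10: 0.466}

def q_test(data, critical_table=Q_CRIT_95):
    """Return (q_statistic, q_critical, is_outlier) for the most extreme value."""
    values = sorted(data)
    n = len(values)
    if n not in critical_table:
        raise ValueError(f"No critical value tabulated for n = {n}")

    data_range = values[-1] - values[0]
    if data_range == 0:
        # All values identical: no gap, nothing can be an outlier.
        return 0.0, critical_table[n], False

    # Gap between each end point and its nearest neighbor; test the larger one.
    gap_low = values[1] - values[0]
    gap_high = values[-1] - values[-2]
    q_stat = max(gap_low, gap_high) / data_range
    q_crit = critical_table[n]
    return q_stat, q_crit, q_stat > q_crit

# Illustrative data (hypothetical measurements): 14.9 looks suspiciously high.
q, q_crit, flagged = q_test([10.1, 10.2, 10.3, 10.5, 14.9])
print(q, q_crit, flagged)  # Q ≈ 0.917 > 0.710, so 14.9 would be flagged at 95%
```

In this sketch the gap for 14.9 is 14.9 − 10.5 = 4.4 and the range is 14.9 − 10.1 = 4.8, giving Q ≈ 0.917, which exceeds the tabulated 95% critical value of 0.710 for five data points, so the point would be rejected as an outlier at that confidence level.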