Wednesday, August 10, 2011
Grubbs test question?
Basically, G tells you how many standard deviations a particular point lies from the mean. If G > 2, the point is more than 2 standard deviations from the mean; for normally distributed data, only about 5% of points would have G > 2. Similarly, if G = 1, the point is exactly 1 standard deviation from the mean. A large G marks the point as an outlier, since it lies many standard deviations from the mean. Often, points more than 3 standard deviations from the mean are considered outliers and are discarded because they are so different from the rest of the data: for normal data there is only about a 0.3% chance that such a point actually belongs in the data set.
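Here's a minimal sketch of computing G for the most extreme point in a sample (the data values are made up for illustration; note that a full Grubbs test would compare G against a critical value from the t-distribution rather than a fixed cutoff like 2 or 3):

```python
import statistics

def grubbs_statistic(data):
    """Return (G, value): the largest |x - mean| / s and the point producing it."""
    mean = statistics.mean(data)
    s = statistics.stdev(data)  # sample standard deviation
    suspect = max(data, key=lambda x: abs(x - mean))
    return abs(suspect - mean) / s, suspect

# Example: five points near 10 and one far away
data = [9.8, 10.1, 10.0, 9.9, 10.2, 15.0]
g, suspect = grubbs_statistic(data)
print(g, suspect)  # the 15.0 point gives the largest G
```

With this sample, G comes out a bit above 2, so by the rough "2 standard deviations" rule the point 15.0 would already look suspicious.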