I have an NSIndexSet, say {0, 12, 14}. I make an NSArray from the index set: [0, 12, 14]. I currently calculate a score based on the @avg keypath of that array, which would be 26/3 ≈ 8.7 (I round it down to 8). I then make an NSRange (8 - 2, 5), enumerate the index set, and if an index falls in that range I calculate my score from the distance: distance = avg (8) - currentIndex. Is this the best way to do this?
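For reference, the approach described above can be sketched like this (the variable names and the ±2 window are my reading of the description; @avg on {0, 12, 14} actually yields 26/3 ≈ 8.67, truncated to 8 below):

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSMutableIndexSet *indexes = [NSMutableIndexSet indexSet];
        [indexes addIndex:0];
        [indexes addIndex:12];
        [indexes addIndex:14];

        // Build an NSArray of NSNumbers from the index set.
        NSMutableArray<NSNumber *> *values = [NSMutableArray array];
        [indexes enumerateIndexesUsingBlock:^(NSUInteger idx, BOOL *stop) {
            [values addObject:@(idx)];
        }];

        // KVC collection operator; here 26/3 ≈ 8.67, truncated to 8.
        double avg = [[values valueForKeyPath:@"@avg.self"] doubleValue];

        // A window of avg ± 2, as described in the question.
        NSRange window = NSMakeRange((NSUInteger)avg - 2, 5);

        [indexes enumerateIndexesUsingBlock:^(NSUInteger idx, BOOL *stop) {
            if (NSLocationInRange(idx, window)) {
                // The distance the score is based on.
                double distance = avg - (double)idx;
                NSLog(@"index %lu, distance %.2f", (unsigned long)idx, distance);
            }
        }];
    }
    return 0;
}
```

Note that for {0, 12, 14} no index actually falls inside the 6..10 window, which already hints at the skew problem described below.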
I'm finding that some index sets, say {0, 12, 13, 14}, get skewed a little: the avg would be 39/4 = 9.75, which gives index 12 a better score, when for me the best pick would be 13.
I found some standard-deviation code, but it didn't produce any better results. Should I look into enumerateRangesUsingBlock: on the index set? Would that give me two ranges, {(0,1), (12,3)}? Then I could pick the range with the longest length and create my avg value from that.
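If it helps, -enumerateRangesUsingBlock: does report the contiguous runs described above; a minimal sketch of "pick the longest run and average within it" (the helper name is mine):

```objc
#import <Foundation/Foundation.h>

// Hypothetical helper: midpoint of the longest contiguous run in an index set.
static double longestRunAverage(NSIndexSet *indexes) {
    __block NSRange longest = NSMakeRange(NSNotFound, 0);
    [indexes enumerateRangesUsingBlock:^(NSRange range, BOOL *stop) {
        if (range.length > longest.length) {
            longest = range; // keep the longest contiguous run
        }
    }];
    // The average of a contiguous run is simply its midpoint.
    return longest.location + (longest.length - 1) / 2.0;
}

int main(void) {
    @autoreleasepool {
        NSMutableIndexSet *indexes = [NSMutableIndexSet indexSet];
        [indexes addIndex:0];
        [indexes addIndexesInRange:NSMakeRange(12, 3)]; // 12, 13, 14
        // Runs are (0,1) and (12,3); longest is (12,3), midpoint 13.
        NSLog(@"%.1f", longestRunAverage(indexes));
    }
    return 0;
}
```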
I’m curious what your overarching goal here is. Measures of central tendency and variance are what they are. Typically one uses the mean, median, or mode for calculating central tendency. Standard deviation is a measure of variance which doesn’t really apply; closest you’ll get to that for central tendency is (perhaps) the geometric mean, but geometric means are only meaningful if all terms are greater than zero.
Arithmetic means do tend to get skewed by outliers. If that’s a problem for you, use the median instead, which is either the value of the point in the middle of the list (if it’s an odd count) or the average of the two center values (if it’s an even count). In this case, the median would be 12.5, which is closer to your intuition. But that’s simply a modeling choice: do you want to ignore or incorporate outlying values…
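A sketch of the median suggestion above (NSIndexSet enumerates in ascending order, so the values arrive pre-sorted; the helper name is mine):

```objc
#import <Foundation/Foundation.h>

// Median of the values in an index set; robust to outliers like the lone 0.
static double indexSetMedian(NSIndexSet *indexes) {
    NSMutableArray<NSNumber *> *values = [NSMutableArray array];
    [indexes enumerateIndexesUsingBlock:^(NSUInteger idx, BOOL *stop) {
        [values addObject:@(idx)]; // enumeration is ascending, so already sorted
    }];
    NSUInteger n = values.count;
    if (n == 0) return 0.0;
    if (n % 2 == 1) {
        return values[n / 2].doubleValue;       // odd count: middle value
    }
    return (values[n / 2 - 1].doubleValue +
            values[n / 2].doubleValue) / 2.0;   // even count: mean of the two middles
}

int main(void) {
    @autoreleasepool {
        NSMutableIndexSet *indexes = [NSMutableIndexSet indexSet];
        [indexes addIndex:0];
        [indexes addIndexesInRange:NSMakeRange(12, 3)]; // 12, 13, 14
        NSLog(@"%.1f", indexSetMedian(indexes)); // {0,12,13,14} → 12.5
    }
    return 0;
}
```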