avgcrtckr wrote:Dear fellow Critickers! If you are wondering, here are the 10 most average Critickers (whatever that means) so far, among those who have rated at least 166 films (10% in common with avgcrtckr): Frink anime salve* ZayanK TonythePony cxemu789 DirktheJerk dmarques understamp captainjazz Nroo
I had also created anime salve before the averages utility, and I was surprised by the result (the similarity).
If you wonder how you (we) compare with imdb: imdb: TCI 2.3761 (not in my top 1000 TCIs); imdb-byvotes: TCI 1.3814 (my 412th best TCI).
filmaffinity (another one that I had created long ago) seems closer than imdb, with a TCI of 1.1763 (64th best TCI)
It looks as if we are a different bunch, and that is what makes it all fun.
movieboy wrote:If you have just 10 distinct ratings, i.e. 1, 2, 3 .... 10, I think it should automatically fall into 10 different tiers.
No, this is wrong.
As a simple example, consider someone who ranked 10 films whose average tier is 7.0, with half being 7.5 or higher and the other half being 7.4 or below.
Well, in that case, a score of 7.0 would be either Tier 5 or Tier 6, NOT Tier 7. So yes, as djross correctly pointed out, in order for avgcrtckr to work, he would have to calculate what a tier of 7.7 would mean for THEIR rankings, not simply translate it to 77/100.
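The arithmetic behind this can be sketched in code. This is only an illustration of percentile-based tiers in general; the `tier` function and its midrank rule are my assumptions, not Criticker's published formula:

```python
# Illustration only: percentile-based tiers. A score's tier depends on
# where it sits in the USER'S own rating distribution, not on the raw
# number. The midrank rule here is an assumption, not Criticker's formula.

def tier(score, ratings, n_tiers=10):
    """1-based tier of `score` among this user's ratings (deciles by default)."""
    below = sum(r < score for r in ratings)
    equal = sum(r == score for r in ratings)
    # midrank percentile, kept in exact integer arithmetic
    t = n_tiers * (2 * below + equal) // (2 * len(ratings)) + 1
    return min(t, n_tiers)

# The example above: 10 films averaging 7.0, half rated 7.5 or higher,
# half 7.4 or below.
ratings = [7.5, 7.5, 7.8, 8.0, 8.2, 5.0, 5.8, 6.2, 7.0, 7.0]
print(tier(7.0, ratings))  # -> 5: a score equal to the average sits
                           #    mid-distribution, nowhere near tier 7
```

So a 7.0 lands in Tier 5 under this (assumed) rule, exactly the situation described above.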
About tiers: It is both the strength and the weakness of Criticker. I think the guys should revise their calculation methods. We do not always vote in neatly classifiable tiers (equally distributed ratings?!). (A note for the Criticker managers: have you ever thought of revisiting your statistics courses? A small hint: the normal, or Gaussian, distribution could help, but not solve the equation.)
May the cinema survive. May you watch the best of movies.
(About avgcrtckr tiers: they obviously do not match the average of all Criticker users; I am sure you have understood by now why not.)
avgcrtckr wrote:About tiers: It is the strength and the weakness of the Criticker. [...]
(A note about ratings: statistically speaking, voting out of 10 or out of 100 would not change anything.)
TCI: 1.5476 | Films in Common: 913 | Your 10th Best TCI
Oh no, I'm so predictable! Well, I may be average, but you know... Criticker consists of intellectual moviegoers, so that doesn't sound bad! Whatever!
movieboy wrote:He should be ranking 7.75 as 8 instead of 78. All films should be ranked as 1, 2, 3 .... 10. So the film will automatically fall into the right tier.
Surely that will still not be "automatic" but on the contrary depend on the distribution of scores. The way to get it to work out "right" would be to control the distribution to match the average tier, i.e., for user avgcrtckr to rank the same number of films for each unit difference of average tier. But this would most likely not be possible for any large number of rankings, unless films with very few rankings are used, because there just are not enough films with a tier 10 average ranking.
The only way to do it would be to rank all of the "Very Obscure" films as either 100 or 0, and even then it probably wouldn't be worthwhile. Even though The Godfather averages around tier 8.8, it's in more users' tier 10s than almost any other movie. I don't think avgcrtckr should artificially make it into a rounded tier 9 film.
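A rough numerical sketch of why rounding averages to whole tiers fails. All numbers here are made up for illustration (a bell-shaped spread of site-wide averages is my assumption): because averages cluster in the middle, a film averaging 8.8 still lands in the top decile of a user who rates every film with its site-wide average.

```python
import random

# Hypothetical data: 1000 site-wide average scores clustered around the
# middle, the way real rating averages tend to be. Purely illustrative.
random.seed(0)
averages = [min(10.0, max(1.0, random.gauss(6.5, 1.2))) for _ in range(1000)]

def tier(score, ratings, n_tiers=10):
    """Decile-style tier: which tenth of the distribution `score` falls in."""
    below = sum(r < score for r in ratings)
    return min(n_tiers * below // len(ratings) + 1, n_tiers)

# A film averaging 8.8 outranks nearly every other average, so it comes
# out as a tier-10 film for this hypothetical averages-only user:
print(tier(8.8, averages))   # -> 10
# ...while a 7.75 does NOT simply become "tier 8"; its tier depends on
# how many averages fall below it, not on rounding:
print(tier(7.75, averages))
```

Under these made-up numbers, an 8.8 average is a tier-10 film, which matches the Godfather point above: very few films average that high, so it sits at the top of the distribution.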
Yeah, I'd really love just being able to define which of my ratings fall into which tier. Doing it automatically is a useful default, but the way it's set up now makes me feel like I'll never get things quite right unless I watch a lot of terrible movies...