Oh no I missed this.
Yep same :/
ComG orders charts go live on Sunday (/Monday morning in Asia). Since it is a big data point, predictions are closed right before.
Deadlines should be on Mondays; I never check forums on the weekend.
Isn't this one week earlier than usual btw?
We usually open the predictions early in the week tho.
I could have waited another week for the new releases (since Mario and Demon's Slayer didn't release yet), but since I included the monthly Switch hardware numbers, I needed to close the submissions now.
January: "you can make and edit your prediction on the following titles until January 9th 22:59 CEST [...] The tracked period will be January 3rd to January 30th."
February: "you can make and edit your prediction on the following titles until February 6th 22:59 CEST [...] The tracked period will be January 31st to February 27th."
March: "you can make and edit your prediction on the following titles until March 6th 22:59 CEST [...] The tracked period will be February 28th to April 3rd (5 weeks)."
April: "you can make and edit your prediction on the following titles until April 10th 23:59 CEST [...] The tracked period will be April 4th to May 1st (4 weeks)."
June: "you can make and edit your prediction on the following titles until June 5th 23:59 CEST [...] The tracked period will be June 6th to July 3rd (4 weeks)."
Not that I'm complaining, though; I think it might be better the way it is now.
Oh yeah that's right. Oh well, next month I'll try to check earlier.
Woah, I didn't know I predicted the exact same value as someone else for DS, despite it not being a nice-looking number.
Yeah, that's a funny coincidence.
With such % errors, I think ranking well / badly this month will have twice the impact of ranking well / badly in other months, if it were used in an aggregate?
That's a good call, I'll see what I can do
If it helps, I'm thinking of scoring relative to the median. Properties of this score:
- neutral to the difficulty of the month
- neutral to any outlier scoring (a super-lucky predictor who jackpots everything, or a low-effort predictor)
- outliers get more heavily rewarded / punished (if a particular month is so easy that the median error is 8% while you get 32%, you really screwed up badly that month and should be punished more, which the current aggregation method doesn't do)
So for June, the top predictor, median predictor and lowest predictor scores are:
38.9 / 49.9 = 0.78
49.9 / 49.9 = 0 (always 0)
96.6 / 49.9 = 1.94
Respectively for two other examples:
April = 0.74, 0, 3.28
March = 0.41, 0, 2.46
The aggregate ranking will just be the average of these scores, with the lowest score ranking highest and vice versa. Individual months' rankings are completely unaffected.
The median prediction is always 1.
OH @#$%!
I'd suggest the inverse of this (so 49.9 / 38.9 = 1.28 and 49.9 / 96.6 = 0.52), so that the No.1 predictor gets the highest score and the worst predictor the lowest. Because forgetting to predict is beneficial if the goal is score minimisation instead of score maximisation.
I think I am just going to readjust the numbers by subtracting the #1 predictor's error margin from everyone else's.
Simple, but it should work pretty well imho.
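For reference, the median-relative scheme and the two tweaks discussed above can be sketched in Python. Only the June top / median / worst error margins (38.9 / 49.9 / 96.6) come from the thread; the predictor names, the other error values, and the second month's scores are invented for illustration.

```python
from statistics import median, mean

# Hypothetical per-month error margins (%) keyed by predictor name.
# B and D are made up; A, C, E match the thread's top/median/worst for June.
june_errors = {"A": 38.9, "B": 42.0, "C": 49.9, "D": 60.3, "E": 96.6}

def median_relative_scores(errors):
    """Score each predictor as (own error / median error).
    The median predictor always scores 1; lower is better."""
    m = median(errors.values())
    return {name: err / m for name, err in errors.items()}

def inverse_scores(errors):
    """Inverse variant: (median error / own error), so the best predictor
    gets the highest score and skipping a month (huge error) stops
    being beneficial under score minimisation."""
    m = median(errors.values())
    return {name: m / err for name, err in errors.items()}

def subtract_top_scores(errors):
    """Simple adjustment: subtract the #1 predictor's error margin
    from everyone's, so the monthly winner scores 0."""
    best = min(errors.values())
    return {name: err - best for name, err in errors.items()}

scores = median_relative_scores(june_errors)
print(round(scores["A"], 2))  # 0.78
print(round(scores["E"], 2))  # 1.94

# Aggregate across months: average each predictor's monthly scores;
# the lowest average ranks highest. July's scores are invented.
july_scores = {"A": 1.10, "B": 0.95, "C": 1.00, "D": 0.80, "E": 1.30}
overall = {n: mean([scores[n], july_scores[n]]) for n in scores}
```

All three variants preserve the individual monthly rankings (they are monotone in the error margin); they only differ in how the numbers combine across months.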
I'd like to thank the academy for this first place finish.