Media Create Prediction League: June 2022 [FULL RESULTS!! + OVERALL RANKING]

[NSW] Mario Strikers (3 days) - 71,111
[NSW] Demon Slayer (4 days) - 86,666
[NSW] Fire Emblem Warriors: Three Hopes (3 days) - 79,999
[NSW] Monster Hunter Rise + Sunbreak Set (4 days) - 182,222
[NSW] Nintendo Switch hardware (4 weeks) - 265,555
 
[NSW] Mario Strikers (3 days) - 75,000
[NSW] Demon Slayer (4 days) - 96,000
[NSW] Fire Emblem Warriors: Three Hopes (3 days) - 90,000
[NSW] Monster Hunter Rise + Sunbreak Set (4 days) - 227,000
[NSW] Nintendo Switch hardware (4 weeks) - 248,000
 
[NSW] Mario Strikers (3 days) - 74,000
[NSW] Demon Slayer (4 days) - 93,000
[NSW] Fire Emblem Warriors: Three Hopes (3 days) - 94,000
[NSW] Monster Hunter Rise + Sunbreak Set (4 days) - 220,000
[NSW] Nintendo Switch hardware (4 weeks) - 263,000
 
[NSW] Mario Strikers (3 days) - 75,215
[NSW] Demon Slayer (4 days) - 90,060
[NSW] Fire Emblem Warriors: Three Hopes (4 days) - 86,100
[NSW] Monster Hunter Rise + Sunbreak Set (4 days) - 174,983
[NSW] Nintendo Switch hardware (4 weeks) - 277,030
 
[NSW] Mario Strikers (3 days) - 85,000
[NSW] Demon Slayer (4 days) - 80,500
[NSW] Fire Emblem Warriors: Three Hopes (4 days) - 120,000
[NSW] Monster Hunter Rise + Sunbreak Set (4 days) - 210,000
[NSW] Nintendo Switch hardware (4 weeks) - 290,000
 
[NSW] Mario Strikers (3 days) - 73,000
[NSW] Demon Slayer (4 days) - 67,000
[NSW] Fire Emblem Warriors: Three Hopes (4 days) - 115,000
[NSW] Monster Hunter Rise + Sunbreak Set (4 days) - 185,000
[NSW] Nintendo Switch hardware (4 weeks) - 258,000
 
[NSW] Mario Strikers (3 days) - 57,412
[NSW] Demon Slayer (4 days) - 101,037
[NSW] Fire Emblem Warriors: Three Hopes (4 days) - 124,903
[NSW] Monster Hunter Rise + Sunbreak Set (4 days) - 200,500
[NSW] Nintendo Switch hardware (4 weeks) - 266,000
 
I forgot to predict. Why am I so stupid...

Anyway, here are the median predictions (min <> max):

[NSW] Mario Strikers: Battle League Football - 80,000 (57,412 <> 125,000)
[NSW] Demon Slayer: Kimetsu no Yaiba - The Hinokami Chronicles - 93,000 (55,000 <> 133,000)
[NSW] Fire Emblem Warriors: Three Hopes - 96,500 (65,578 <> 135,000)
[NSW] Monster Hunter Rise + Sunbreak Set - 202,000 (150,000 <> 333,000)
[NSW] Nintendo Switch hardware (4 weeks) - 259,620 (229,000 <> 301,000)
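For reference, a minimal sketch of how each consensus line can be derived from the individual submissions (the list below reuses the Mario Strikers predictions quoted above as a sample):

```python
from statistics import median

# Sample: one member's Mario Strikers prediction per entry, as posted above.
strikers_predictions = [71_111, 75_000, 74_000, 75_215, 85_000, 73_000, 57_412]

consensus = median(strikers_predictions)          # the "median prediction"
low, high = min(strikers_predictions), max(strikers_predictions)
print(f"{consensus:,.0f} ({low:,} <> {high:,})")  # → 74,000 (57,412 <> 85,000)
```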

The most bullish predictor of the month is @Nocturnal, expecting no less than 932,000 units sold across all predictions.
The most bearish predictor of the month is @kamaitachi, expecting only 624,000 units sold across all predictions.

I'm not sure if they did this on purpose, but @PillFencer's predictions sum to 666,666 units sold across all predictions.

@Grady's predictions are closest to the median (in terms of %).
@Kenka's predictions are farthest away from the median.
 
It was indeed on purpose ⛧ (just because I found that my initial total was already very close to it so why not)

So InstallBase expects:
- Same opening for MS Battle League and MG Super Rush
- Demon Slayer Switch matching the PS4 version at launch (or down 19% vs. PS4+PS5)
- FEW Three Hopes up 55% compared to the original (or +138% without the 3DS SKU)
- Again same opening for MH Rise + Sunbreak Set and MH World - Iceborne ME
- 3rd best June for Switch hardware (down 17% vs. 2021, down 7% vs. 2020, up 31% vs. 2019)
 
ComG orders charts go live on Sunday (/Monday morning in Asia). Since it is a big data point, predictions are closed right before.

We usually open the predictions early in the week tho.
Isn't this one week earlier than usual btw?

January: "you can make and edit your prediction on the following titles until January 9th 22:59 CEST [...] The tracked period will be January 3rd to January 30th."
February: "you can make and edit your prediction on the following titles until February 6th 22:59 CEST [...] The tracked period will be January 31st to February 27th."
March: "you can make and edit your prediction on the following titles until March 6th 22:59 CEST [...] The tracked period will be February 28th to April 3rd (5 weeks)."
April: "you can make and edit your prediction on the following titles until April 10th 23:59 CEST [...] The tracked period will be April 4th to May 1st (4 weeks)."

June: "you can make and edit your prediction on the following titles until June 5th 23:59 CEST [...] The tracked period will be June 6th to July 3rd (4 weeks)."

Though I'm not complaining, I think it might be better how it is now.
 
Isn't this one week earlier than usual btw?
I could have waited another week for the new releases (since Mario and Demon's Slayer didn't release yet) but since I included the monthly Switch hardware numbers, I needed to close the submissions now.
 
We usually open the predictions early in the week tho.
Oh yeah that's right. Oh well, next month I'll try to check earlier.
 
Demon Slayer & Mario Strikers results
[NSW] Mario Strikers: Battle League Football (3 days)

After a long hiatus, the comeback of the Mario Strikers series was widely overpredicted by our members, by around 50,000 units. Expectations were in line with Mario Golf, but the game seems to have performed more in line with past Strikers titles.

Congrats to @MarcoP90, who topped both the units and % error margin rankings with his 57,412 prediction, which still carries a 78.4% margin of error. He was very closely followed by @Welfare and @PillFencer.

[NSW] Demon Slayer (4 days)

The late port of Demon Slayer to Switch raised several questions because of the original's lackluster performance in Japan. However, it became pretty clear that there was a big underserved audience that was just waiting for the game to come to Switch. As such, the game fell very much in line with IB's expectations, only 2,000 units below the IB consensus.

Big congrats to both @Deluxury and @VengefulMagus, who topped both the units and % error margin rankings with their exceptional 90,060 prediction, only 825 units away from the target, which is a 0.9% margin of error. They were very closely followed by @TJTree, who had an excellent 1.0% error margin. We had 13 members below a 5% error margin, so big congrats!
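The error margins quoted here are consistent with a simple relative error. A minimal sketch, assuming the margin is |prediction − actual| / actual and that the 90,060 prediction overshot the target by the stated 825 units (the exact actual figure is inferred, not given in the thread):

```python
def error_margin(prediction: float, actual: float) -> float:
    """Relative error of a prediction versus the actual figure, in percent."""
    return abs(prediction - actual) / actual * 100

# Demon Slayer: a 90,060 prediction said to be 825 units off the target.
print(round(error_margin(90_060, 90_060 - 825), 1))  # → 0.9
```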

Thanks again for the good turnout this month with around 45 members!
 
Gonna be my worst prediction month if OLED stock remains that bad, being 3rd on Mario Strikers won't help much 💀

Last time I remember a game being predicted so wrong by every member was maybe Call of Duty Vanguard?
 
FEW:3H results
[NSW] Fire Emblem Warriors: Three Hopes (4 days)

Almost 5 years after the original Fire Emblem Warriors, Three Hopes managed to handily outsell its predecessor while matching our members' expectations. Indeed, the Install Base consensus is merely 1,000 units away from the true result. Big congrats to everybody! :)

Congrats also to @YuriCastro and @Lelouch0612, who topped both the units and % error margin rankings with their excellent 96,500 prediction, same as the median, which is a very small 1.1% margin of error. They were very closely followed by @Chris1964, who was only 1.3% away.

In total, 13 members were below 5%, so big congrats to them!
 
Sunbreak & Switch Hardware results
Monster Hunter Rise: Sunbreak Set

Well, no comment there. The closest were @Malden, @Blue Monty and @Tokuiten (in that order) but everyone was very far off the mark.

Nintendo Switch Hardware (4 weeks)

We have 5 very close predictors this time, thanks to their 250k estimate (so weekly sales of 62,500 units on average). Huge congrats to @Crjunior35, @Smelly, @Yeshua, @BTB123 and @Ghostsonplanets! They were less than 400 units off (0.1% error margin).

In total, 24 (!!) members were below 5%, so big congrats to them! Seems like IB is better at predicting hardware than software :p
 
July will be an exciting month, don't forget to predict !
 
Wait what!

Maybe I should trust my interpretation of Amazon rankings more. I saw patterns pointing to very low Sunbreak and Strikers numbers, but I was just like "nah, it can't possibly go that low".
 
With such % errors, I think ranking well or badly this month will have twice the impact of ranking well or badly in other months, if it were used in an aggregate?
 
That's a good call, I'll see what I can do ;)

If it helps, I'm thinking of scoring relative to the median.
Properties of this score:
- neutral to the difficulty of the month
- neutral to any outlier scoring (a super lucky predictor who jackpots everything, or a low-effort predictor)
- gets more heavily rewarded / punished as an outlier (if a particular month is so easy that the median is an 8% error while you get 32%, you really screwed up badly that month and should be punished more, which the current aggregation method doesn't do)

So for June, the top predictor, median predictor and lowest predictor score is:
38.9 / 49.9 = 0.78
49.9 / 49.9 = 0 (always 0)
96.6 / 49.9 = 1.94

Respectively for 2 other examples:
April = 0.74, 0, 3.28
March = 0.41, 0, 2.46

Aggregate ranking will just be the average of these scores, with lowest score ranking the highest and vice versa.
Individual months' rankings are completely unaffected.
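The proposal above can be sketched directly, using June's quoted top / median / worst error margins as input. Note that dividing the median error by itself yields 1, not 0:

```python
from statistics import median

# June error margins (%) for the top, median and worst predictors, as quoted.
june_margins = [38.9, 49.9, 96.6]
month_median = median(june_margins)

# Score = your error relative to the month's median error:
# below 1 means you beat the median, above 1 means you trailed it.
scores = [round(m / month_median, 2) for m in june_margins]
print(scores)  # → [0.78, 1.0, 1.94]
```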
 
49.9 / 49.9 = 0 (always 0)
The median prediction is always 1 ;)

I'd suggest the inverse of this (so 49.9 / 38.9 = 1.28 and 49.9 / 96.6 = 0.52), so that the No.1 predictor gets the highest score and the worst predictor the lowest score. Because forgetting to predict is beneficial if the goal is score minimisation instead of score maximisation.
 
I'd suggest the inverse of this (so 49.9 / 38.9 = 1.28 and 49.9 / 96.6 = 0.52), so that the No.1 predictor gets the highest score and the worst predictor the lowest score.
I think I am just going to readjust the numbers by subtracting the #1 predictor's error margin from everyone else's.

Simple but should work pretty well imho.
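That readjustment is trivial to express. A sketch, reusing June's quoted margins as a sample input; the monthly winner lands at 0 and everyone else at their gap to the winner:

```python
def readjust(margins: dict[str, float]) -> dict[str, float]:
    """Subtract the #1 predictor's (lowest) error margin from every margin."""
    best = min(margins.values())
    return {name: round(m - best, 1) for name, m in margins.items()}

print(readjust({"top": 38.9, "median": 49.9, "worst": 96.6}))
# → {'top': 0.0, 'median': 11.0, 'worst': 57.7}
```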
 
I think I am just going to readjust the numbers by subtracting the #1 predictor's error margin from everyone else's.

Yes, I guess it should work well enough if the aggregate ranking continues as it is, excluding anyone who missed any month. The mind-blowing part only comes when we want to solve even more problems, such as an aggregate ranking that can fairly include people who missed a particular month. There's somehow no perfect solution for that.
 
Mid-year results

As the first half of the year has ended, it is time to publish the current overall ranking. As mentioned earlier, we took the top predictor of each month as the reference point, meaning the error margin shown below is the difference between your error margin and the #1 error margin.

Over the 5 months predicted, I took the results of the 4 best months. The "removed" month is shown in red.
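The "best 4 of 5" aggregation can be sketched as: drop the worst month's readjusted margin and average the rest. The figures below are made up for illustration, not taken from the actual chart:

```python
def aggregate_margin(monthly: list[float], kept: int = 4) -> float:
    """Average of the `kept` best (lowest) monthly error margins."""
    return sum(sorted(monthly)[:kept]) / kept

# Hypothetical readjusted margins for the 5 months; the worst (40.0) is dropped.
print(aggregate_margin([5.0, 8.0, 12.0, 6.0, 40.0]))  # → 7.75
```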

Big congrats to @PillFencer, who is leading the chart: no wins, but very well positioned in February, April and June.
@Deluxury is #2 thanks to a consistently good level of predictions. @DarkDetective completes the podium despite forgetting to predict in June.

The IB consensus is very well positioned too, with a sub-10% gap ;) 6 members below 10% is also great.

Congrats everyone and let's shake this chart in the following months, starting in July !

 
Made me chuckle that 86% of IB predictors are below the median, AKA IB's consensus. I assume the factor lies in some people being unable to remove their worst month.

Compared to the previous aggregate ranking, I thought I would be kicked off the podium, since removing the worst month doesn't seem to be a big deal for me.
 