It’s a story as old as time: a community-based web platform is created and rules are set for it, but they are not equally enforced, and eventually a disparity forms between one side and another. Now, before the pitchforks come up at my neck, I’m not accusing anyone of anything, but rather suggesting that it is an inevitable trait of community-driven sites that rely on the content of other sites to end up with loose sets of rules that are ripe for imbalance.
The imbalance this article aims to point out is not necessarily anyone’s fault, and we condemn any obnoxious negativity attached to the community’s reactions to this report. Instead, let’s try to offer suggestions on how to open up the process and system behind Metacritic so that it may reform itself to be clearer and more consistent for the game industry it serves.
This has been a long time coming, Metacritic staff, so don’t expect people to forget the powerful impact your tool has had on the industry (not always for the better), which has primed gamers to be quite reactive to news of imbalance in your systems. Hopefully articles like this can provide the feedback needed to repair the somewhat broken scoring system in place at your website.
The Evil Within – 80 Metascore
*Based on 55 professional publications’ reviews.
The Evil Within – 76 Metascore
*Based on 21 professional publications’ reviews.
Problems discovered with this disparity upon first glance: Metacritic lists Game Informer’s The Evil Within review in the Xbox ONE section even though a PS4 logo is clearly visible at the top of the review, and the “filed under” section at the bottom includes “playstation 4” but not Xbox ONE. Sure, the Game Informer staff discloses that the game is also available on Xbox ONE among other platforms, but where exactly, between Game Informer posting this review and Metacritic feeding it into their Metascore formula for The Evil Within, was it decided that Game Informer’s score counts equally for the Xbox ONE version and the PlayStation 4 version? This is something that needs to become more transparent, and it is only one small example. If a website can upload one review and have its score applied to platforms it never played the game on, just how many reviews, and from what type of publications, should we expect on each side before we call it fair?
IGN also has a review, written by Lucy O’Brien, which displays “Reviewed on PlayStation 4 and Xbox One” rather than identifying with one console over the other. IGN’s reviews typically represent all platforms with a single score unless there is a major issue. Metacritic chose to count this 8.7 as an 87 on both the Xbox ONE side AND the PS4 side, making the exact formula Metacritic uses a bit unclear.
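To make the double-counting concrete, here is a minimal sketch of how one cross-platform review can feed two platform Metascores at once. Note the heavy caveats: Metacritic’s real formula is a weighted average with undisclosed per-publication weights, the unweighted mean below is only a stand-in, and every score in the lists except IGN’s 8.7 is hypothetical.

```python
# Sketch only: Metacritic's actual formula is a weighted average with
# undisclosed weights. We use an unweighted mean purely to illustrate
# how one review can land in two platforms' columns.

def to_100_scale(score, out_of=10):
    """Convert a publication's score (e.g. 8.7/10) to a 0-100 scale."""
    return round(score / out_of * 100)

def metascore(scores):
    """Unweighted stand-in for Metacritic's weighted average."""
    return round(sum(scores) / len(scores))

ign = to_100_scale(8.7)          # IGN's single cross-platform review -> 87

# Hypothetical per-platform score lists; the same 87 appears in both.
ps4_scores  = [80, 75, 90, ign]
xbox_scores = [70, 85, ign]

print(to_100_scale(8.7))         # 87
print(metascore(ps4_scores))     # 83
print(metascore(xbox_scores))    # 81
```

The same single review nudges both platform averages, which is exactly why it matters how Metacritic decides which column a review belongs in.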
AusGamers has a 10/10 review, which adds a 100 to the Xbox ONE version’s column; according to Metacritic, this is a website that consistently scores 60% higher than the average publication. Let’s not forget that the figurehead of AusGamers is married to EA’s Sydney, Australia PR Coordinator, and that this very same website gave SimCity a 9.4/10 amid some of the most extreme criticism in gaming history, paired with customer service failures that even EA admitted fault for on numerous occasions. (Read on about the AusGamers / EA connection on Reddit here.)
It also depends on how many review copies the publishers give out for each platform. For example, the X1 version of Call of Duty: Advanced Warfare has a score of 84 with 33 reviews so far, while the PS4 version has 83 with 22. So how accurate can Metacritic really be in the grand scheme of things without a more transparent and universal scoring system? Is it because the game is better on the Xbox ONE? We may never know unless Metacritic makes its system more balanced and transparent.
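The review-count gap matters for a simple statistical reason: the fewer reviews in a platform’s pool, the more a single outlier review swings the average. The sketch below borrows only the article’s review counts (33 vs. 22); the individual scores and the 40-point outlier are hypothetical, and a plain mean stands in for Metacritic’s undisclosed weighted formula.

```python
# Why a smaller review pool is more volatile: one harsh outlier review
# moves a 22-review mean noticeably more than a 33-review mean.
# Scores are hypothetical; only the pool sizes come from the article.

def mean(scores):
    return sum(scores) / len(scores)

large_pool = [84] * 33           # hypothetical: 33 reviews averaging 84
small_pool = [83] * 22           # hypothetical: 22 reviews averaging 83

outlier = 40                     # one harsh review lands on each platform

swing_large = mean(large_pool) - mean(large_pool + [outlier])
swing_small = mean(small_pool) - mean(small_pool + [outlier])

print(round(swing_large, 2))     # 1.29 point drop with 33 reviews
print(round(swing_small, 2))     # 1.87 point drop with 22 reviews
```

So even before asking which reviews get counted where, the platform with fewer counted reviews is inherently easier to move, which compounds the transparency problem.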
More to come soon on this… until then, feel free to do your own research and continue checking and double-checking big journo corps for corrupt / inaccurate reporting. Real Gamers Get Their News Real, Keep it Real, Keep it Gaming, Peace.