Are DxOMark scores real?

Everyone has probably heard of DxOMark. It's an independent benchmark that scientifically assesses image quality on phones and cameras, and you see companies bragging about their DxOMark scores all the time. One criticism I've heard a lot is that DxOMark doesn't give the right scores because shooting photos in lab conditions doesn't translate to the real world. To address that first: DxOMark is actually really thorough. They take their phones indoors and outdoors, in the day and at night, and they test everything from zoom to bokeh to flash quality. Being detailed in their testing is not the issue here. So what is the problem? We can break it down into seven main things.

Number 7: smartphone features are evolving really quickly, and this test just isn't built for that. Take the Pixel 3. They scored this phone at 101. DxOMark factored in the new wide-angle front camera, better bokeh, and refined zoom, but there is more to it than that. The Pixel 3 has Top Shot, a really clever feature that lets you fix people's faces after a photo has been taken, and because this doesn't fall within their criteria, because it doesn't affect noise or texture or exposure, it isn't counted, even though it improves the overall camera experience. An even bigger crime is the exclusion of Night Sight, a feature so powerful that it takes low-light photos from completely average to some of the best you can get. To leave out a feature like this while at the same time praising the Galaxy Note 9 for its really good low-light photography makes the Pixel 3 review seem unrepresentative.

Then we've got withholding. If you look at the current league table, you'll notice quite an odd result: Huawei's P20 Pro is the current market leader, which is particularly strange because the company has released an improved version of it, the Mate 20 Pro. Flick through the list and you'll realise why: the Mate 20 Pro is nowhere to be found. The supposed reason for this is an interesting one: the Mate 20 Pro scores too highly. It seems Huawei has decided not to compete with itself. They'll stay at the top of the leaderboard until some company overtakes the P20 Pro, and then their plan is to release the Mate 20 Pro score and say, well, we're still on top. Clever strategy aside, the point here is that if this list can be adjusted by manufacturers who don't want their results to be public, then it's inherently biased. You're not seeing a representative league table, but only a list of phones whose manufacturers are happy with the results.
Then we've got the fact that evaluating cameras leads you into a lot of grey areas. It's quite easy to look at two photos, decide which one is more detailed, and give that phone a higher number. But what if you're analysing something more complex? How do you assign a number to the fact that Huawei's night mode takes better shots but needs you to hold the phone still for four whole seconds versus OnePlus's one second? How do you factor in that the Mate 20 Pro has a pretty incredible light-painting mode, even though it's a niche feature that not everyone will use? It's not entirely DxOMark's fault, but the very nature of trying to turn something this subjective into a single number is problematic. And unlike a lot of reviews that give a number at the end of their analysis just as a summary, this number is the only thing most people see. Say, for example, I was picking my phone based mainly on its video quality. Looking at DxOMark's scores, I'd be pushed towards the P20 Pro, which scores 109, whereas in actuality the iPhone 8, which only scores 92, takes better video and audio. Which leads me on to the next point: microphone quality isn't considered, and given that when it comes to video, audio is something like 30 percent of the experience, that feels like quite a major exclusion.

Okay, now is where the big problems start to come in. There is a conflict of interest. DxOMark's main business is consulting for companies: they come in and advise them on how to improve their cameras. The problem is that the very nature of this advice will be how to improve cameras in the ways that we think are important, in other words, how to do well on our tests. Companies can also purchase the DxO Analyzer, a massive and expensive package of hardware, software, and training that allows them to essentially simulate some of the tests DxOMark uses to determine scores, which is genuinely the equivalent of revising for a test when you already have the answers. OEMs that purchase this can effectively build their products around scoring well in these tests, which, as you can imagine, makes it massively unfair for those that don't.

And this causes a ripple effect. Because all these companies are fighting each other to score the highest in this benchmark, it could actually hinder the progress of smartphone cameras. Of course they will have to make their cameras better if they want to score higher, but the focus of those improvements will be weighted towards the areas that DxOMark cares about and away from newer, more exciting features.

The most important thing, though, and we've kind of alluded to this already, is that one number is not enough. The actual analysis and written text on the website is really thorough and well researched, but by giving companies one number they can use to summarise everything, you're effectively giving them a way to brag about their camera that tells a consumer very little about how good it might be for them. What I think would be more useful is for DxOMark to issue only subcategory scores. A company could then tell people their device is one of the best at handling exposure, or that it has one of the highest-rated zoom capabilities. This would remove the subjectivity that comes from trying to encompass everything in one overall number, and it would mean the numbers matter more to the end consumer. The analysis you'll find on the DxOMark website is some of the most thorough in the world; it's more the scoring system, I think, that's causing a lot of the problems we're seeing. You get weird results like the 2018 LG G7 being rated only as good as the 2015 Galaxy S6 Edge, or Xiaomi's Mi Mix 2S beating the iPhone 8 Plus even though its video and audio quality isn't even close. Stay tuned to CERadar for more such interesting updates.
