In the past few years, it seems as if the unveiling of new phones has been accompanied by a new name in specification bragging rights. If you follow smartphone announcements or if you have seen any number of introductory headlines in passing, there is a fair chance you have heard the name DxOMark.

Smartphone makers often tout a numerical score as the best ever given by DxOMark, but what do these numbers mean? For the sake of this article, we will talk specifically about their process for testing mobile cameras.

Real metrics, subjective weight

For their mobile scores, DxOMark generates an overall score from a variety of sub-scores. These sub-scores for still photos consist of exposure and contrast, color, autofocus, texture, noise, artifacts, flash, zoom, and bokeh. For video, the sub-scores are exposure and contrast, color, autofocus, texture, noise, artifacts, and stabilization.

Most of these sub-scores are self-explanatory, but a few may need a brief explanation. The artifacts sub-score, for example, covers softness, distortion, vignetting, chromatic aberration, ringing, flare, ghosting, aliasing, and more.

So how are these sub-scores determined? DxOMark tests mobile cameras in their default mode (not RAW), both objectively and subjectively, the latter being what DxOMark calls perceptual evaluation. Photos are taken in a testing lab equipped with what DxOMark describes as test targets, lighting systems, light-boxes, light meters, telemeters, spectrometers, and more. Additionally, more than fifty tests are completed using “real-life” indoor and outdoor scenes.

While we do not have full details on how the results of these tests are translated into numerical scores, there are a few important points to note. The total score is clearly not a simple average of the sub-scores: DxOMark weights the sub-scores according to what it believes to be the most important factors in camera quality. The catch is that while the folks at DxOMark are certainly experts in this field, a certain level of subjectivity is involved that must be noted and considered.
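To make the distinction concrete, here is a minimal sketch of how a weighted overall score differs from a plain average. The sub-score names come from the article, but the weights and the combination rule are invented for illustration; DxOMark does not publish its actual formula.

```python
# Hypothetical weighted-score sketch. Sub-score categories are from the
# article; the weight values below are made up for illustration only.

PHOTO_WEIGHTS = {
    "exposure_and_contrast": 2.0,
    "color": 1.5,
    "autofocus": 1.5,
    "texture": 1.0,
    "noise": 1.0,
    "artifacts": 1.0,
    "flash": 0.5,
    "zoom": 0.5,
    "bokeh": 0.5,
}

def weighted_score(sub_scores: dict[str, float],
                   weights: dict[str, float]) -> float:
    """Combine sub-scores into one number using the given weights."""
    total_weight = sum(weights[name] for name in sub_scores)
    return sum(sub_scores[name] * weights[name]
               for name in sub_scores) / total_weight

# With equal weights this collapses to a plain average; unequal weights
# let some categories count for more, which is why a published total
# need not match the mean of the sub-scores.
sub = {name: 75.0 for name in PHOTO_WEIGHTS}
sub["exposure_and_contrast"] = 95.0
print(round(weighted_score(sub, PHOTO_WEIGHTS), 1))  # → 79.2
```

Note that the plain average of those same sub-scores would be about 77.2, so the choice of weights alone moves the headline number, which is exactly where the subjectivity creeps in.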

It is also important to note that DxOMark offers to work with camera manufacturers, for a fee, in an effort to help them improve their image quality. This does not necessarily mean corruption is inevitable, but these manufacturers may be more inclined to boost their DxOMark score than to improve image quality according to their own preferences.

This is not necessarily a bad thing. The folks at DxOMark are professionals, and their expertise and experience should not be ignored. But image quality involves so many variables that the possible personal opinions and preferences are innumerable. Put more simply: there are bad photos and there are excellent photos, but there is no such thing as a perfect photo.


All in all, DxOMark scores are a generally helpful tool for tracking the quality of smartphone cameras. However, I urge you to take these scores with a grain of salt and remember that while a great deal of image quality can be measured quantitatively, there are qualitative factors that must be considered as well.

If you feel insulted or disappointed that your favorite phone scored lower than another similar phone, remember that these scores are not set in stone, and that your personal preference is more valuable than a score decided by imperfect procedures.

So, the next time you see a high DxOMark score for the latest hot new phone, remember that while the camera is almost certainly excellent, a high score is no guarantee of superlative quality.

What is your favorite smartphone camera? Let us know in the comments below.

  • AOGV

This company will work with manufacturers for a fee yeah okay what a joke. score doesn’t mean anything.

    • Jurassic

      In situations where test results have been manipulated or influenced by a product manufacturer’s payment to the testing facility and the manufacturer providing its own controlled sample to the testing facility… Yes, you are right, that “score doesn’t mean anything”.

  • Jurassic

    I would take the DxOMark’s review of the Pixel 2 with a grain of salt.

    To be fully objective, reviews of technology products SHOULD be based on a random purchase of the product (after it goes on sale to the public). It should NOT be a review on a particular pre-release sample provided by the manufacturer, who obviously has control over the sample they provide for review.

    Last year, DxOMark reviewed the Pixel 1, claiming it to be the best smartphone camera at the time. But DxOMark’s review of the Pixel 1 was (coincidentally) published on the very same day that Google introduced the new Pixel 1 phones, and well before the products were available to the public.

This year (just like last year), DxOMark reviewed the Pixel 2, claiming it again to be the best smartphone camera (with a 99% “perfect” rating for photography). But DxOMark’s review of the Pixel 2 was also (implausibly coincidentally) published AGAIN on the very same day that Google introduced the new Pixel 2 phones, and again well before they were available to the public.

This doubly “coincidental” timing on the same day as Google’s product introductions BOTH years is very dubious. It indicates that the release of the two reviews was almost certainly done in coordination with Google (the manufacturer of the product that DxOMark reviewed), meaning that the review was NOT a completely objective one.

    And more seriously, the fact that the review was completed well before Google’s product introduction, in order to release the reviews on the very same day as the product introductions (two years in a row!), means that Google obviously provided the pre-release samples of their own choosing (and in Google’s control) to DxOMark both years.

    These facts make DxOMark’s reviews, and that organization’s ethics, very questionable.