Huawei Mate 50 Pro Selfie test

We put the Huawei Mate 50 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 13MP sensor
  • f/2.4-aperture lens
  • up to 4K video
  • 3D Depth Sensing Camera

Scoring

Sub-scores and attributes included in the calculations of the global score.


Huawei Mate 50 Pro
DXOMARK Selfie score: 145

Photo score: 147 (Best)
  • Exposure: 92 (Best)
  • Color: 97 (best: 105)
  • Focus: 100 (Best)
  • Texture: 75 (best: 79)
  • Noise: 94 (Best)
  • Artifacts: 80 (best: 89)
  • Flash: 93 (Best)
  • Bokeh: 65 (best: 80)

Video score: 143 (best: 154)
  • Exposure: 82 (best: 86)
  • Color: 85 (best: 90)
  • Focus: 90 (best: 92)
  • Texture: 75 (best: 97)
  • Noise: 72 (best: 83)
  • Artifacts: 74 (best: 92)
  • Stabilization: 77 (best: 82)

Pros

  • Nice skin tones in photos
  • Good exposure and wide dynamic range in photos and videos
  • Good detail and very low noise levels
  • Subjects always in focus, even in group shots
  • Video noise well under control in bright light and indoors
  • Accurate white balance and nice colors
  • Wide depth of field in videos

Cons

  • Local loss of texture is visible in some conditions, especially in low-light images
  • Occasionally desaturated skin tones visible in backlit scenes
  • Artifacts such as color quantization and local loss of facial texture in photos
  • Coarse noise and lack of detail in low-light videos
  • Occasionally low contrast in videos
  • Slightly inaccurate skin tones in some conditions
  • Video artifacts, including local movement of texture, color quantization, and ghosting

With a DXOMARK Selfie score of 145, the Huawei Mate 50 Pro reaches the No. 1 spot in our front camera ranking. Except for the device’s fixed-focus front camera, the Mate 50 Pro comes with most of the same hardware specifications as the Huawei P50 Pro. Despite the hardware similarities, the Mate 50 Pro’s performance was an improvement over the P50 Pro thanks to new software solutions and refined tuning.

For video, the Mate 50 Pro was tested at 4K resolution at 30 frames per second, with Vivid HDR mode enabled. Vivid HDR is a new HDR video format that is currently supported by a range of phones and TVs. In our tests the Mate 50 Pro delivered a very consistent still image performance, achieving top scores in several test categories, including exposure, focus, noise and flash. That said, the Huawei Mate 50 Pro’s overall video results were good but not outstanding, mainly due to slightly low contrast levels and a low-light performance that left some room for improvement.

Accurate target exposure and wide dynamic range, nice colors, and good detail

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Huawei Mate 50 Pro Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.
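As a side note, the segment statistics referenced here can be reproduced with a few lines of code, assuming a simple in-memory list of tested devices; the device names, segments, and scores below are illustrative placeholders rather than DXOMARK data.

```python
# Illustrative sketch: average and maximum score of a price segment
# computed from a device database. All entries are made-up placeholders.
from statistics import mean

database = [
    {"device": "Phone A", "segment": "Ultra-Premium", "selfie": 145},
    {"device": "Phone B", "segment": "Ultra-Premium", "selfie": 142},
    {"device": "Phone C", "segment": "Premium", "selfie": 131},
]

def segment_stats(db, segment, score_key="selfie"):
    scores = [entry[score_key] for entry in db if entry["segment"] == segment]
    return {"average": mean(scores), "maximum": max(scores)}

print(segment_stats(database, "Ultra-Premium"))
# -> {'average': 143.5, 'maximum': 145}
```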

Photo

Huawei Mate 50 Pro: 147 (Best)

Huawei Mate 50 Pro Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. The range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

When shooting still images, the Huawei Mate 50 Pro delivered very solid results across almost all test categories. Target exposure was accurate, with a wider dynamic range than the competition. Images showed nice colors, but our testers observed a slight green cast in overcast conditions and slightly desaturated skin tones in scenes with strong backlighting. A wide depth of field meant all subjects tended to be in focus in group shots. In addition, the Huawei did very well in terms of texture/noise, rendering very fine detail nicely and limiting noise in most conditions. This said, texture rendering was occasionally somewhat unstable across consecutive shots, mainly in difficult low-light scenes.

Exposure

Huawei Mate 50 Pro: 92 (Best)

Color

Huawei Mate 50 Pro: 97 (Best: Google Pixel 7 Pro, 105)

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Huawei Mate 50 Pro – wide dynamic range, slightly bright face exposure, desaturated skin tones
Apple iPhone 14 Pro – limited dynamic range, slightly dark face exposure, slightly inaccurate skin tones
Google Pixel 7 Pro – limited dynamic range, good face exposure, slightly inaccurate skin tones
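To give a sense of how face brightness and its repeatability can be quantified, here is a minimal sketch that averages the lightness of a fixed face region over consecutive shots; the face box, the use of CIELAB lightness, and the OpenCV-based conversion are illustrative assumptions, not the exact DXOMARK measurement.

```python
# Hedged sketch: mean face lightness per shot, plus its spread across a burst.
import numpy as np
import cv2

def face_lightness(image_bgr, face_box):
    x, y, w, h = face_box                      # assumed face region in pixels
    face = image_bgr[y:y + h, x:x + w]
    lab = cv2.cvtColor(face, cv2.COLOR_BGR2LAB)
    return float(lab[:, :, 0].mean()) * 100.0 / 255.0   # L* rescaled to 0-100

def exposure_repeatability(images_bgr, face_box):
    values = [face_lightness(img, face_box) for img in images_bgr]
    # A stable camera shows a small spread across consecutive shots.
    return {"mean_L": float(np.mean(values)), "spread_L": float(np.std(values))}
```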

Focus

Huawei Mate 50 Pro: 100 (Best)

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Huawei Mate 50 Pro - group selfie
Huawei Mate 50 Pro - wide depth of field, background subjects in focus
Apple iPhone 14 Pro - group selfie
Apple iPhone 14 Pro - limited depth of field, background out of focus
Google Pixel 7 Pro - group selfie
Google Pixel 7 Pro - limited depth of field, background subjects out of focus

Texture

Huawei Mate 50 Pro: 75 (Best: Asus ZenFone 7 Pro, 79)

Texture tests analyze the level of detail and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of detail in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
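For readers unfamiliar with acutance, it can be thought of as a contrast-sensitivity-weighted average of the measured texture MTF. The sketch below assumes the Dead Leaves MTF has already been extracted; the CSF model and the frequency range are illustrative assumptions, not the exact DXOMARK pipeline.

```python
# Minimal acutance-style metric: weight the measured MTF by a contrast
# sensitivity function (CSF) and normalize. Higher = more perceived detail.
import numpy as np

def csf(f_cpd):
    # Mannos-Sakrison CSF approximation, frequency in cycles per degree
    return 2.6 * (0.0192 + 0.114 * f_cpd) * np.exp(-(0.114 * f_cpd) ** 1.1)

def acutance(freqs_cpd, mtf):
    weights = csf(freqs_cpd)
    return np.trapz(mtf * weights, freqs_cpd) / np.trapz(weights, freqs_cpd)

# Example: a camera whose texture MTF falls off linearly with frequency
f = np.linspace(0.1, 30.0, 300)
mtf = np.clip(1.0 - f / 30.0, 0.0, None)
print(round(float(acutance(f, mtf)), 3))
```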

Noise

Huawei Mate 50 Pro: 94 (Best)

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also in dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, along with standardized measurements such as Visual Noise derived from ISO 15739.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
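As an illustration of what a visual noise figure measures, the sketch below combines per-channel standard deviations of uniform chart patches in CIELAB with fixed weights and averages the result over all patches; the weights are assumptions, and the frequency filtering defined by ISO 15739 is omitted.

```python
# Hedged sketch of an ISO 15739-style visual noise figure (simplified:
# no CSF filtering). Lower values mean less visible noise.
import numpy as np
import cv2

def patch_visual_noise(patch_bgr, weights=(1.0, 0.85, 0.30)):
    lab = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    lab[:, :, 0] *= 100.0 / 255.0                 # L* back to its 0-100 range
    lab[:, :, 1:] -= 128.0                        # center a* and b* on zero
    std_L, std_a, std_b = lab.reshape(-1, 3).std(axis=0)
    w_L, w_a, w_b = weights                       # illustrative channel weights
    return float(w_L * std_L + w_a * std_a + w_b * std_b)

def mean_visual_noise(patches_bgr):
    return float(np.mean([patch_visual_noise(p) for p in patches_bgr]))
```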

Artifacts

Huawei Mate 50 Pro: 80 (Best: Google Pixel 7 Pro, 89)

The artifacts evaluation looks at lens shading, chromatic aberrations, and distortion measurements on the Dot chart, as well as MTF and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face, among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Main photo artifacts penalties

Bokeh

Huawei Mate 50 Pro: 65 (Best: Apple iPhone 14 Pro, 80)

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

Slightly inaccurate subject isolation, no blur gradient

Video

Huawei Mate 50 Pro: 143 (Best: Apple iPhone 14 Pro, 154)
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Huawei Mate 50 Pro Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

In video mode, the Mate 50 Pro front camera shone especially in terms of exposure, dynamic range and white balance. Overall, the device delivered a pleasant video experience but left some opportunities for improvements. Contrast was sometimes not perfectly adjusted and our testers noticed frequent exposure instabilities, especially in strongly backlit scenes. In such difficult conditions, we also noticed slightly inaccurate skin tone rendering. The level of captured detail was high on faces, and noise was kept well under control both in bright light and under indoor conditions. However, noise was quite noticeable in low light. Video stabilization was quite effective, but frame shifts were often visible when panning the camera.

Exposure

Huawei Mate 50 Pro: 82 (Best: Apple iPhone 14 Pro, 86)

Color

Huawei Mate 50 Pro: 85 (Best: Apple iPhone 14 Pro, 90)

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaption of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaption when light is changing.

 

Huawei Mate 50 Pro – wide dynamic range, exposure instabilities

Apple iPhone 14 Pro – wide dynamic range, nice contrast, stable exposure

Huawei P50 Pro – limited dynamic range, slightly inaccurate color

Texture

Huawei Mate 50 Pro: 75 (Best: Asus ZenFone 6, 97)

Texture tests analyze the level of detail and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

Huawei Mate 50 Pro: 72 (Best: Xiaomi Mi 11 Ultra, 83)

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.
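As a rough illustration, temporal noise can be estimated as the per-pixel standard deviation of luminance across consecutive frames of a static chart, averaged over the frame; the sketch below omits frame alignment and any perceptual weighting.

```python
# Hedged sketch: temporal noise of a static scene from a short run of frames.
import numpy as np

def temporal_noise(frames_gray):
    # frames_gray: sequence of HxW luminance frames of the same static chart
    stack = np.stack([np.asarray(f, dtype=np.float32) for f in frames_gray])
    per_pixel_std = stack.std(axis=0)      # fluctuation of each pixel over time
    return float(per_pixel_std.mean())     # lower = less temporal noise
```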

Stabilization

Huawei Mate 50 Pro: 77 (Best: Apple iPhone 14 Pro, 82)

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, smoothness, and jello artifacts during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

Huawei Mate 50 Pro – effective compensation of walking motion, frequent frame shift

Apple iPhone 14 Pro – effective compensation of walking motion

Huawei P50 Pro – effective compensation of walking motion
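To give a sense of how residual motion can be quantified, the sketch below estimates the global shift between consecutive frames with phase correlation and reports the average displacement; rotation, rolling-shutter ("jello") effects, and face-specific motion are ignored in this simplified illustration.

```python
# Hedged sketch: mean residual frame-to-frame shift after stabilization.
import numpy as np
import cv2

def residual_motion(frames_gray):
    shifts = []
    prev = np.float32(frames_gray[0])
    for frame in frames_gray[1:]:
        cur = np.float32(frame)
        (dx, dy), _response = cv2.phaseCorrelate(prev, cur)  # global translation
        shifts.append(float(np.hypot(dx, dy)))
        prev = cur
    return float(np.mean(shifts))          # lower = steadier footage
```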

Artifacts

Huawei Mate 50 Pro: 74 (Best: Apple iPhone 12 mini, 92)

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts, among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below.

Main video artifacts penalties

Google Pixel 7 Pro Selfie test

We put the Google Pixel 7 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 10.8MP sensor with 1.22μm pixels
  • f/2.2 aperture lens
  • 92.8º Field of view
  • Fixed focus
  • 4K video at 30/60fps (4K at 30fps tested)

Scoring

Sub-scores and attributes included in the calculations of the global score.


Google Pixel 7 Pro
DXOMARK Selfie score: 142

Photo score: 140 (best: 147)
  • Exposure: 92 (Best)
  • Color: 105 (Best)
  • Focus: 89 (best: 100)
  • Texture: 60 (best: 79)
  • Noise: 81 (best: 94)
  • Artifacts: 89 (Best)
  • Flash: 88 (best: 93)
  • Bokeh: 70 (best: 80)

Video score: 146 (best: 154)
  • Exposure: 81 (best: 86)
  • Color: 87 (best: 90)
  • Focus: 86 (best: 92)
  • Texture: 76 (best: 97)
  • Noise: 67 (best: 83)
  • Artifacts: 88 (best: 92)
  • Stabilization: 82 (Best)

Pros

  • Natural skin tones and nice white balance, even in difficult conditions
  • Generally accurate target exposure and wide dynamic range
  • Effective video stabilization
  • Fairly wide depth of field
  • Fairly low noise in bright light and indoor conditions

Cons

  • Slight loss of fine detail
  • Out-of-focus faces at close shooting distance
  • Image noise in low light

With a DXOMARK Selfie score of 142, the Google Pixel 7 Pro achieves a spot among the best in our front camera ranking. On the Pixel 7 Pro, Google uses the new second generation of its in-house Tensor chipset and a Samsung image sensor instead of the Sony units in previous models. However, the new sensor is very close to the old one in terms of size and pixel count. The rest of the front camera specification, including focal length and focus point, remains pretty much unchanged as well, but despite the very similar front camera hardware, the new model offers a slightly improved overall performance when compared to last year’s Pixel 6 Pro, thanks to better software and tuning.

Like previous Pixel devices, the 7 Pro performs particularly well in terms of skin tone rendering. Google’s “Real Tone” rendering is capable of producing natural skin tones in still images and videos across all skin types, including dark skin tones, which most other devices struggle with. The Pixel 7 Pro photos and video clips also show good exposure and a wide dynamic range, capturing good detail from the brightest to the darkest parts of the image. The camera also keeps unwanted image artifacts very well under control and is capable of creating a natural-looking bokeh effect in portrait mode.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Google Pixel 7 Pro – overall accurate exposure and color
Google Pixel 7 Pro Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Photo

Google Pixel 7 Pro: 140 (Best: Huawei Mate 50 Pro, 147)

Google Pixel 7 Pro Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

The Pixel 7 Pro front camera really shines for color. Skin tone rendering is nice across different skin types, and white balance is natural and stable. The Google device also does very well for exposure. Target exposure tends to be accurate, with a wide dynamic range, but occasionally some instabilities can be noticeable. Our testers also noted a small number of underexposures among our thousands of test shots. Depth of field is pretty wide, providing good sharpness on subjects in almost all focus planes. Only faces very close to the lens (30cm or less) can be out of focus. Image noise is mostly well under control, but we observed some loss of fine detail. Image artifacts are well under control as well.

Exposure

Google Pixel 7 Pro: 92 (Best)

Exposure is one of the key attributes for technically good pictures. The main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.

Target exposure is generally accurate, and the camera offers a wide dynamic range. However, slight tone mapping instabilities are noticeable on occasion.

Google Pixel 7 Pro – accurate face exposure, wide dynamic range
Apple iPhone 13 Pro – accurate face exposure, wide dynamic range
Huawei P50 Pro – accurate face exposure but limited dynamic range on background

In high-contrast scenes like this backlit selfie shot, target exposure can be slightly low.

Google Pixel 7 Pro – occasionally low target exposure in high-contrast scenes

Color

Google Pixel 7 Pro: 105 (Best)

Color is one of the key attributes for technically good pictures. The image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Color is a strong point for the Google Pixel 7 Pro. It offers nice white balance and skin tones in most test conditions — across all types of skin tones. Color is also stable across a series of shots.

Google Pixel 7 Pro – accurate skin tones, natural white balance
Apple iPhone 13 Pro – accurate skin tones but white balance cast
Huawei P50 Pro – acceptable skin tones but slight desaturation

Even in challenging conditions, like the scene below, with an almost monochrome background, in low light and with high contrasts, the Pixel 7 Pro is capable of delivering accurate color. Skin tones on the  Pixel 7 Pro look natural. The iPhone, on the other hand, shows a warm cast that affects skin tone rendering. Skin tones on the Huawei are acceptable but look paler and less pleasant than on the comparison devices.

Google Pixel 7 Pro – natural skin tones
Apple iPhone 13 Pro – inaccurate skin tones due to strong white balance cast
Huawei P50 Pro – acceptable skin tones

Focus

Google Pixel 7 Pro: 89 (Best: Huawei Mate 50 Pro, 100)

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

Depth of field is as wide as on previous Google devices, providing decent sharpness across several focus planes.

Google Pixel 7 Pro
Google Pixel 7 Pro - fairly wide depth of field
Apple iPhone 13 Pro
Apple iPhone 13 Pro - fairly wide depth of field
Huawei P50 Pro
Huawei P50 Pro - slightly wider depth of field than comparison devices

However, in close-up shots (30cm or less) the subject’s face is out of focus.

Google Pixel 7 Pro – background is sharper than face in this close-up selfie

Texture

Google Pixel 7 Pro: 60 (Best: Asus ZenFone 7 Pro, 79)

Texture tests analyze the level of detail and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of detail in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

The Pixel 7 Pro produces acceptable texture in most test conditions, both in lab measurements and real-life scenes. However, compared to the iPhone 13 Pro and Huawei P50 Pro, some fine detail is lost.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
Google Pixel 7 Pro, detail
Google Pixel 7 Pro, fairly good detail but loss of fine detail
Google Pixel 6 Pro, detail
Google Pixel 6 Pro, slightly lower level of detail than comparison devices
Samsung Galaxy S22 Ultra (Exynos), detail
Samsung Galaxy S22 Ultra (Exynos), good detail

Noise

Google Pixel 7 Pro: 81 (Best: Huawei Mate 50 Pro, 94)

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also in dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, along with standardized measurements such as Visual Noise derived from ISO 15739.

Image noise is generally well under control in outdoor and indoor lighting. In low light, it becomes a little more intrusive. The same is true for the shadow areas in high-contrast scenes.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.

 

Google Pixel 7 Pro, noise
Google Pixel 7 Pro, noise overall well controlled, but some luminance noise in the shadows
Apple iPhone 13 Pro, noise
Apple iPhone 13 Pro, noise in all areas of the image
Huawei P50 Pro, noise
Huawei P50 Pro, noise well under control

 

Artifacts

Google Pixel 7 Pro: 89 (Best)

The artifacts evaluation looks at lens shading, chromatic aberrations, and distortion measurements on the Dot chart, as well as MTF and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face, among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Artifacts are overall well controlled on the Pixel 7 Pro front camera. Our testers only observed a few color quantization artifacts, especially in low-light images.

Main photo artifacts penalties

Bokeh

Google Pixel 7 Pro: 70 (Best: Apple iPhone 14 Pro, 80)

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

Compared to its predecessor, the Pixel 6 Pro, selfie bokeh is a major improvement, thanks to a new blur gradient effect that helps make the final result look more realistic. As a result, the bokeh score has increased from 65 to 70.

Google Pixel 7 Pro – accurate blur gradient
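To illustrate what a blur gradient means in practice, the sketch below blends a sharp and a blurred version of the frame per pixel, with the blur weight growing with distance from the focus plane. Real bokeh pipelines use variable-radius blur and finer depth estimation, so this is only a crude approximation under assumed inputs (a normalized depth map).

```python
# Hedged sketch: depth-dependent blur ("blur gradient") as a per-pixel mix
# between a sharp and a strongly blurred image.
import numpy as np
import cv2

def blur_gradient(image_bgr, depth_map, focus_depth=0.0, max_sigma=8.0):
    # depth_map: float array in [0, 1], same height/width as the image;
    # focus_depth: depth value that should stay perfectly sharp.
    img = image_bgr.astype(np.float32)
    blurred = cv2.GaussianBlur(img, (0, 0), max_sigma)
    alpha = np.clip(np.abs(depth_map - focus_depth), 0.0, 1.0)[..., None]
    return (img * (1.0 - alpha) + blurred * alpha).astype(np.uint8)
```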

However, the Pixel 7 Pro is not quite at the level of the iPhone 13 Pro in terms of selfie bokeh. There is no difference in blur intensity between elements in the scene that are close to and far away from the lens. With objects in the scene at the same distance from the camera as the subject, depth estimation is also less accurate.

Google Pixel 7 Pro – inaccurate depth estimation with objects at same shooting distance as subject
Apple iPhone 13 Pro – accurate depth estimation
Huawei P50 Pro – inaccurate depth estimation with objects at same shooting distance as subject

Video

Google Pixel 7 Pro: 146 (Best: Apple iPhone 14 Pro, 154)
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Google Pixel 7 Pro Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

Like for still images, in video mode, the Pixel 7 Pro performs particularly well for exposure and color. Skin tones are rendered nicely and the camera produces good exposures with a wide dynamic range. We observed high levels of temporal noise but fine detail is well preserved in Pixel 7 Pro video clips. Our testers also noticed some oversharpening, resulting in unnatural texture rendering. Video stabilization is effective at counteracting camera motion but sharpness differences between frames are visible when walking while recording video.

Exposure

Google Pixel 7 Pro: 81 (Best: Apple iPhone 14 Pro, 86)

Exposure tests evaluate the brightness of the face and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaption of the exposure are also analyzed.

In video mode, the Pixel 7 Pro produces accurate exposure in most test conditions and a wide dynamic range ensures good highlight and shadow detail. Exposure is also very consistent, with hardly any instabilities.

Google Pixel 7 Pro – accurate target exposure and wide dynamic range

Apple iPhone 13 Pro – accurate target exposure and wide dynamic range

Huawei P50 Pro – accurate target exposure but slightly more limited dynamic range

Color

Google Pixel 7 Pro: 87 (Best: Apple iPhone 14 Pro, 90)

Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaption when light is changing.

Pixel 7 Pro selfie videos show natural skin tones and accurate white balance in most bright light and indoor scenes, even in difficult backlit scenes.

Google Pixel 7 Pro – mostly nice skin tones

Apple iPhone 13 Pro – mostly nice skin tones

Huawei P50 Pro – acceptable, but slightly washed out, skin tones

Focus

Google Pixel 7 Pro: 86 (Best: Huawei Mate 40 Pro, 92)

A fairly wide depth of field ensures good sharpness across all faces in group shots.

Google Pixel 7 Pro – fairly wide depth of field

Apple iPhone 13 Pro – wide depth of field

Huawei P50 Pro – wide depth of field

Texture

Google Pixel 7 Pro: 76 (Best: Asus ZenFone 6, 97)

Texture tests analyze the level of detail and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Our lab measurements show high levels of detail for the Pixel 7 Pro in all conditions. However, our testers occasionally also observed some unnatural texture rendering.

Google Pixel 7 Pro – high level of detail

Apple iPhone 13 Pro – high level of detail

Huawei P50 Pro – detail is well preserved
Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Noise

Google Pixel 7 Pro: 67 (Best: Xiaomi Mi 11 Ultra, 83)

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

Temporal noise is often noticeable in the Pixel 7 Pro selfie clips, but noise levels have decreased when compared to last year’s Pixel 6 Pro, resulting in an improved texture/noise trade-off.

Google Pixel 7 Pro – temporal noise

Apple iPhone 13 Pro – temporal and spatial noise

Huawei P50 Pro – noise well under control
Spatial visual noise evolution with the illuminance level
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
Temporal visual noise evolution with the illuminance level
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Stabilization

Google Pixel 7 Pro: 82 (Best)

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, smoothness, and jello artifacts during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

In terms of video stabilization, the Google Pixel 7 Pro uses a similar approach to the Pixel 6 Pro. Google’s Steadiface algorithm stabilizes the background of the clip, whereas some other devices stabilize the subject instead.

Google Pixel 7 Pro – background well stabilized, but sharpness differences between frames

Apple iPhone 13 Pro – more noticeable motion

Huawei P50 Pro – slightly more noticeable motion

Artifacts

Google Pixel 7 Pro: 88 (Best: Apple iPhone 12 mini, 92)

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts, among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below.

Like for stills, the Pixel 7 Pro controls unwanted artifacts well in video mode. However, some penalty points were applied for frame rate changes, ringing and color quantization effects.
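As an illustration of the frame-rate side of this check, the sketch below flags frames whose instantaneous frame rate deviates from the nominal value, given per-frame capture timestamps (for example, read back from the LED timer in the recording); the 10% tolerance is an assumption, not a DXOMARK threshold.

```python
# Hedged sketch: flag frame-rate irregularities from per-frame timestamps.
def frame_rate_report(timestamps_s, nominal_fps=30.0, tolerance=0.10):
    issues = []
    for i in range(1, len(timestamps_s)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        fps = 1.0 / dt if dt > 0 else float("inf")
        if abs(fps - nominal_fps) > nominal_fps * tolerance:
            issues.append((i, round(fps, 2)))   # (frame index, measured fps)
    return issues

# A duplicated frame shows up as a doubled interval (15 fps locally):
print(frame_rate_report([0.0, 1/30, 2/30, 4/30, 5/30]))   # -> [(3, 15.0)]
```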

Main video artifacts penalties

Apple iPhone 14 Pro Selfie test

We put the Apple iPhone 14 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 12MP sensor
  • f/1.9-aperture lens
  • Autofocus
  • 4K video at 24/25/30/60 fps, 1080p at 25/30/60 fps (4K at 30 fps tested)

Scoring

Sub-scores and attributes included in the calculations of the global score.


Apple iPhone 14 Pro
DXOMARK Selfie score: 145

Photo score: 139 (best: 147)
  • Exposure: 92 (Best)
  • Color: 90 (best: 105)
  • Focus: 96 (best: 100)
  • Texture: 76 (best: 79)
  • Noise: 69 (best: 94)
  • Artifacts: 84 (best: 89)
  • Flash: 77 (best: 93)
  • Bokeh: 80 (Best)

Video score: 154 (Best)
  • Exposure: 86 (Best)
  • Color: 90 (Best)
  • Focus: 91 (best: 92)
  • Texture: 83 (best: 97)
  • Noise: 69 (best: 83)
  • Artifacts: 88 (best: 92)
  • Stabilization: 82 (Best)

Pros

  • Accurate target exposure for stills and video
  • Accurate autofocus and wide depth of field
  • High level of detail
  • Pleasant white balance and nice skin tones in bright light and indoor conditions
  • Natural foreground blur and well-rendered spotlights in bokeh shots

Cons

  • Noise in photo and video
  • Sharpness differences between frames often visible in walking videos
  • Occasionally inaccurate skin tones in challenging conditions, such as low light or high dynamic range scenes

With a DXOMARK Selfie score of 145, the Apple iPhone 14 Pro achieves a new top score in our front camera ranking, surpassing the Huawei P50 Pro by one point. This latest version of Apple’s TrueDepth camera now comes with an autofocus system and a faster aperture (f/1.9 compared to f/2.2 on the previous models). These hardware modifications have contributed to a drastic improvement of various sub-scores in both Photo and Video.

Video performance is particularly impressive. With a Video score of 154, the iPhone 14 Pro leads the front camera video ranking, thanks to new highs for Exposure, Color, and Stabilization. With a Photo score of 139, the 14 Pro doesn’t quite make it to the top spot here, but an increase of 11 points over the previous generation means a massive improvement, and the device delivers the best performance to date for Exposure and Bokeh. Overall, the Apple iPhone 14 Pro is an easy choice for any smartphone user who likes to produce image and video content using the front camera.

Apple iPhone 14 Pro – excellent exposure and nice skin tones

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

Apple iPhone 14 Pro Selfie Scores vs Ultra-Premium
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Photo

Apple iPhone 14 Pro: 139 (Best: Huawei Mate 50 Pro, 147)

Apple iPhone 14 Pro Photo scores vs Ultra-Premium
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

For Photo, the iPhone 14 Pro has improved in all attributes compared to the iPhone 13 Pro. Results are better in both the lab and real-life scenes. It looks like the changes to the camera hardware of the new model have helped tackle some of the issues we saw on previous iPhone front cameras. For example, the addition of an autofocus system makes a big difference in terms of focus accuracy and depth of field. But the 14 Pro front camera’s still images have a lot more to offer, including accurate exposure, excellent detail, as well as nice colors in general and good skin tones. The simulated bokeh effect in portrait mode looks very natural, too. On the downside, some image noise is noticeable in most conditions, and in low light or high-contrast scenes, skin tone rendition can become less accurate.

Exposure

Apple iPhone 14 Pro: 92 (Best)

Exposure is one of the key attributes for technically good pictures. The main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, e.g., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.

The iPhone 14 Pro’s front camera produces accurate exposures with a pretty wide dynamic range and nice contrast.

Apple iPhone 14 Pro – accurate face and background exposure, pleasant contrast
Google Pixel 6 Pro – accurate face exposure, slightly underexposed background
Samsung Galaxy S22 Ultra (Exynos) – accurate face exposure, very slightly underexposed background

Highlight clipping still occurs in very challenging scenes, such as this backlit selfie shot.

Apple iPhone 14 Pro – highlight clipping in challenging conditions

Color

Apple iPhone 14 Pro: 90 (Best: Google Pixel 7 Pro, 105)

Color is one of the key attributes for technically good pictures. The image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

White balance and rendering of different types of skin tones are generally nice. In addition, white balance remains consistent across a series of consecutive shots in most test conditions.

Apple iPhone 14 Pro – nice rendering of a wide range of skin tones

On some occasions, our testers observed slightly inaccurate skin tones in backlit scenes and mixed lighting conditions. Sometimes there is also a warm color cast that has a negative impact on skin tone accuracy.

Apple iPhone 14 Pro – Slightly inaccurate skin tone rendering
Google Pixel 6 Pro – Slightly inaccurate skin tone rendering
Samsung Galaxy S22 Ultra (Exynos) – acceptable skin tone rendering
Apple iPhone 14 Pro – warm cast affects skin tone rendering
Google Pixel 6 Pro – natural skin tone rendering but slight cold cast
Samsung Galaxy S22 Ultra (Exynos) – accurate skin tone rendering and white balance

Focus

Apple iPhone 14 Pro: 96 (Best: Huawei Mate 50 Pro, 100)

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

The iPhone 14 series is the first iPhone generation to incorporate an autofocus system into the front camera. This allows the device to optimize the focus point for the scene and ensure good sharpness on all faces in group selfie shots. It also helps to increase background detail when shooting at longer subject distances, for example with a selfie stick.

Apple iPhone 14 Pro
Apple iPhone 14 Pro - Good depth of field
Google Pixel 6 Pro
Google Pixel 6 Pro - Depth of field is slightly narrower compared to reference devices
Samsung Galaxy S22 Ultra (Exynos)
Samsung Galaxy S22 Ultra (Exynos) - Depth of field is extended
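To illustrate why picking the focus distance matters for group shots, here is a worked thin-lens depth-of-field example for a fast front camera; the focal length and circle of confusion are assumptions for a typical small-sensor selfie module, not Apple's published optical specifications.

```python
# Hedged sketch: near/far limits of acceptable sharpness (thin-lens model).
def dof_limits(focus_m, focal_mm=2.7, f_number=1.9, coc_mm=0.002):
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm     # in mm
    s = focus_m * 1000.0
    near = hyperfocal * s / (hyperfocal + (s - focal_mm))
    far = hyperfocal * s / (hyperfocal - (s - focal_mm)) if s < hyperfocal else float("inf")
    return near / 1000.0, far / 1000.0                              # in metres

for focus in (0.3, 0.6, 1.2):
    near, far = dof_limits(focus)
    far_txt = "infinity" if far == float("inf") else f"{far:.2f} m"
    print(f"focused at {focus:.1f} m -> sharp from {near:.2f} m to {far_txt}")
```

Pushing the focus point farther back (as an autofocus system can do when it detects several faces) extends the far limit of sharpness considerably, which is consistent with the wider usable depth of field observed in the group selfie samples above.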

Texture

Apple iPhone 14 Pro: 76 (Best: Asus ZenFone 7 Pro, 79)

Texture tests analyze the level of detail and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of detail in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

The iPhone 14 front camera captures high texture levels in most test conditions. This can be seen in objective lab measurements as well as in real-life scenes.

Texture acutance evolution with the illuminance level
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
Apple iPhone 14 Pro - detail
Apple iPhone 14 Pro - fairly high level of detail
Google Pixel 6 Pro - detail
Google Pixel 6 Pro - slight loss of detail compared to competitors
Samsung Galaxy S22 Ultra (Exynos) - detail
Samsung Galaxy S22 Ultra (Exynos) - good detail

Noise

Apple iPhone 14 Pro: 69 (Best: Huawei Mate 50 Pro, 94)

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also in dark areas and in high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, along with standardized measurements such as Visual Noise derived from ISO 15739.

Visual noise evolution with illuminance levels in handheld condition
This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the Close-up Dead Leaves setup. DXOMARK visual noise measurement is derived from ISO15739 standard.

Image noise can be observed in most test conditions, as illustrated below. The Google Pixel 6 Pro does a better job at noise reduction. The Samsung Galaxy S22 Ultra (Exynos) tends to display the highest noise levels among the comparison devices.

Apple iPhone 14 Pro - noise
Apple iPhone 14 Pro - luminance noise
Google Pixel 6 Pro - noise
Google Pixel 6 Pro - noise well under control
Samsung Galaxy S22 Ultra (Exynos) - noise
Samsung Galaxy S22 Ultra (Exynos) - slightly coarse and chromatic noise

Artifacts

Apple iPhone 14 Pro: 84 (Best: Google Pixel 7 Pro, 89)

The artifacts evaluation looks at lens shading, chromatic aberrations, and distortion measurements on the Dot chart, as well as MTF and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face, among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

Main photo artifacts penalties

Some image artifacts, such as halo, hue shift, and ringing, can be found in the 14 Pro’s front camera images, especially when capturing challenging HDR scenes.

Apple iPhone 14 Pro – halo and ringing artifacts, slight hue shift

Bokeh

Apple iPhone 14 Pro: 80 (Best)

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

Selfie bokeh simulation was already a strong point of the previous iPhone generation, but the 14 Pro takes things to the next level and is now the best device for selfie bokeh that we have tested to date. Blur gradient is smooth, depth estimation is accurate, and images show a natural bokeh shape. Thanks to the addition of foreground blur, bokeh rendering is now even more natural, earning the phone a top score of 80 for this test attribute.

Apple iPhone 14 Pro – accurate blur gradient, including on the foreground

Video

Apple iPhone 14 Pro: 154 (Best)
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

Apple iPhone 14 Pro Video scores vs Ultra-Premium
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

In video mode, exposure, color (white balance), texture, and autofocus are the iPhone 14 Pro’s main strengths. Combined, these strong points push front camera video quality to new heights, setting a new state of the art for selfie video and earning the device a top score. The video mode is not perfect, though, and there is still some room for improvement: noise is noticeable in most test conditions, and skin tones look a little off in backlit indoor scenes. Video stabilization is overall very effective, but some sharpness differences between frames are noticeable when walking while recording.

The iPhone 14 Pro front camera video mode was tested at 4K resolution, 30 frames per second, and with the Dolby Vision format activated.

[glossary_exclude]Exposure[/glossary_exclude]

86

Apple iPhone 14 Pro

Best

[glossary_exclude][/glossary_exclude]

Exposure tests evaluate the brightness of the face and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaption of the exposure are also analyzed.

Video exposure is accurate in most test conditions and dynamic range is pretty wide. As a bonus, we barely saw any exposure instabilities.

Apple iPhone 14 Pro – accurate face exposure throughout entire clip, wide dynamic range

Apple iPhone 13 Pro Max – accurate face exposure throughout entire clip, wide dynamic range

Google Pixel 6 Pro – accurate face exposure throughout entire clip, wide dynamic range

[glossary_exclude]Color[/glossary_exclude]

90

Apple iPhone 14 Pro

Best

[glossary_exclude][/glossary_exclude]

Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaption when light is changing.

The iPhone 14 Pro video generally comes with natural skin tone rendering and accurate white balance in most bright light and indoor scenes. However, skin tone accuracy can slightly suffer in high-contrast scenes.

Apple iPhone 14 Pro – nice skin tones

Apple iPhone 13 Pro Max – nice skin tones

Google Pixel 6 Pro – nice skin tones

[glossary_exclude]Focus[/glossary_exclude]

91

Apple iPhone 14 Pro

92

[glossary_exclude]Huawei Mate 40 Pro[/glossary_exclude]

The 14 Pro’s new autofocus system lifts iPhone video to a new level. For example, it adds the ability to optimize the focus point for the scene, practically widening the depth of field. This is useful in group selfies where ideally all subjects should have good sharpness, from the closest to those furthest away from the camera.

Apple iPhone 14 Pro – wide depth of field

Apple iPhone 13 Pro Max – slightly narrower depth of field

Google Pixel 6 Pro – wide depth of field

[glossary_exclude]Texture[/glossary_exclude]

83

Apple iPhone 14 Pro

97

[glossary_exclude]Asus ZenFone 6[/glossary_exclude]

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

As the chart shows, the iPhone 14 Pro front camera is capable of capturing high levels of detail in video at all tested light conditions.

[glossary_exclude]Texture acutance evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.
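Acutance-style metrics are commonly computed by weighting the measured texture MTF with a contrast sensitivity function (CSF) and integrating over spatial frequency. The sketch below shows that general idea using the Mannos-Sakrison CSF approximation; the exact CSF model and viewing conditions used in the protocol may differ, and the MTF values here are synthetic, not measured data.

import numpy as np

def csf(f_cpd):
    # Mannos-Sakrison contrast sensitivity approximation, f in cycles/degree.
    return 2.6 * (0.0192 + 0.114 * f_cpd) * np.exp(-(0.114 * f_cpd) ** 1.1)

def integrate(y, x):
    # Plain trapezoidal integration.
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def acutance(freqs_cpd, mtf):
    # CSF-weighted integral of the texture MTF, normalized so that a perfect
    # system (MTF = 1 at every frequency) would score 1.0.
    w = csf(freqs_cpd)
    return integrate(mtf * w, freqs_cpd) / integrate(w, freqs_cpd)

# Synthetic example: a gently rolling-off MTF measured on a Dead Leaves target.
f = np.linspace(0.1, 30, 300)   # spatial frequency in cycles/degree
mtf = np.exp(-f / 15)           # toy MTF, not measured data
print(round(acutance(f, mtf), 3))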

[glossary_exclude]Noise[/glossary_exclude]

69

Apple iPhone 14 Pro

83

[glossary_exclude]Xiaomi Mi 11 Ultra[/glossary_exclude]

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

Noise is sometimes visible under indoor and low light conditions in 14 Pro front camera video clips, especially in the corners of the frame.

[glossary_exclude]Spatial visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
[glossary_exclude]Temporal visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.
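In practice, spatial noise describes pixel-to-pixel variation within a frame, while temporal noise describes frame-to-frame variation of each pixel. The minimal sketch below computes both as plain standard deviations on a stack of luminance crops of a uniform patch; the protocol's metrics additionally apply perceptual weighting, so treat this only as an illustration of the distinction.

import numpy as np

def spatial_and_temporal_noise(frames):
    # frames: (T, H, W) stack of luminance crops of a uniform chart patch.
    # Spatial noise: variation within each frame, averaged over frames.
    # Temporal noise: variation of each pixel across frames, averaged over the patch.
    frames = np.asarray(frames, dtype=np.float64)
    spatial = float(np.mean(frames.std(axis=(1, 2))))
    temporal = float(np.mean(frames.std(axis=0)))
    return spatial, temporal

# Synthetic clip: a flat 120x120 patch over 30 frames with noise of sigma = 1.5.
rng = np.random.default_rng(1)
clip = 118.0 + 1.5 * rng.standard_normal((30, 120, 120))
print([round(v, 2) for v in spatial_and_temporal_noise(clip)])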

[glossary_exclude]Stabilization[/glossary_exclude]

82

Apple iPhone 14 Pro

Best

[glossary_exclude][/glossary_exclude]

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any others means. The evaluation looks at overall residual motion on the face and the background, smoothness and jello artifacts, during walk and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.
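One simple (and simplified) way to put a number on residual motion, illustrated below, is to estimate the global shift between consecutive frames with phase correlation and average its magnitude over the clip. This is not the protocol's actual measurement, and the file name is a placeholder.

import cv2
import numpy as np

def residual_motion(video_path):
    # Illustrative metric: average frame-to-frame global shift (in pixels)
    # estimated by phase correlation; lower values mean a steadier background.
    cap = cv2.VideoCapture(video_path)
    prev, shifts = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if prev is not None:
            (dx, dy), _ = cv2.phaseCorrelate(prev, gray)
            shifts.append(np.hypot(dx, dy))
        prev = gray
    cap.release()
    return float(np.mean(shifts)) if shifts else 0.0

# Example call with a hypothetical walking-selfie clip:
# print(residual_motion("walking_selfie.mp4"))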

For the 14 Pro, Apple has changed its approach to video stabilization. On the iPhone 13 Pro Max, the subject’s face was stabilized. On the new model, the background is stabilized instead. Both are valid methods. Generally, the 14 Pro compensates well for motion, even when walking while recording. However, some sharpness differences between frames are noticeable.

Apple iPhone 14 Pro – background stabilized

Apple iPhone 13 Pro Max – noticeable motion, especially on the background

Google Pixel 6 Pro – background and hand shake stabilized

[glossary_exclude]Artifacts[/glossary_exclude]

88

Apple iPhone 14 Pro

92

[glossary_exclude]Apple iPhone 12 mini[/glossary_exclude]

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below.

[glossary_exclude]Main video artifacts penalties[/glossary_exclude]

The post Apple iPhone 14 Pro Selfie test appeared first on DXOMARK.

Huawei P50 Pro Selfie test https://www.dxomark.com/huawei-p50-pro-selfie-test-retested/ https://www.dxomark.com/huawei-p50-pro-selfie-test-retested/#respond Tue, 20 Sep 2022 16:59:23 +0000 https://www.dxomark.com/?p=124269&preview=true&preview_id=124269 We put the Huawei P50 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing [...]

We put the Huawei P50 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 13 MP 1/2.8″ sensor with 1.22µm pixel size
  • 18 mm-equivalent lens with f/2.4 aperture
  • PDAF, EIS
  • 4K video, 2160p/60 fps, (2160p/30 fps tested)

Scoring

Sub-scores and attributes included in the calculations of the global score.


Huawei P50 Pro
DXOMARK Selfie score: 144

Photo: 144
  • Exposure: 90 (best: 92)
  • Color: 99 (best: 105)
  • Focus: 97 (best: 100)
  • Texture: 75 (best: 79)
  • Noise: 89 (best: 94)
  • Artifacts: 77 (best: 89)
  • Flash: 90 (best: 93)
  • Bokeh: 70 (best: 80)

Video: 146
  • Exposure: 79 (best: 86)
  • Color: 85 (best: 90)
  • Focus: 90 (best: 92)
  • Texture: 74 (best: 97)
  • Noise: 70 (best: 83)
  • Artifacts: 87 (best: 92)
  • Stabilization: 81 (best: 82)

Pros

  • Good face exposure in photos and videos
  • Wide dynamic range in photos
  • Accurate white balance in photos
  • Wide focus range
  • Low noise levels in photos
  • Good detail in indoor and daylight photos
  • Effective video stabilization
  • Good focus for video group shots indoors and in daylight

Cons

  • Loss of fine detail in low-light photos
  • Occasional autofocus failures in photo mode
  • Occasionally inaccurate skin tone rendering in photos
  • Limited dynamic range in low-light videos
  • Noise on faces in indoor and low-light videos

The Huawei P50 Pro beats its stablemate, the Mate 40 Pro, and catapults to the top of the DXOMARK front camera ranking.

Thanks to outstanding performance in pretty much all test areas, the P50 Pro front camera is the best or close to the best for all Photo sub-attributes except artifacts, where its score is slightly lower than some competitors due to color quantization, which appears especially frequently in low light.

For video, the P50 Pro matches the performance of the Mate 40 Pro and the Asus Zenfone 7 Pro for the top spot in the category. The P50 Pro is an excellent choice overall for shooting selfie videos, with outstanding results in most areas, including focus and color. However, the texture/noise trade-off is not quite up there with the very best.

[glossary_exclude]Huawei P50 Pro Selfie Scores vs Ultra-Premium[/glossary_exclude]
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us for details on how to receive a full report.

[glossary_exclude]Photo[/glossary_exclude]

144

Huawei P50 Pro

147

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]
[glossary_exclude]Huawei P50 Pro Photo scores vs Ultra-Premium[/glossary_exclude]
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

[glossary_exclude]Exposure[/glossary_exclude]

90

Huawei P50 Pro

92

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

99

Huawei P50 Pro

105

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.
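As a rough illustration of the face-exposure side of this evaluation, the sketch below computes the mean luma inside a face bounding box together with the fraction of clipped highlights and crushed shadows. These are simple proxies rather than the protocol's actual measurements, and the face box is assumed to come from any face detector.

import numpy as np

def exposure_checks(image_rgb, face_box):
    # face_box = (x, y, w, h), assumed to come from a face detector.
    # Face target exposure is approximated by the mean Rec. 709 luma inside
    # the box; clipped/crushed ratios hint at how the dynamic range is used.
    img = np.asarray(image_rgb, dtype=np.float64)
    luma = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]
    x, y, w, h = face_box
    face_luma = float(luma[y:y + h, x:x + w].mean())
    clipped = float(np.mean(luma >= 250))   # near-white pixels
    crushed = float(np.mean(luma <= 5))     # near-black pixels
    return {"face_luma": face_luma, "clipped": clipped, "crushed": crushed}

# Usage with a synthetic 8-bit frame and a hypothetical face box.
frame = np.clip(np.random.default_rng(2).normal(140, 40, (720, 1280, 3)), 0, 255)
print(exposure_checks(frame, face_box=(500, 200, 280, 280)))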

This graph shows the Huawei P50 Pro’s exposure performance across different light levels.

Exposure comparison: The P50 Pro manages good exposure even in extremely low light of 1 lux.

These samples show the Huawei P50 Pro’s exposure performance in an outdoor scene.

Huawei P50 Pro, accurate target exposure, wide dynamic range
Huawei Mate 40 Pro, accurate target exposure, slightly less dynamic range
Asus Zenfone 7 Pro, limited dynamic range, highlight clipping

These samples show the Huawei P50 Pro’s color performance in daylight.

Huawei P50 Pro, accurate white balance and color rendering, neutral skin tones, among the very best for color
Huawei Mate 40 Pro, slightly inaccurate color rendering (reds), yellowish skin tones
Asus Zenfone 7 Pro, accurate white balance and color rendering, natural skin tones

[glossary_exclude]Focus[/glossary_exclude]

97

Huawei P50 Pro

100

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.
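The link between the focus distance and the depth of field discussed here can be made concrete with the standard thin-lens formulas. The sketch below uses illustrative small-sensor values (3 mm focal length, f/2.4, 4 µm circle of confusion), not the tested device's actual optics.

def dof_limits(focus_m, focal_mm=3.0, f_number=2.4, coc_mm=0.004):
    # Thin-lens depth-of-field limits for an assumed small-sensor selfie camera.
    f, s = focal_mm, focus_m * 1000.0
    hyperfocal = f * f / (f_number * coc_mm) + f          # in millimeters
    near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    far = float("inf") if s >= hyperfocal else s * (hyperfocal - f) / (hyperfocal - s)
    return near / 1000.0, far / 1000.0                    # in meters

# A focus distance set near the hyperfocal distance keeps everything from
# roughly half that distance to infinity acceptably sharp, which is the
# property that helps group selfies.
print(dof_limits(focus_m=0.95))   # about (0.47, inf) with these assumptions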

These samples show the Huawei P50 Pro’s focus performance in daylight. [glossary_exclude]

Huawei P50 Pro, depth of field
Huawei P50 Pro, crop: very accurate focus, wide depth of field, good detail on background
Huawei Mate 40 Pro, depth of field
Huawei Mate 40 Pro, crop: stable focus (fixed focus), lack of detail on background
Asus Zenfone 7 Pro, depth of field
Asus Zenfone 7 Pro, crop: stable autofocus, narrow depth of field and lack of detail on background
[/glossary_exclude]

[glossary_exclude]Texture[/glossary_exclude]

75

Huawei P50 Pro

79

[glossary_exclude]Asus ZenFone 7 Pro[/glossary_exclude]

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

[glossary_exclude]Texture acutance evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

This graph shows the Huawei P50 Pro’s texture performance in the lab across different light levels.

Texture comparison: Huawei P50 Pro shows a lot of detail in indoor and outdoor conditions but struggles slightly to maintain high levels of detail in low light.

These samples show the Huawei P50 Pro’s texture performance at a light level of 100 lux and a shooting distance of 55 cm.

Huawei P50 Pro, texture, 55 cm
Huawei P50 Pro, crop: good detail on the face
Huawei Mate 40 Pro, texture, 55 cm
Huawei Mate 40 Pro, crop: slightly better detail
Asus Zenfone 7 Pro, 55 cm
Asus Zenfone 7 Pro, crop: better detail than competitors

[glossary_exclude]Noise[/glossary_exclude]

89

Huawei P50 Pro

94

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart and the standardized measurement such as Visual Noise derived from ISO 15739.

This graph shows the Huawei P50 Pro’s noise performance in the lab across different light levels.

Visual noise is a metric that measures noise as perceived by end-users. It takes into account the [glossary_exclude]sensitivity[/glossary_exclude] of the human eye to different [glossary_exclude]spatial[/glossary_exclude] frequencies under different viewing conditions.
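A minimal way to picture this weighting, sketched below, is to attenuate a uniform patch's noise in the frequency domain with a contrast sensitivity function before measuring its standard deviation. The CSF model and the pixels-per-degree value (which stands in for the viewing conditions) are illustrative assumptions, not the protocol's.

import numpy as np

def csf(f_cpd):
    # Mannos-Sakrison contrast sensitivity approximation, f in cycles/degree.
    return 2.6 * (0.0192 + 0.114 * f_cpd) * np.exp(-(0.114 * f_cpd) ** 1.1)

def perceived_noise(patch, pixels_per_degree=60.0):
    # Weight the noise spectrum of a uniform patch by a normalized CSF so that
    # frequencies the eye is less sensitive to contribute less to the result.
    patch = np.asarray(patch, dtype=np.float64)
    patch = patch - patch.mean()
    fy = np.fft.fftfreq(patch.shape[0]) * pixels_per_degree   # cycles/degree
    fx = np.fft.fftfreq(patch.shape[1]) * pixels_per_degree
    radial = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    weight = csf(radial)
    weight /= weight.max()
    filtered = np.fft.ifft2(np.fft.fft2(patch) * weight).real
    return float(filtered.std())

# Fine-grained noise is attenuated more strongly than coarse noise.
rng = np.random.default_rng(3)
print(round(perceived_noise(rng.normal(0, 2, (128, 128))), 3))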

Noise comparison: Despite its fairly small sensor, the Huawei P50 Pro keeps noise levels very low, even in low light.

[glossary_exclude]Artifacts[/glossary_exclude]

77

Huawei P50 Pro

89

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

[glossary_exclude]Main photo artifacts penalties[/glossary_exclude]

This sample shows unnatural texture artifacts in low light.

Huawei P50 Pro, artifacts
Huawei P50 Pro, crop: unnatural texture rendering on skin and facial hair

[glossary_exclude]Bokeh[/glossary_exclude]

70

Huawei P50 Pro

80

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

These samples show the Huawei P50 Pro’s bokeh mode performance in an indoor scene.

Huawei P50 Pro, few depth estimation artifacts but no blur gradient
Huawei Mate 40 Pro, few depth estimation artifacts, blur gradient with slightly abrupt transitions
Asus Zenfone 7 Pro, few depth estimation artifacts, very natural blur gradient

[glossary_exclude]Video[/glossary_exclude]

146

Huawei P50 Pro

154

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

[glossary_exclude]Huawei P50 Pro Video scores vs Ultra-Premium[/glossary_exclude]

The Huawei P50 Pro achieves a Selfie Video score of 146. A device’s overall Video score is derived from its performance and results across a range of attributes in the same way as the Photo score. In this section, we take a closer look at these sub-scores and compare video image quality against competitors.

[glossary_exclude]Exposure[/glossary_exclude]

79

Huawei P50 Pro

86

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

85

Huawei P50 Pro

90

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Exposure tests evaluate the brightness of the face and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaption of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaption when light is changing.

These video stills show the Huawei P50 Pro’s video exposure performance in indoor light conditions.

Huawei P50 Pro, video still, good exposure and wide dynamic range even in this challenging backlit scene
Huawei Mate 40 Pro, video still, occasional underexposure in high-contrast scenes

These video stills show the Huawei P50 Pro’s video color in a lab scene.

Huawei P50 Pro, video still, fairly natural skin tones, but red tones can be slightly inaccurate
Huawei Mate 40 Pro, video still, fairly accurate color rendering and skin tones
Asus Zenfone 7 Pro, video still, accurate color rendering and skin tones

[glossary_exclude]Texture[/glossary_exclude]

74

Huawei P50 Pro

97

[glossary_exclude]Asus ZenFone 6[/glossary_exclude]

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

[glossary_exclude]Texture acutance evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Texture preservation in video is good, but slightly surpassed by the other devices in this comparison.

These video stills show the Huawei P50 Pro’s video texture performance in low light.

Huawei P50 Pro, video texture
Huawei P50 Pro, crop: not as much detail as on Asus Zenfone 7 Pro
Huawei Mate 40 Pro, video texture
Huawei Mate 40 Pro, crop: similar detail to P50 Pro
Asus Zenfone 7 Pro, video texture
Asus Zenfone 7 Pro, crop: better detail

[glossary_exclude]Noise[/glossary_exclude]

70

Huawei P50 Pro

83

[glossary_exclude]Xiaomi Mi 11 Ultra[/glossary_exclude]

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

[glossary_exclude]Spatial visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
[glossary_exclude]Temporal visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

These video stills show the Huawei P50 Pro’s video noise performance in indoor light conditions.

Huawei P50 Pro, video noise
Huawei P50 Pro, crop: noise is visible in indoor and low-light conditions
Huawei Mate 40 Pro, video noise
Huawei Mate 40 Pro, crop: very low noise
Asus Zenfone 7 Pro, video noise
Asus Zenfone 7 Pro, crop: well-controlled noise in most conditions

[glossary_exclude]Stabilization[/glossary_exclude]

81

Huawei P50 Pro

82

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any others means. The evaluation looks at overall residual motion on the face and the background, smoothness and jello artifacts, during walk and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

This sample clip shows the Huawei P50 Pro’s video stabilization in outdoor conditions.

Stabilization on Huawei P50 Pro is very efficient with almost no frame shifts and very little difference in sharpness between frames.

[glossary_exclude]Artifacts[/glossary_exclude]

87

Huawei P50 Pro

92

[glossary_exclude]Apple iPhone 12 mini[/glossary_exclude]

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below.

[glossary_exclude]Main video artifacts penalties[/glossary_exclude]

This graph shows the Huawei P50 Pro’s ringing video output. This curve displays the normalized edge profile of the maximum ringing in the field.

Ringing: the P50 Pro controls ringing noticeably better than the Mate 40 Pro and the Zenfone 7 Pro.
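Reading such an edge profile is straightforward: ringing shows up as overshoot above the bright plateau and undershoot below the dark plateau. The sketch below quantifies both on a synthetic, normalized edge; it illustrates how the curve can be interpreted, not the measurement pipeline itself.

import numpy as np

def ringing_amount(edge_profile):
    # edge_profile: normalized edge (dark level = 0, bright level = 1).
    # Returns (overshoot, undershoot) as fractions of the edge height.
    p = np.asarray(edge_profile, dtype=np.float64)
    return float(p.max() - 1.0), float(-p.min())

# Toy edge with a 6% halo on the bright side and a 3% dip on the dark side.
x = np.linspace(-1, 1, 200)
edge = 1 / (1 + np.exp(-30 * x))
edge += 0.06 * np.exp(-((x - 0.3) ** 2) / 0.002) - 0.03 * np.exp(-((x + 0.3) ** 2) / 0.002)
print([round(v, 3) for v in ringing_amount(edge)])  # roughly [0.06, 0.03]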

The post Huawei P50 Pro Selfie test appeared first on DXOMARK.

Apple iPhone 13 Pro Max Selfie test https://www.dxomark.com/apple-iphone-13-pro-max-selfie-test-retested/ https://www.dxomark.com/apple-iphone-13-pro-max-selfie-test-retested/#respond Tue, 20 Sep 2022 16:58:33 +0000 https://www.dxomark.com/?p=124271&preview=true&preview_id=124271 We put the Apple iPhone 13 Pro Max through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of [...]

We put the Apple iPhone 13 Pro Max through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 12 MP 1/3.6″ sensor, 23 mm equivalent f/2.2-aperture lens
  • 3D sensor
  • Cinematic mode for recording videos with shallow depth of field (1080p at 30 fps)
  • HDR video recording with Dolby Vision up to 4K at 60 fps; 4K video recording at 24/ 25/ 30/ 60 fps; 1080p HD video recording at 25 fps, 30 fps, or 60 fps

Scoring

Sub-scores and attributes included in the calculations of the global score.


Apple iPhone 13 Pro Max
DXOMARK Selfie score: 134

Photo: 128
  • Exposure: 86 (best: 92)
  • Color: 84 (best: 105)
  • Focus: 91 (best: 100)
  • Texture: 72 (best: 79)
  • Noise: 62 (best: 94)
  • Artifacts: 83 (best: 89)
  • Flash: 73 (best: 93)
  • Bokeh: 75 (best: 80)

Video: 143
  • Exposure: 86 (best)
  • Color: 87 (best: 90)
  • Focus: 89 (best: 92)
  • Texture: 73 (best: 97)
  • Noise: 56 (best: 83)
  • Artifacts: 87 (best: 92)
  • Stabilization: 78 (best: 82)

Pros

  • Accurate target exposure on face
  • Nice color and skin tones
  • Good detail in indoor and outdoor shots
  • Quite accurate depth estimation in bokeh mode
  • Wide dynamic range and accurate target exposure in video
  • Accurate white balance in video
  • Wide focus range in video means all subjects are in focus

Cons

  • Luminance noise
  • Occasional white balance casts
  • Clipping in challenging backlit scenes
  • Fixed focus means subjects further away from the camera are out of focus
  • High noise levels in video, especially in low light
  • Residual motion in walking videos
  • Loss of detail in low light video

With all iPhone 13 series devices sharing the same front camera specs and processor, it’s fair to assume the Apple iPhone 13 Pro Max Selfie results are very close to those of the iPhone 13 Pro. We have confirmed this by putting the Apple iPhone 13 Pro Max through the complete DXOMARK Selfie test protocol.

In this outdoor shot, you can see that the image output of the two cameras is pretty much identical, with the same high level of detail and good subject exposure. There is some highlight clipping in the brighter background but overall the iPhone front camera deals really well with this difficult backlit scene.

Apple iPhone 13 Pro Max, outdoor selfie
Apple iPhone 13 Pro Max, crop: good detail and subject exposure but highlight clipping in the background
Apple iPhone 13 Pro, outdoor selfie
Apple iPhone 13 Pro, crop: pretty much identical image quality

Not only is still image quality very similar on the 13 Pro Max and the 13 Pro; the same is true for video footage. In this example, you can see that the subject is well exposed but the stabilization system cannot counteract all walking motion.

Apple iPhone 13 Pro Max, good target exposure, some residual motion

Apple iPhone 13 Pro, very similar video performance to the Pro Max

Given the pretty much identical results, we are posting only this short article for the Apple iPhone 13 Pro Max. For the full set of sample images and measurements as well as a complete analysis, please click on the link below and read the full review of the Apple iPhone 13 Pro.

Go to the Apple iPhone 13 Pro Selfie review

The post Apple iPhone 13 Pro Max Selfie test appeared first on DXOMARK.

Apple iPhone 13 Pro Selfie test https://www.dxomark.com/apple-iphone-13-pro-selfie-test-retested/ https://www.dxomark.com/apple-iphone-13-pro-selfie-test-retested/#respond Tue, 20 Sep 2022 16:57:09 +0000 https://www.dxomark.com/?p=124424&preview=true&preview_id=124424 We put the Apple iPhone 13 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our [...]

We put the Apple iPhone 13 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 12 MP 1/3.6″ sensor, 23 mm equivalent f/2.2-aperture lens
  • 3D sensor
  • Cinematic mode for recording videos with shallow depth of field (1080p at 30 fps)
  • HDR video recording with Dolby Vision up to 4K at 60 fps; 4K video recording at 24/ 25/ 30/ 60 fps; 1080p HD video recording at 25 fps, 30 fps, or 60 fps

Scoring

Sub-scores and attributes included in the calculations of the global score.


Apple iPhone 13 Pro
DXOMARK Selfie score: 134

Photo: 128
  • Exposure: 86 (best: 92)
  • Color: 84 (best: 105)
  • Focus: 91 (best: 100)
  • Texture: 72 (best: 79)
  • Noise: 62 (best: 94)
  • Artifacts: 83 (best: 89)
  • Flash: 73 (best: 93)
  • Bokeh: 75 (best: 80)

Video: 143
  • Exposure: 86 (best)
  • Color: 87 (best: 90)
  • Focus: 89 (best: 92)
  • Texture: 73 (best: 97)
  • Noise: 56 (best: 83)
  • Artifacts: 87 (best: 92)
  • Stabilization: 78 (best: 82)

Pros

  • Accurate target exposure on face
  • Wide depth of field
  • High level of detail in indoor and outdoor conditions
  • Quite accurate depth estimation
  • Wide dynamic range and accurate target exposure in video
  • Accurate video white balance
  • Wide focus range means all subjects are in focus in group video selfies.

Cons

  • Luminance noise
  • Occasionally inaccurate skin tones, especially in backlit indoor scenes
  • Slight anamorphosis artifacts (perspective distortion on faces)
  • Low subject exposure when using the flash
  • High noise levels in video, especially in low light
  • Residual motion in walking videos
  • Loss of detail in low-light videos

The Apple iPhone 13 Pro makes it into the upper regions but not quite to the top of the DXOMARK Selfie ranking. Overall performance is very similar to last year’s iPhone 12 series — not a surprise given the similar front camera hardware.

Still, Apple has managed to improve things slightly, thanks to accurate subject exposure, a wide depth of field, and good detail in bright light and indoor shooting. On the downside, the 13 Pro is a little noisier than its predecessor.

The difference compared with its predecessor is a little more noticeable for video, making the iPhone 13 Pro one of the best devices in this category. Apple managed to improve exposure, the color response is now more stable, and the new device also has a wider dynamic range in video.

[glossary_exclude]Apple iPhone 13 Pro Selfie Scores vs Ultra-Premium[/glossary_exclude]
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us for details on how to receive a full report.

[glossary_exclude]Photo[/glossary_exclude]

128

Apple iPhone 13 Pro

147

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]
[glossary_exclude]Apple iPhone 13 Pro Photo scores vs Ultra-Premium[/glossary_exclude]
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

[glossary_exclude]Exposure[/glossary_exclude]

86

Apple iPhone 13 Pro

92

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

84

Apple iPhone 13 Pro

105

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

Target exposure on the face is generally accurate. These samples show the Apple iPhone 13 Pro’s exposure performance in an outdoor scene.

Apple iPhone 13 Pro, accurate target exposure for the face
Apple iPhone 12 Pro Max, accurate target exposure for the face
Huawei P50 Pro, accurate target exposure for the face

Color is generally good but our testers observed some inaccuracies in backlit mixed lighting conditions. These samples show the Apple iPhone 13 Pro’s color performance in an indoor scene.

Apple iPhone 13 Pro, orange skin tone rendering
Apple iPhone 12 Pro Max, orange skin tone rendering
Huawei P50 Pro, accurate skin tone rendering

[glossary_exclude]Focus[/glossary_exclude]

91

Apple iPhone 13 Pro

100

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

This graph compares the Apple iPhone 13 Pro’s focus performance at varying subject distances.

The iPhone 13 Pro’s fixed-focus lens captures high levels of acutance at all distances.

[glossary_exclude]Texture[/glossary_exclude]

72

Apple iPhone 13 Pro

79

[glossary_exclude]Asus ZenFone 7 Pro[/glossary_exclude]

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

This graph shows the Apple iPhone 13 Pro’s texture performance in the lab across different light levels.

Texture comparison: High levels of detail are generally captured in outdoor (1000 lux) and indoor (100 lux) lighting conditions.

These samples show the Apple iPhone 13 Pro’s texture performance in an outdoor selfie at a shooting distance of 55 cm.

Apple iPhone 13 Pro, texture, 55 cm
Apple iPhone 13 Pro, crop: high level of detail
Apple iPhone 12 Pro Max, texture, 55 cm
Apple iPhone 12 Pro Max, crop: high level of detail
Huawei P50 Pro, texture, 55 cm
Huawei P50 Pro, crop: high level of detail

[glossary_exclude]Noise[/glossary_exclude]

62

Apple iPhone 13 Pro

94

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart and the standardized measurement such as Visual Noise derived from ISO 15739.

This graph shows the Apple iPhone 13 Pro’s noise performance in the lab across different light levels.

Visual noise is a metric that measures noise as perceived by end-users. It takes into account the [glossary_exclude]sensitivity[/glossary_exclude] of the human eye to different [glossary_exclude]spatial[/glossary_exclude] frequencies under different viewing conditions.

Noise comparison: Luminance noise is visible in most tested conditions.
These samples show the Apple iPhone 13 Pro’s noise performance under indoor lighting conditions.
Apple iPhone 13 Pro, visual noise
Apple iPhone 13 Pro, crop: noise is visible
Apple iPhone 12 Pro Max, visual noise
Apple iPhone 12 Pro Max, crop: noise is visible
Huawei P50 Pro, visual noise
Huawei P50 Pro, crop: noise is well-controlled

 

[glossary_exclude]Artifacts[/glossary_exclude]

83

Apple iPhone 13 Pro

89

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

The artifacts evaluation looks at lens shading, chromatic aberrations, distortion measurement on the Dot chart and MTF, and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

These samples show slight instabilities in distortion correction over consecutive shots. The camera was mounted on a tripod to capture these samples. If you look closely at the edges you can see the field of view is slightly wider on the left image.

Apple iPhone 13 Pro, slightly wider field of view
Apple iPhone 13 Pro, slightly narrower field of view
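To put "slightly wider field of view" into perspective: with the 23 mm equivalent focal length listed in the specifications, the diagonal field of view follows directly from the full-frame diagonal, and a one-percent change in crop shifts it by roughly half a degree. The sketch below works through that arithmetic; the one-percent crop figure is an illustrative assumption, not a measured value.

import math

def diag_fov_deg(equiv_focal_mm, crop=1.0):
    # Diagonal field of view implied by a full-frame-equivalent focal length
    # (43.27 mm full-frame diagonal). crop > 1 models a slightly tighter
    # framing after a stronger distortion/stabilization crop.
    d = 43.27
    return math.degrees(2 * math.atan(d / (2 * equiv_focal_mm * crop)))

# About 86.5 degrees at 23 mm equivalent, versus roughly 85.9 degrees with a
# 1% tighter crop: a small but visible difference at the frame edges.
print(round(diag_fov_deg(23.0), 1), round(diag_fov_deg(23.0, crop=1.01), 1))
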
[glossary_exclude]Main photo artifacts penalties[/glossary_exclude]

[glossary_exclude]Bokeh[/glossary_exclude]

75

Apple iPhone 13 Pro

80

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

These samples show the Apple iPhone 13 Pro’s bokeh mode performance in an outdoor scene.

Apple iPhone 13 Pro, accurate depth estimation for the subject
Apple iPhone 12 Pro Max, similar results to the iPhone 13 Pro
Huawei P50 Pro, no blur gradient but very good depth estimation

[glossary_exclude]Video[/glossary_exclude]

143

Apple iPhone 13 Pro

154

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

[glossary_exclude]Apple iPhone 13 Pro Video scores vs Ultra-Premium[/glossary_exclude]
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

The Apple iPhone 13 Pro achieves a Selfie Video score of 143. A device’s overall Video score is derived from its performance and results across a range of attributes in the same way as the Photo score. In this section, we take a closer look at these sub-scores and compare video image quality against competitors.

[glossary_exclude]Exposure[/glossary_exclude]

86

Apple iPhone 13 Pro

Best

[glossary_exclude][/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

87

Apple iPhone 13 Pro

90

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Exposure tests evaluate the brightness of the face and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaption of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, stability of the white balance and its adaption when light is changing.

These video samples show the Apple iPhone 13 Pro’s video exposure performance under indoor lighting conditions.

Apple iPhone 13 Pro, accurate target exposure with wide dynamic range. Exposure adaption is also fast and smooth with no visible instabilities

Huawei P50 Pro, lower face exposure

These video samples show the Apple iPhone 13 Pro’s video color performance in an outdoor scene.

Apple iPhone 13 Pro, pleasant and natural skin tone rendering with neutral white balance

Apple iPhone 12 Pro Max, very similar to the iPhone 13 Pro with slightly higher saturation

Huawei P50 Pro, cold white balance with inaccurate rendering on fair to medium skin types

[glossary_exclude]Texture[/glossary_exclude]

73

Apple iPhone 13 Pro

97

[glossary_exclude]Asus ZenFone 6[/glossary_exclude]

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

[glossary_exclude]Texture acutance evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Texture preservation in video is good, but slightly surpassed by the other devices in this comparison.

These video samples show the Apple iPhone 13 Pro’s video texture performance under 100 lux lighting conditions and a distance of 55cm in the lab.

Apple iPhone 13 Pro, the level of detail is good, but lower than the Samsung Galaxy S21 Ultra 5G (Snapdragon)

Apple iPhone 12 Pro Max, the level of detail is good, but lower than the Apple iPhone 13 Pro

Samsung Galaxy S21 Ultra 5G (Snapdragon), high level of texture with lots of fine detail captured

[glossary_exclude]Noise[/glossary_exclude]

56

Apple iPhone 13 Pro

83

[glossary_exclude]Xiaomi Mi 11 Ultra[/glossary_exclude]

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, structure, temporal aspects on real-life video recording as well as videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

[glossary_exclude]Spatial visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
[glossary_exclude]Temporal visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

These video stills show the Apple iPhone 13 Pro’s video noise performance in indoor light conditions.

Apple iPhone 13 Pro, strong luminance noise is visible in all conditions and especially in low light

Apple iPhone 12 Pro Max, strong luminance noise is also visible but it’s slightly better than on the 13 Pro

Huawei P50 Pro, noise reduction is very efficient, even in extreme low light videos

[glossary_exclude]Stabilization[/glossary_exclude]

78

Apple iPhone 13 Pro

82

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, smoothness and jello artifacts, during walk and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

This sample clip shows the Apple iPhone 13 Pro’s video stabilization in outdoor conditions.

Apple iPhone 13 Pro, some residual motion is visible in the background on videos captured whilst walking

Apple iPhone 12 Pro Max, very similar stabilization behavior to the Apple iPhone 13 Pro

Huawei P50 Pro, stabilization is more effective compared to both iPhones with less residual motion visible

[glossary_exclude]Artifacts[/glossary_exclude]

87

Apple iPhone 13 Pro

92

[glossary_exclude]Apple iPhone 12 mini[/glossary_exclude]

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below.

[glossary_exclude]Main video artifacts penalties[/glossary_exclude]

This graph shows the Apple iPhone 13 Pro’s ringing video output. This curve displays the normalized edge profile of the maximum ringing in the field.

Ringing: lab measurements show the iPhone 13 Pro displays more ringing than some of its competitors, but no other serious artifacts are visible in its videos.

The post Apple iPhone 13 Pro Selfie test appeared first on DXOMARK.

Google Pixel 6 Pro Selfie test https://www.dxomark.com/google-pixel-6-pro-selfie-test-retested/ https://www.dxomark.com/google-pixel-6-pro-selfie-test-retested/#respond Tue, 20 Sep 2022 16:56:23 +0000 https://www.dxomark.com/?p=124268&preview=true&preview_id=124268 We put the Google Pixel 6 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our [...]

We put the Google Pixel 6 Pro through our rigorous DXOMARK Selfie test suite to measure its performance in photo and video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 11.1 MP sensor, 1.22 μm pixels
  • f/2.2 aperture
  • 94° field of view
  • Fixed focus
  • 4K/30 fps, 1080p/30fps

Scoring

Sub-scores and attributes included in the calculations of the global score.


Google Pixel 6 Pro
DXOMARK Selfie score: 138

Photo: 135
  • Exposure: 90 (best: 92)
  • Color: 100 (best: 105)
  • Focus: 84 (best: 100)
  • Texture: 62 (best: 79)
  • Noise: 80 (best: 94)
  • Artifacts: 87 (best: 89)
  • Flash: 85 (best: 93)
  • Bokeh: 65 (best: 80)

Video: 143
  • Exposure: 79 (best: 86)
  • Color: 86 (best: 90)
  • Focus: 90 (best: 92)
  • Texture: 75 (best: 97)
  • Noise: 62 (best: 83)
  • Artifacts: 86 (best: 92)
  • Stabilization: 82 (best)

Pros

  • Generally good exposure in photo and video
  • Accurate white balance and nice color
  • Well-controlled noise
  • Pretty wide dynamic range in video
  • Neutral white balance and nice skin tones in video
  • Effective video stabilization

Cons

  • Loss of fine detail
  • Face out of focus at close distance (30 cm)
  • No blur gradient in bokeh mode
  • Noise in video clips, especially in low light
  • Lack of detail in low-light video
  • Hue shifts on face and over sharpening in bright light and indoor video

The Google Pixel 6 Pro offers the best Selfie camera currently available in the US market, besting such esteemed competition as Apple’s new iPhone 13 series and the Samsung Galaxy S21 Ultra. It also takes a position very close to the top of our global ranking, where it is surpassed only by the recent phones from Huawei.

The excellent Photo score is based on a great performance for exposure and color. Google’s HDR+ system delivers nicely exposed portrait subjects and good contrast, even in scenes with strong backlighting. Skin tones are rendered nicely for any type of skin and in all light conditions. Image artifacts are overall very well under control, too.

The Video score is also one of the best we have seen. Stabilization stands out in this category, with the camera compensating very well for handshake and keeping the face steady in the frame. Dynamic range is good, too, but not quite on the same high level as on the latest Apple devices.

Overall the Pixel 6 Pro’s front camera hardware design delivers an excellent trade-off between a wide depth of field that keeps all subjects in group shots in focus, and high light sensitivity, which helps produce good image quality in difficult low light scenes. It’s therefore an easy recommendation to any passionate selfie shooter.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera’s default settings. The photo protocol is designed to take into account the user’s needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us for details on how to receive a full report.

[glossary_exclude]Photo[/glossary_exclude]

135

Google Pixel 6 Pro

147

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]
[glossary_exclude]Google Pixel 6 Pro Photo scores vs Ultra-Premium[/glossary_exclude]
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

[glossary_exclude]Exposure[/glossary_exclude]

90

Google Pixel 6 Pro

92

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

100

Google Pixel 6 Pro

105

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

These samples show the Google Pixel 6 Pro’s exposure performance in bright light. Target exposure is generally accurate and more consistent across consecutive shots than on the competitors. Dynamic range is fairly wide and shadow contrast is better than on the comparison devices.

Google Pixel 6 Pro, accurate target exposure and fairly wide dynamic range, excellent contrast in the shadows (hair and background)
Apple iPhone 13 Pro Max, accurate target exposure and fairly wide dynamic range
Huawei P50 Pro, accurate target exposure and fairly wide dynamic range

This graph shows the Google Pixel 6 Pro’s exposure performance across light levels.

Exposure comparison: the Pixel 6 Pro achieves a brighter exposure in low light than the iPhone 13 Pro and the measured target exposure is generally high.

These samples show the Google Pixel 6 Pro’s color performance in bright light. Skin tones and color are generally accurate. While many devices struggle to produce accurate white balance in scenes with monochromatic backgrounds, the Pixel 6 Pro delivers better results in such conditions than the iPhone 13 Pro and Huawei P50 Pro.

Google Pixel 6 Pro, neutral white balance, nice skin tones
Apple iPhone 13 Pro Max, warm white balance and skin tones
Huawei P50 Pro, desaturated skin tones, accurate white balance

These samples show the Google Pixel 6 Pro's color performance in an indoor setting. In this kind of scene, white balance is generally neutral with nice skin tones across all skin types, even in challenging high-contrast shots. The Apple iPhone 13 Pro generally shows a white balance cast with orange skin-tone rendering.

Google Pixel 6 Pro, neutral white balance, accurate skin tones
Apple iPhone 13 Pro Max, white balance cast, orange skin tones
Huawei P50 Pro, neutral white balance, accurate skin tones

[glossary_exclude]Focus[/glossary_exclude]

84

Google Pixel 6 Pro

100

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

These samples show the Google Pixel 6 Pro’s focus performance at a subject distance of 30 cm. At this close distance, the face is slightly out of focus, with lower sharpness than the comparison devices. At 120 cm (selfie stick distance) sharpness is on the same level as the iPhone 13 Pro.

Google Pixel 6 Pro, focus
Google Pixel 6 Pro, crop: face slightly out of focus
Apple iPhone 13 Pro, focus
Apple iPhone 13 Pro, crop: face in focus
Huawei P50 Pro, focus
Huawei P50 Pro, crop: face in focus

[glossary_exclude]Texture[/glossary_exclude]

62

Google Pixel 6 Pro

79

[glossary_exclude]Asus ZenFone 7 Pro[/glossary_exclude]

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.
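
To make the dead leaves approach more concrete, here is a minimal Python sketch of how a texture acutance figure can be computed from such a chart: the spectrum of the captured crop is compared with that of the reference pattern, the noise floor estimated on a uniform patch is removed, and the result is weighted by a contrast sensitivity function. This is purely an illustration, not the DXOMARK pipeline; the radial binning and the Mannos-Sakrison CSF parameters are assumptions.

```python
import numpy as np

def radial_power_spectrum(patch, n_bins=64):
    """Radially averaged power spectrum of a grayscale patch (frequencies in cycles/pixel)."""
    f = np.fft.fftshift(np.fft.fft2(patch - patch.mean()))
    power = np.abs(f) ** 2
    h, w = patch.shape
    y, x = np.indices((h, w))
    freq = np.hypot(x - w / 2, y - h / 2) / max(h, w)
    bins = np.linspace(0.0, 0.5, n_bins + 1)
    idx = np.digitize(freq.ravel(), bins) - 1
    spectrum = np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                         for i in range(n_bins)])
    return 0.5 * (bins[:-1] + bins[1:]), spectrum

def csf(freq_cpp, pixels_per_degree=60.0):
    """Mannos-Sakrison style contrast sensitivity function (illustrative parameters)."""
    f = freq_cpp * pixels_per_degree  # cycles per pixel -> cycles per degree
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

def texture_acutance(captured, reference, flat_patch):
    """Dead-leaves texture MTF with the noise floor removed, weighted by the CSF."""
    freqs, p_cap = radial_power_spectrum(captured)
    _, p_ref = radial_power_spectrum(reference)
    _, p_noise = radial_power_spectrum(flat_patch)   # uniform patch gives the noise floor
    mtf = np.sqrt(np.clip(p_cap - p_noise, 0.0, None) / np.maximum(p_ref, 1e-12))
    weights = csf(freqs)
    return float(np.sum(mtf * weights) / np.sum(weights))  # CSF-weighted average of the MTF
```

In such a scheme, values close to 1 indicate texture reproduced roughly as the eye would perceive it in the reference, lower values indicate smoothing, and values above 1 usually point to over-sharpening.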

This graph shows the Google Pixel 6 Pro’s texture performance in the lab across different light levels. Measured texture acutance is slightly lower than for the iPhone and Huawei, especially in scenes with motion. This results in more loss of fine detail than on the comparison devices.

Texture comparison: The Pixel 6 Pro front camera delivers high acutance, comparable with the Huawei P50 Pro.

These samples show the Google Pixel 6 Pro’s texture performance indoors.

Google Pixel 6 Pro, indoor texture
Google Pixel 6 Pro, crop: loss of fine detail
Apple iPhone 13 Pro, indoor texture
Apple iPhone 13 Pro, crop: fine detail is preserved
Huawei P50 Pro, indoor texture
Huawei P50 Pro, crop: very fine detail is preserved
[glossary_exclude]Texture acutance evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

[glossary_exclude]Noise[/glossary_exclude]

80

Google Pixel 6 Pro

94

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are also applied.

This graph shows the Google Pixel 6 Pro’s noise performance in the lab across different light levels.

Visual noise is a metric that measures noise as perceived by end-users. It takes into account the [glossary_exclude]sensitivity[/glossary_exclude] of the human eye to different [glossary_exclude]spatial[/glossary_exclude] frequencies under different viewing conditions.
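
As a rough illustration of that idea, the sketch below attenuates the spatial frequencies the eye resolves poorly before measuring the standard deviation of a uniform patch. The Gaussian weighting and its cutoff are placeholders, not the ISO 15739 or DXOMARK parameters.

```python
import numpy as np

def visual_noise(flat_patch, cutoff_cpp=0.05):
    """Toy perceptual noise metric on a uniform gray patch (2-D array of lightness values).
    Spatial frequencies the eye resolves poorly at typical viewing distances are
    attenuated before the standard deviation is computed."""
    patch = np.asarray(flat_patch, dtype=float)
    spectrum = np.fft.fft2(patch - patch.mean())
    fy = np.fft.fftfreq(patch.shape[0])[:, None]
    fx = np.fft.fftfreq(patch.shape[1])[None, :]
    weight = np.exp(-(np.hypot(fx, fy) / cutoff_cpp) ** 2)  # crude stand-in for the eye's CSF
    filtered = np.fft.ifft2(spectrum * weight).real
    return float(filtered.std())
```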

Noise comparison: Noise is well controlled on the Pixel 6 Pro and Huawei P50 Pro. Noise is noticeable on the Apple iPhone 13 Pro.

These samples show the Google Pixel 6 Pro's noise performance under indoor lighting conditions.

Google Pixel 6 Pro, visual noise
Google Pixel 6 Pro, crop: noise is well controlled
Apple iPhone 13 Pro Max, visual noise
Apple iPhone 13 Pro Max, crop: luminance noise
Huawei P50 Pro, visual noise
Huawei P50 Pro, crop: noise is well controlled

 

[glossary_exclude]Artifacts[/glossary_exclude]

87

Google Pixel 6 Pro

89

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

The artifacts evaluation looks at lens shading, chromatic aberrations, and distortion measured on the Dot chart, as well as MTF and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face, among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.
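
Ringing, for example, shows up as an overshoot next to a high-contrast edge. The sketch below estimates it from a one-dimensional profile taken across a dark-to-bright edge; using the first and last quarter of the profile as the dark and bright plateaus is a simplifying assumption, not the SFR-chart procedure used in the lab.

```python
import numpy as np

def overshoot_percent(edge_profile):
    """Ringing estimate: overshoot above the bright plateau of a dark-to-bright edge
    profile, expressed as a percentage of the edge height."""
    p = np.asarray(edge_profile, dtype=float)
    quarter = max(len(p) // 4, 1)
    dark = np.median(p[:quarter])      # assume the profile starts on the dark plateau
    bright = np.median(p[-quarter:])   # and ends on the bright plateau
    height = bright - dark
    if height <= 0:
        return 0.0
    return 100.0 * max(p.max() - bright, 0.0) / height
```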

[glossary_exclude]Main photo artifacts penalties[/glossary_exclude]

Overall, our testers observed few artifacts on the Pixel 6 Pro, and in this respect the Google phone does better than many of its competitors. In these samples you can see ghosting artifacts and white spots in low light.

Google Pixel 6 Pro, artifacts
Google Pixel 6 Pro, crop: white spots
Google Pixel 6 Pro, artifacts
Google Pixel 6 Pro, crop: ghosting can be visible

[glossary_exclude]Bokeh[/glossary_exclude]

65

Google Pixel 6 Pro

80

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

These samples show the Google Pixel 6 Pro’s bokeh mode performance in an outdoor scene.

Google Pixel 6 Pro, no blur gradient, slight depth estimation artifacts
Apple iPhone 13 Pro Max, blur gradient is applied, slight depth estimation artifacts
Huawei P50 Pro, blur gradient is applied, slight depth estimation artifacts

[glossary_exclude]Video[/glossary_exclude]

143

Google Pixel 6 Pro

154

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

[glossary_exclude]Google Pixel 6 Pro Video scores vs Ultra-Premium[/glossary_exclude]
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

[glossary_exclude]Exposure[/glossary_exclude]

79

Google Pixel 6 Pro

86

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

86

Google Pixel 6 Pro

90

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Exposure tests evaluate the brightness of the face and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, and the stability of the white balance and its adaptation when the light is changing.

Video target exposure is generally accurate, even in low light. Dynamic range is fairly wide but not as wide as on the Apple iPhone 13 series. These video samples show the Google Pixel 6 Pro’s video exposure performance in outdoor conditions.

Google Pixel 6 Pro, accurate target exposure on face, wide dynamic range

Apple iPhone 13, wider dynamic range, better shadow detail

Huawei P50 Pro, high contrast on face, shadow clipping

In video, the camera usually produces nice color and skin tones with a neutral white balance. These video samples show the Google Pixel 6 Pro’s video color performance in an outdoor scene.

Google Pixel 6 Pro, neutral white balance, accurate skin tones

Apple iPhone 13, noticeable but acceptable yellow cast

Huawei P50 Pro, slight green white balance cast, inaccurate skin tones

[glossary_exclude]Texture[/glossary_exclude]

75

Google Pixel 6 Pro

97

[glossary_exclude]Asus ZenFone 6[/glossary_exclude]

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

[glossary_exclude]Texture acutance evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

These video samples show the Google Pixel 6 Pro’s video texture performance under 1000 lux lighting conditions and at a subject distance of 55cm.

Google Pixel 6 Pro, good texture and detail

Apple iPhone 13 Pro Max, slightly less detail

Huawei P50 Pro, good texture and detail

[glossary_exclude]Noise[/glossary_exclude]

62

Google Pixel 6 Pro

83

[glossary_exclude]Xiaomi Mi 11 Ultra[/glossary_exclude]

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, structure, and temporal aspects, on real-life video recordings as well as on videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

[glossary_exclude]Spatial visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
[glossary_exclude]Temporal visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.
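
The difference between the two graphs can be illustrated with a short sketch: spatial noise is measured within each frame of a recording of a static chart, while temporal noise is measured per pixel across frames. This is a simplified view that leaves out the perceptual weighting applied in the lab measurement.

```python
import numpy as np

def spatial_noise(frames):
    """Average within-frame standard deviation over a stack of frames (T, H, W)
    cropped to a uniform patch of the chart."""
    frames = np.asarray(frames, dtype=float)
    return float(frames.std(axis=(1, 2), ddof=1).mean())

def temporal_noise(frames):
    """Per-pixel standard deviation across time, averaged over the patch.
    This captures the frame-to-frame flicker that is especially visible in video."""
    frames = np.asarray(frames, dtype=float)
    return float(frames.std(axis=0, ddof=1).mean())
```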

Noise is generally visible in Google Pixel 6 Pro video clips, especially in low light. In comparison, the Huawei P50 Pro is able to output footage with lower levels of noise.

These video samples show the Google Pixel 6 Pro’s video noise performance in low light conditions.

Google Pixel 6 Pro, coarse luminance noise

Apple iPhone 13 Pro Max, high level of noise but slightly lower than on Pixel 6 Pro

Huawei P50 Pro, lower noise

[glossary_exclude]Stabilization[/glossary_exclude]

82

Google Pixel 6 Pro

Best


Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, smoothness, and jello artifacts during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.
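
One simple way to picture "residual motion on the face" is to track a fixed crop, such as the face region, across consecutive frames and measure how much it still moves after stabilization. The phase-correlation sketch below is a plain NumPy illustration of that idea, not the tracker used in the protocol.

```python
import numpy as np

def frame_shift(ref, cur):
    """Estimate the translation between two grayscale crops via phase correlation."""
    spec = np.fft.fft2(ref) * np.conj(np.fft.fft2(cur))
    corr = np.fft.ifft2(spec / (np.abs(spec) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    dx = dx - w if dx > w // 2 else dx   # wrap large positive shifts to negative ones
    dy = dy - h if dy > h // 2 else dy
    return dx, dy

def residual_motion(face_crops):
    """Mean frame-to-frame displacement (in pixels) of a face crop across a clip.
    Lower values mean the face is held more steadily in the frame."""
    shifts = [frame_shift(face_crops[i], face_crops[i + 1])
              for i in range(len(face_crops) - 1)]
    return float(np.mean([np.hypot(dx, dy) for dx, dy in shifts]))
```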

Stabilization on the Google Pixel 6 Pro is generally effective, but some camera shake is still noticeable on faces when walking while recording. Overall the Pixel's performance is quite similar to the P50 Pro's: both devices stabilize the background. In contrast, the iPhone 13 stabilizes the face and shows more camera shake. This sample clip shows the Google Pixel 6 Pro's video stabilization in outdoor conditions.

Google Pixel 6 Pro, effective stabilization while walking

Apple iPhone 13 Pro Max, more motion than comparison devices

Huawei P50 Pro, effective stabilization while walking

[glossary_exclude]Artifacts[/glossary_exclude]

86

Google Pixel 6 Pro

92

[glossary_exclude]Apple iPhone 12 mini[/glossary_exclude]

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts, among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below.

[glossary_exclude]Main video artifacts penalties[/glossary_exclude]

Unnatural rendering artifacts are sometimes visible due to over-sharpening. Hue shifts close to clipped areas can be visible as well. This sample clip was recorded in the lab at 1000 lux.

Google Pixel 6 Pro, hue shift close to clipped areas in bright lab conditions.

The post Google Pixel 6 Pro Selfie test appeared first on DXOMARK.

]]>
https://www.dxomark.com/google-pixel-6-pro-selfie-test-retested/feed/ 0
Samsung Galaxy S22 Ultra (Exynos) Selfie test https://www.dxomark.com/samsung-galaxy-s22-ultra-exynos-selfie-test-retested/ https://www.dxomark.com/samsung-galaxy-s22-ultra-exynos-selfie-test-retested/#respond Tue, 20 Sep 2022 16:55:00 +0000 https://www.dxomark.com/?p=124270&preview=true&preview_id=124270 We put the Samsung Galaxy S22 Ultra (Exynos) through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our [...]

The post Samsung Galaxy S22 Ultra (Exynos) Selfie test appeared first on DXOMARK.

]]>
We put the Samsung Galaxy S22 Ultra (Exynos) through our rigorous DXOMARK Selfie test suite to measure its performance in photo and  video from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key front camera specifications:

  • 40MP 1/2.82″ sensor
  • f/2.2 aperture
  • 26mm equivalent focal length
  • PDAF
  • 4K at 30/60 fps, 1080p at 30fps

Scoring

Sub-scores and attributes included in the calculations of the global score.


Samsung Galaxy S22 Ultra (Exynos)
135
selfie
130
photo
88

92

92

105

95

100

74

79

52

94

82

89

86

93

75

80

144
video
81

86

84

90

83

92

88

97

64

83

88

92

74

82

Pros

  • Good exposure and wide dynamic range in photo and video
  • Accurate white balance and nice color
  • Fast and repeatable autofocus
  • Nice bokeh effect with accurate depth estimation
  • Nice color and skin tones in video
  • High levels of detail in video

Cons

  • High levels of image noise
  • Slightly limited depth of field
  • Ghosting, halo and ringing artifacts
  • Occasional autofocus instabilities in video
  • Camera shake while walking when recording video

The Samsung Galaxy S22 Ultra (Exynos) is among the best phones we have tested for selfie shooting and improves slightly on its predecessor, the S21 Ultra 5G (Exynos). Improvements are most noticeable in depth of field and texture, as well as in video. On the downside, noise levels are higher than on the previous model, and video stabilization is less effective.

This outdoor selfie shows accurate white balance and exposure, as well as a good level of detail.

When shooting still images, the camera handles exposure well, capturing good target exposure on portraits and a wide dynamic range in high-contrast scenes. Colors are nice, with neutral white balance and natural skin tones. However, noise levels are high in all conditions, especially in brighter outdoor light where we can see luminance noise on backgrounds and in shadow areas. The autofocus is quite fast and repeatable, but depth of field is a little limited, which can result in blurry background subjects in group selfies. Our testers also observed some image artifacts, including ghosting, halos, and ringing.

In video mode, exposure and color are managed just as well as for stills, but we did see some autofocus instabilities and depth of field is just as limited as in photo mode. Textures are rendered nicely, with a high level of detail, but the camera’s main drawback in video mode is the ineffective video stabilization when walking while recording. In this type of situation, a lot of camera shake will be noticeable in the video footage.

[glossary_exclude]Samsung Galaxy S22 Ultra (Exynos) Selfie Scores vs Ultra-Premium[/glossary_exclude]
This graph compares overall photo and video DXOMARK Selfie scores between tested devices and references. Average and maximum scores of the price segment are also indicated. Average and maximum scores for each price segment are computed based on the DXOMARK database of devices.

Test summary

About DXOMARK Selfie tests: For scoring and analysis, DXOMARK engineers capture and evaluate more than 1,500 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the front camera's default settings. The photo protocol is designed to take into account the user's needs and is based on typical shooting scenarios, such as close-up and group selfies. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K. For more information about the DXOMARK Selfie test protocol, click here. More details on how we score smartphone cameras are available here. The following section gathers key elements of DXOMARK's exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us to find out how to receive a full report.

[glossary_exclude]Photo[/glossary_exclude]

130

Samsung Galaxy S22 Ultra (Exynos)

147

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]
[glossary_exclude]Samsung Galaxy S22 Ultra (Exynos) Photo scores vs Ultra-Premium[/glossary_exclude]
The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Range of focus and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera's main strengths and weaknesses.

[glossary_exclude]Exposure[/glossary_exclude]

88

Samsung Galaxy S22 Ultra (Exynos)

92

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

92

Samsung Galaxy S22 Ultra (Exynos)

105

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

Exposure and color are the key attributes for technically good pictures. For exposure, the main attribute evaluated is the brightness of the face(s) in various use cases and light conditions. Other factors evaluated are the contrast and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera's ability to provide the same rendering when shooting consecutive images in a row.
For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability.

In this difficult backlit scene, the S22 Ultra manages good exposure and nice contrast on both faces. The camera also does a good job of retaining detail in the brighter background, although some clipping is noticeable. There is slightly better detail in the background of the S21 Ultra image, but face exposure is brighter on the S22 Ultra. Compared to the iPhone, the S22 Ultra shows better highlight retention and contrast.

Samsung Galaxy S22 Ultra (Exynos), good exposure on both faces, good highlight retention in background
Samsung Galaxy S21 Ultra 5G (Exynos), good exposure on lighter skin tone, slight underexposure on darker skin tone, good highlight retention in background
Apple iPhone 13 Pro Max, good exposure on lighter skin tone, slight underexposure on darker skin tone, slight overexposure on background

In this close-up, the two Samsung devices deliver very similar exposure. Compared to the iPhone, clipping in the background is less pronounced and contrast is better. Face exposure is a touch darker, though.

Samsung Galaxy S22 Ultra (Exynos), good exposure on face, slightly overexposed sky
Samsung Galaxy S21 Ultra 5G (Exynos), good exposure on face, slightly overexposed sky
Apple iPhone 13 Pro Max, good exposure on face, overexposed sky

In this graph, you can see that the S22 Ultra delivers good exposure in all conditions. Overall exposure is better than on the comparison devices.

This graph shows lightness measured on the 18% gray patch of the Colorchecker® chart against the light level (in lux). The white area represents the region where the lightness is considered correct.
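
For reference, the lightness plotted here is CIE L*, which can be computed from the average sRGB value of the gray patch using the standard sRGB and CIE formulas, as in the sketch below. The acceptance band in the helper is an invented example, not DXOMARK's tolerance.

```python
import numpy as np

def srgb_patch_to_lightness(patch_rgb):
    """Mean CIE L* of an sRGB patch (float array in 0..1, shape (..., 3))."""
    rgb = np.asarray(patch_rgb, dtype=float)
    linear = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    y = (linear @ np.array([0.2126, 0.7152, 0.0722])).mean()  # relative luminance
    return 116.0 * np.cbrt(y) - 16.0 if y > 0.008856 else 903.3 * y

def exposure_is_acceptable(lightness, target=50.0, tolerance=12.0):
    """Example acceptance test: L* of a correctly exposed 18% gray patch is close to 50.
    The tolerance here is illustrative only."""
    return abs(lightness - target) <= tolerance
```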

In this scene, both Samsungs produce quite neutral white balance and nice skin tones. The red wall in the background is more saturated on the iPhone but skin tones are too red as well.

Samsung Galaxy S22 Ultra (Exynos), neutral white balance, accurate color rendering
Samsung Galaxy S21 Ultra 5G (Exynos), neutral white balance, accurate color rendering
Apple iPhone 13 Pro Max, warmer white balance, slightly reddish skin tones

Under indoor lighting, the white balance is more neutral on the iPhone, but skin tones are still too red. Both Samsung phones again produce very similar color, with a cooler color cast and more neutral skin tones.

Samsung Galaxy S22 Ultra (Exynos), natural skin tones, neutral white balance
Samsung Galaxy S21 Ultra 5G (Exynos), natural skin tones, neutral white balance
Apple iPhone 13 Pro Max, warmer white balance, slightly reddish skin tones

[glossary_exclude]Focus[/glossary_exclude]

95

Samsung Galaxy S22 Ultra (Exynos)

100

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Autofocus tests evaluate the accuracy of the focus on the subject’s face, the repeatability of an accurate focus, and the depth of field. While a shallow depth of field can be pleasant for a single-subject selfie or close-up shot, it can be problematic in specific conditions such as group selfies; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from 30cm to 150cm, and in low light to outdoor conditions.

All three comparison devices come with autofocus systems in the front camera, allowing them to focus correctly across varying subject distances. In the focus range graph below, we can see that all three devices deliver in-focus images across all tested subject distances (any acutance higher than 80% is considered in focus).

This graph shows acutance against shooting distance at a light level of 1000 lux.
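
Applying the 80% rule mentioned above, the in-focus distance range can be read off the acutance-versus-distance measurements roughly as follows; the distances and acutance values in the usage comment are invented for illustration.

```python
import numpy as np

def in_focus_range(distances_cm, acutance, threshold=0.80):
    """Return the (min, max) subject distances at which interpolated acutance stays
    above the in-focus threshold. Measurements must be sorted by distance."""
    d = np.asarray(distances_cm, dtype=float)
    a = np.asarray(acutance, dtype=float)
    fine = np.linspace(d.min(), d.max(), 500)
    a_fine = np.interp(fine, d, a)
    ok = fine[a_fine >= threshold]
    return (float(ok.min()), float(ok.max())) if ok.size else None

# Hypothetical readings:
# in_focus_range([30, 55, 80, 120, 150], [0.86, 0.92, 0.90, 0.88, 0.83])
# -> (30.0, 150.0): in focus across the whole tested distance range
```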

The S22 Ultra’s depth of field is wider than the S21 Ultra’s but still somewhat limited. Background subjects in group shots tend to be slightly out of focus. In comparison, the background is slightly sharper on the iPhone 13 Pro Max.

Samsung Galaxy S22 Ultra (Exynos), depth of field
Samsung Galaxy S22 Ultra (Exynos), crop: accurate autofocus, wide depth of field
Samsung Galaxy S21 Ultra 5G (Exynos), depth of field
Samsung Galaxy S21 Ultra 5G (Exynos), crop: accurate autofocus, slightly limited depth of field
Apple iPhone 13 Pro Max, depth of field
Apple iPhone 13 Pro Max, crop: accurate autofocus, wide depth of field

[glossary_exclude]Texture[/glossary_exclude]

74

Samsung Galaxy S22 Ultra (Exynos)

79

[glossary_exclude]Asus ZenFone 7 Pro[/glossary_exclude]

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in facial features, such as the eyes. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

[glossary_exclude]Texture acutance evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of texture acutance with the level of lux for two holding conditions. The texture acutance is measured on the Dead Leaves chart in the Close-up Dead Leaves setup.

Our lab measurements show that texture acutance on the S22 Ultra is higher than on the comparison devices for most conditions, except very bright and very low light.

Under indoor lighting, the S22 Ultra produces better detail than the iPhone. Textures also look more natural than on the S21 Ultra.

Samsung Galaxy S22 Ultra (Exynos), indoor texture
Samsung Galaxy S22 Ultra (Exynos), crop: high levels of fine detail
Samsung Galaxy S21 Ultra 5G (Exynos), indoor texture
Samsung Galaxy S21 Ultra 5G (Exynos), crop: good fine detail
Apple iPhone 13 Pro Max, indoor texture
Apple iPhone 13 Pro Max, crop: good fine detail

[glossary_exclude]Noise[/glossary_exclude]

52

Samsung Galaxy S22 Ultra (Exynos)

94

[glossary_exclude]Huawei Mate 50 Pro[/glossary_exclude]

Noise tests analyze various attributes of noise such as intensity, chromaticity, grain, and structure on real-life images as well as images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces, but also on dark areas and high dynamic range conditions. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the DXOMARK Dead Leaves chart, and standardized measurements such as Visual Noise, derived from ISO 15739, are also applied.

When shooting outdoors, noise levels on the S22 Ultra are higher than on the comparison phones. Noise is most intrusive in the shadows and background.

Samsung Galaxy S22 Ultra (Exynos), outdoor noise
Samsung Galaxy S22 Ultra (Exynos), crop: high noise levels on background and shadows
Samsung Galaxy S21 Ultra 5G (Exynos), outdoor noise
Samsung Galaxy S21 Ultra 5G (Exynos), crop: slight luminance noise on background and shadows
Apple iPhone 13 Pro Max, outdoor noise
Apple iPhone 13 Pro Max, crop: hardly any luminance noise visible

Things change under indoor conditions and in low light, where noise remains visible but is similar to the S21 Ultra and less intrusive than on the Apple phone.

Samsung Galaxy S22 Ultra (Exynos), indoor noise
Samsung Galaxy S22 Ultra (Exynos), crop: noise quite visible on face and walls
Samsung Galaxy S21 Ultra 5G (Exynos), indoor noise
Samsung Galaxy S21 Ultra 5G (Exynos), crop: noise quite visible on face and walls
Apple iPhone 13 Pro Max, indoor noise
Apple iPhone 13 Pro Max, crop: slight noise on face, strong noise on walls

[glossary_exclude]Artifacts[/glossary_exclude]

82

Samsung Galaxy S22 Ultra (Exynos)

89

[glossary_exclude]Google Pixel 7 Pro[/glossary_exclude]

The artifacts evaluation looks at lens shading, chromatic aberrations, and distortion measured on the Dot chart, as well as MTF and ringing measurements on the SFR chart in the lab. Particular attention is paid to ghosting, quantization, halos, and hue shifts on the face, among others. The more severe and the more frequent the artifact, the higher the point deduction on the score. The main artifacts observed and corresponding point loss are listed below.

[glossary_exclude]Main photo artifacts penalties[/glossary_exclude]

Ghosting artifacts can be noticeable when capturing high-contrast or low-light scenes. Our testers also observed some ringing.

Samsung Galaxy S22 Ultra (Exynos), artifacts
Samsung Galaxy S22 Ultra (Exynos), crop: ghosting on moving elements
Samsung Galaxy S22 Ultra (Exynos), artifacts
Samsung Galaxy S22 Ultra (Exynos), crop: ringing on high-contrast edges

[glossary_exclude]Bokeh[/glossary_exclude]

75

Samsung Galaxy S22 Ultra (Exynos)

80

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to one taken with a DSLR and a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

This difficult scene is handled quite well by the S22 Ultra's bokeh mode. Correctly, no blur has been applied to the plant, as it is in the same plane as the subject, and the simulated aperture makes for a nice overall effect. Some slight depth artifacts are visible, as they are on the comparison devices, which additionally apply some slight blur to the plant.

Samsung Galaxy S22 Ultra (Exynos), slight depth artifacts but natural blur and simulated aperture
Samsung Galaxy S21 Ultra 5G (Exynos), good depth estimation but some depth artifacts
Apple iPhone 13 Pro Max, good depth estimation but some depth artifacts

[glossary_exclude]Video[/glossary_exclude]

144

Samsung Galaxy S22 Ultra (Exynos)

154

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]
About DXOMARK Selfie Video tests

DXOMARK engineers capture and evaluate more than 2 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the front camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000+ lux and color temperatures from 2,300K to 6,500K.

[glossary_exclude]Samsung Galaxy S22 Ultra (Exynos) Video scores vs Ultra-Premium[/glossary_exclude]
Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, or noise, in addition to temporal aspects such as speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

[glossary_exclude]Exposure[/glossary_exclude]

81

Samsung Galaxy S22 Ultra (Exynos)

86

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

[glossary_exclude]Color[/glossary_exclude]

84

Samsung Galaxy S22 Ultra (Exynos)

90

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Exposure tests evaluate the brightness of the face and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed. Image-quality color analysis looks at skin-tone rendering, white balance, color shading, and the stability of the white balance and its adaptation when the light is changing.

In this video scene, the S22 Ultra produces accurate exposure with a good compromise between different skin tones and good detail retention in the bright background. The iPhone’s dynamic range is wider but the exposure is quite dark.

Samsung Galaxy S22 Ultra (Exynos), well-balanced exposure, good highlight retention in background

Apple iPhone 13, well-balanced exposure, good highlight retention in background

Apple iPhone 13 Pro Max, wider dynamic range and better highlight retention but underexposure on faces

Target exposure is slightly better than on the comparison devices in low light and quite similar to the S21 Ultra in bright conditions (slightly over target but still acceptable).

This graph shows lightness measured on the 18% gray patch of the Colorchecker® chart against the light level (in lux). The white area represents the region where the lightness is considered correct.

Video white balance on the S22 Ultra is colder than on the iPhone, but skin tones look more natural than on the predecessor S21 Ultra.

Samsung Galaxy S22 Ultra (Exynos), natural skin tones, neutral white balance
Samsung Galaxy S21 Ultra 5G (Exynos), slightly less saturated skin tones, warm white balance
Apple iPhone 13 Pro Max, warmer white balance, nice skin tones

[glossary_exclude]Texture[/glossary_exclude]

88

Samsung Galaxy S22 Ultra (Exynos)

97

[glossary_exclude]Asus ZenFone 6[/glossary_exclude]

Texture tests analyze the level of details and texture of the real-life videos as well as the videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail on the facial features. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux. The chart used is the Dead Leaves chart.

Measured video texture on the S22 Ultra is similar to S21 Ultra but noticeably higher than on the iPhone 13 Pro Max.

This graph shows texture and edge acutance against the light level (in lux). Texture and edge acutance are measured on the deadleaves chart in the video deadleaves setup.

The level of texture is higher on the S22 than on the S21 and similar to the iPhone.

Samsung Galaxy S22 Ultra (Exynos), very good detail

Samsung Galaxy S21 Ultra 5G (Exynos), good detail

Apple iPhone 13 Pro Max, very good detail

[glossary_exclude]Noise[/glossary_exclude]

64

Samsung Galaxy S22 Ultra (Exynos)

83

[glossary_exclude]Xiaomi Mi 11 Ultra[/glossary_exclude]

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, structure, and temporal aspects, on real-life video recordings as well as on videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to the noise on faces. Objective measurements are performed on the videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

[glossary_exclude]Spatial visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of spatial visual noise with the level of lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. DXOMARK visual noise measurement is derived from ISO15739 standard.
[glossary_exclude]Temporal visual noise evolution with the illuminance level[/glossary_exclude]
This graph shows the evolution of temporal visual noise with the level of lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.

Video noise is well under control under indoor conditions and in low light. Noise levels are lower than on the competitors.

Samsung Galaxy S22 Ultra (Exynos), noise well under control

Samsung Galaxy S21 Ultra 5G (Exynos), more noise and underexposed

Apple iPhone 13 Pro Max, strong noise

The results we can see in the clips above are confirmed by our lab measurements. Noise levels on the S22 Ultra in the lab are lower than on the iPhone 13 Pro Max.

This graph shows temporal visual noise and temporal noise chromaticity ratio against the light level (in lux). Temporal visual noise and noise chromaticity ratio are measured on the visual noise chart in the video noise setup.

[glossary_exclude]Stabilization[/glossary_exclude]

74

Samsung Galaxy S22 Ultra (Exynos)

82

[glossary_exclude]Apple iPhone 14 Pro[/glossary_exclude]

Stabilization evaluation tests the ability of the device to stabilize footage thanks to software or hardware technologies such as OIS, EIS, or any other means. The evaluation looks at overall residual motion on the face and the background, smoothness, and jello artifacts during walking and panning use cases in various lighting conditions. The video below is an extract from one of the tested scenes.

While stabilization works well when holding the camera still, it is not very effective when moving during recording. A lot of residual camera motion is still noticeable in the footage. The S22 Ultra is worse than both comparison devices in this respect.

Samsung Galaxy S22 Ultra (Exynos), ineffective stabilization when moving

Samsung Galaxy S21 Ultra 5G (Exynos), better stabilization

Apple iPhone 13 Pro Max, better stabilization

[glossary_exclude]Artifacts[/glossary_exclude]

88

Samsung Galaxy S22 Ultra (Exynos)

92

[glossary_exclude]Apple iPhone 12 mini[/glossary_exclude]

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated by paying particular attention to artifacts such as quantization, hue shift, and face-rendering artifacts, among others. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts and corresponding point loss are listed below.

[glossary_exclude]Main video artifacts penalties[/glossary_exclude]

Anamorphosis (perspective distortion) can sometimes be noticeable on faces close to the edge of the frame. In this clip, this is especially noticeable on the face on the right.

Samsung Galaxy S22 Ultra (Exynos), anamorphosis at the edge of the frame

The post Samsung Galaxy S22 Ultra (Exynos) Selfie test appeared first on DXOMARK.

]]>
https://www.dxomark.com/samsung-galaxy-s22-ultra-exynos-selfie-test-retested/feed/ 0
Oppo Find X5 Pro Selfie test https://www.dxomark.com/oppo-find-x5-pro-selfie-test/ https://www.dxomark.com/oppo-find-x5-pro-selfie-test/#respond Thu, 08 Sep 2022 12:31:26 +0000 https://www.dxomark.com/?p=110875 The Oppo Find X5 Pro competes with flagship devices from Samsung, Huawei and Apple in the Ultra Premium segment, and this is reflected in the hardware specifications. With a Qualcomm Snapdragon 8 Gen 1 chipset, a 6.70-inch AMOLED display with QHD+ resolution, and triple rear camera with ultra-wide and tele, the new Oppo features top-end [...]

The post Oppo Find X5 Pro Selfie test appeared first on DXOMARK.

]]>
The Oppo Find X5 Pro competes with flagship devices from Samsung, Huawei and Apple in the Ultra Premium segment, and this is reflected in the hardware specifications. With a Qualcomm Snapdragon 8 Gen 1 chipset, a 6.70-inch AMOLED display with QHD+ resolution, and triple rear camera with ultra-wide and tele, the new Oppo features top-end components in many areas.

Things look a little simpler for the front camera which uses a fixed focus lens to channel light onto a 32MP sensor. In video mode, you can record Selfie footage at 1080p resolution and 30 frames per second. Read on to find out how the X5 Pro front camera performed in the DXOMARK Selfie test.

Key front camera specifications:

  • 32MP sensor
  • f/2.4-aperture lens
  • 90° field of view
  • Fixed focus
  • 1080p/30fps video

About DXOMARK Selfie tests: For scoring and analysis in our smartphone front camera reviews, DXOMARK engineers capture and evaluate over 1500 test images and more than 2 hours of video both in controlled lab environments and in natural indoor and outdoor scenes, using the camera’s default settings. This article is designed to highlight the most important results of our testing. For more information about the DXOMARK Selfie test protocol, click here. 

Test summary

Scoring

Sub-scores and attributes included in the calculations of the global score.


Oppo Find X5 Pro
116
selfie
119
photo
81

92

73

105

80

100

65

79

80

94

78

89

78

93

55

80

113
video
74

86

76

90

89

92

69

97

61

83

74

92

38

82

Please be aware that beyond this point, we have not modified the initial test results. While data and products remain fully comparable, you might encounter mentions and references to the previous scores.

Pros

  • Good detail at close range
  • Pretty wide dynamic range
  • Low levels of noise
  • Good focus, decent detail, and nice and vivid color in video
  • Low noise in bright light and indoor video

Cons

  • Exposure instabilities
  • White balance casts, especially in bright light
  • Limited depth of field
  • Image artifacts, including color quantization and unnatural skin rendering
  • Ineffective video stabilization
  • Exposure and white balance instabilities in video
  • Noise in low-light video
  • Hue shift effect close to clipped skin tone areas

With a DXOMARK Selfie score of 89, the Oppo Find X5 Pro is behind the best front cameras in the Ultra Premium segment and delivers a performance that is quite close to previous Oppo devices in lower price segments, such as the Reno6 Pro.

This image demonstrates a wide dynamic range and good exposure but also shows some inconsistent color rendering and halo artifacts.

When shooting still selfies, the camera usually gets the target exposure right and captures a wide dynamic range. However, our testers observed some exposure instabilities. We also saw frequent white balance casts on all types of skin tones. Detail is high at close range in most conditions, but the limited depth of field means subjects behind the plane of focus are rendered soft. Noise is well under control in most situations, but some image artifacts, such as color quantization or unnatural skin rendering, can be visible.

Noise comparison: Noise is well controlled on the X5 Pro in bright light and indoor conditions.

In video mode, the Oppo Find X5 Pro generally records well-exposed footage. Color is acceptable, and detail is fairly decent, given that the Oppo records at 1080p resolution versus 4K on most direct competitors. On the downside, exposure and white balance instabilities are often noticeable in all light conditions. In addition, very poor video stabilization means that videos recorded while in motion are almost unusable.

The post Oppo Find X5 Pro Selfie test appeared first on DXOMARK.

]]>
https://www.dxomark.com/oppo-find-x5-pro-selfie-test/feed/ 0