The flagship representatives of the two major mobile operating systems take very different approaches to Portrait Mode selfies. The Pixel 2 and the iPhone X each have their own solution for producing the best blurred-background self-portrait a 2017 flagship can offer.
Of course, Samsung's Galaxy S8 and Note 8, along with other smartphones, offer solutions of their own, but the devices made by the companies behind the leading mobile operating systems are the ones that appeal to enthusiasts. At least that's how TechLector sees it, having compared these two smartphones.
Apple opted for a hardware-based system: a TrueDepth camera that combines multiple sensors to capture depth in real time. Google took the software path, using machine learning to apply the effect. Here's how the results differ:
You may have noticed right away that the iPhone X managed to brighten the subject, but sacrificed some of the background to do it, blowing out a bit of the highlights. The Pixel 2 handled the dynamic range better, producing a more evenly lit photo.
However, the Google device produced a darker result overall, especially on the subject, which could have used more illumination. Both photos have their strengths and weaknesses.
As for the background blur itself, the Pixel 2 was more aggressive, which looks great, but on close inspection the person appears cropped out and detached from the scene, as if edited in afterward. That is less apparent on the iPhone X, yet Apple's device blurred areas that belong to the person, blending them into the background.
In the end, neither delivers a result that truly impresses, and technically there is still a long way to go. We should remember, though, that these are new techniques, and each company chose a different path to get these photos.
In that sense, Android Police concludes that each company should move toward the other's approach to improve its Portrait Mode selfies: Google should start adding hardware to aid its machine learning, and Apple should add software refinements to its hardware.
Another interesting point: Apple lets the user adjust the blur effect both during capture and afterward. But only one file is saved, and edits always apply to that single photo, which makes it harder to compare different possibilities.
Google has the Pixel save two files: one with the effect and one without. The user can compare both and choose which will play better on social networks, or edit them separately to look for a more satisfying result.
So, which image did you think was best: the one from the Pixel 2 or the iPhone X? Tell us in the comments.