It has been more than three years since Apple introduced the P3 colour space with the release of the iPhone 7. The Display P3 colour space covers a gamut roughly 25% larger than sRGB, which remains the standard in Google Pixel smartphones.
So, what is the difference? According to a report from XDA-Developers, the P3 colour space enables iPhones to capture a wider range of colours and adds depth to images. Android phones, on the other hand, are limited to the sRGB colour range. As a result, iPhones can display photos with richer, more saturated colours, while Google Pixel devices capture the same detail but within a narrower gamut. This is largely because colour management in Android apps has been limited.

That may change with the arrival of the Google Pixel 4 this October. Google announced back in May that wide-gamut colour imaging would come to Android. More recently, the people at XDA-Developers found code in the Google Camera app for capturing the P3 colour gamut. When they tested the feature, they found it could capture colours outside the sRGB colour space. The difference, as the publication notes, is subtle, but P3 support lets the smartphone capture pictures closer to how objects look in real life.

Besides Google Camera, Google Photos also implements colour management on its platform. Given the pace at which the technology is advancing, it seems Google plans to ship P3 colour support in its Pixel 4 smartphones. We must, however, wait until Google reveals more details.
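To make "a gamut roughly 25% larger than sRGB" concrete, here is a minimal sketch in Python that expresses a fully saturated sRGB red in Display P3 coordinates. The conversion matrices are the widely published linear-RGB-to-CIE-XYZ (D65) reference values for sRGB and Display P3 (as given, for example, in the CSS Color 4 draft), not anything taken from the article; they are an assumption of this sketch.

```python
# Sketch: express a fully saturated sRGB red in linear Display P3.
# The matrices below are the commonly published linear-RGB -> CIE XYZ
# (D65) reference values for sRGB and Display P3; treat the exact
# digits as reference figures, not values from the article.

SRGB_TO_XYZ = [
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
]

P3_TO_XYZ = [
    [0.4865709, 0.2656677, 0.1982173],
    [0.2289746, 0.6917385, 0.0792869],
    [0.0000000, 0.0451134, 1.0439444],
]

def inverse_3x3(m):
    """Invert a 3x3 matrix via the adjugate (no external deps)."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [(e * i - f * h), -(b * i - c * h), (b * f - c * e)],
        [-(d * i - f * g), (a * i - c * g), -(a * f - c * d)],
        [(d * h - e * g), -(a * h - b * g), (a * e - b * d)],
    ]
    return [[x / det for x in row] for row in adj]

def mat_vec(m, v):
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def srgb_linear_to_p3_linear(rgb):
    """Convert linear-light sRGB to linear-light Display P3 via XYZ."""
    xyz = mat_vec(SRGB_TO_XYZ, rgb)
    return mat_vec(inverse_3x3(P3_TO_XYZ), xyz)

if __name__ == "__main__":
    p3 = srgb_linear_to_p3_linear([1.0, 0.0, 0.0])
    # sRGB red sits inside the P3 gamut, so every P3 component lands
    # within [0, 1] and the red channel needs less than full drive --
    # P3's red primary is more saturated than sRGB's.
    print(p3)
```

Running it shows the red channel coming out noticeably below 1.0: the P3 red primary is more saturated than the sRGB one, so the camera colours XDA-Developers observed "outside the sRGB colour space" are exactly the P3 values that sRGB cannot reach.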