The camera on the Google Pixel 2 owes much of its quality to a key ingredient called the Pixel Visual Core. The Visual Core was announced in October 2017, but it initially lay dormant, opened up only to developers in Android 8.1 Oreo. With its February 2018 security patch, Google rolled out Visual Core support for third-party apps such as Instagram, WhatsApp and Snapchat.
The Visual Core is a custom image processing unit (IPU) with eight cores dedicated to HDR+ image processing, which uses machine learning to enhance your photos without depending on the phone's SoC.
Until now, third-party apps couldn't fully exploit the camera's computational photography skills because they fell back on the Snapdragon 835's built-in image signal processor for image processing.
That's no longer the case now that Google has rolled out HDR+ support for third-party apps, thanks to the combination of the Neural Networks API introduced in Android 8.1 and the Camera2 API added in Android 5.0 Lollipop.
We tested the Pixel phones with and without the Visual Core enabled, comparing samples from a Pixel 2 XL running the Visual Core update against one without it, and found something really interesting.
Instagram tells a different story
We started with Instagram.
Google has claimed that "the co-processor takes the detail and balance of contrast to a whole new level", which we found to be true to an extent.
The picture below was taken indoors under artificial light, and HDR+ did a good job of bringing out more definition in the details. The exposure also looks more even, but when the two shots are compared, the colour is closer to the source in the image taken on the Pixel 2 without HDR+.
In the picture above, the Instagram camera has tried to balance the exposure, but the image looks washed out in comparison.