You may not have noticed, but your phone most likely does not display the same image in the camera preview (before you take the shot) as in the gallery. Most image processing on Android happens after the photo is taken, which means we only see the real result once the processed photo lands in the gallery. Google put an end to this problem last year with Live HDR+, a feature that makes the preview closely match the final photo before you take it.
HDR processing is very complex: look closely and you will probably find that your phone shows a blown-out sky or burnt highlights in the preview, which are only corrected in the final photo. Google has now explained how it computes HDR in real time in the preview, a rather curious exercise in computational photography.
This is how the “live” Google Pixel 4 HDR works
Most phones don’t display real-time HDR like the Google Pixel 4 does.
Taking advantage of the launch of the recent Google Pixel 4a, Google wanted to explain how its live HDR works. Google says that until recently it could not compute HDR in real time, which prevented it from being shown in the viewfinder. This was first achieved with the Pixel 4, and the technology was also brought to the Pixel 4a. This is Google’s Live HDR+ technology, which makes the preview almost identical to the final photo.
Google’s HDR+ can combine up to 15 photos, each with a different tone mapping
Processing HDR is not easy. Google explains that when you press the shutter on a Pixel, between 3 and 15 frames are captured at different exposures. The magic of Google’s HDR+ is that its algorithm applies a different tone curve to each frame, then remaps the pixels of every frame into the final image. Put simply, HDR+ takes values from the different photos and merges them into a single final photo with maximum dynamic range: highlight and shadow data are processed, the pixels of the different frames are combined, and the final result is produced.
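The idea of merging differently exposed frames, each with its own tone curve, can be sketched in a few lines of Python. This is only a toy illustration, not Google’s actual algorithm: the gamma curves and the “well-exposedness” weighting are assumptions chosen for simplicity.

```python
import numpy as np

def tone_curve(frame, gamma):
    """Apply a simple per-frame tone curve (a gamma curve as a stand-in)."""
    return np.clip(frame, 0.0, 1.0) ** gamma

def merge_hdr(frames, gammas):
    """Weight each tone-mapped frame by how well-exposed its pixels are,
    then average them — a toy stand-in for HDR+'s merge step."""
    mapped = [tone_curve(f, g) for f, g in zip(frames, gammas)]
    # Well-exposedness weight: highest near mid-gray, lowest at the extremes.
    weights = [np.exp(-((m - 0.5) ** 2) / 0.08) for m in mapped]
    total = np.sum(weights, axis=0) + 1e-8
    return np.sum([w * m for w, m in zip(weights, mapped)], axis=0) / total

# Three synthetic "exposures" of the same 4x4 scene.
scene = np.linspace(0.0, 1.0, 16).reshape(4, 4)
frames = [np.clip(scene * e, 0.0, 1.0) for e in (0.5, 1.0, 2.0)]
result = merge_hdr(frames, gammas=(0.8, 1.0, 1.2))
```

Because each frame favors a different tonal range, the weighted merge keeps detail in both shadows and highlights rather than letting either clip.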
Each image requires a different form of tone mapping
The problem, as Google explains, is that HDR+ is a slow process, so previewing it is not straightforward. Computing the tone curve for each of the frames the phone will capture is too expensive to do in real time, so Google’s solution for Live HDR+ is to carry out the process at a small scale.
Google’s solution for computing such complex HDR in real time is to apply the processing to small pieces of the image, keeping the GPU workload manageable.
So Google runs the same process described above, but instead of applying it to the whole scene, it divides the image into small tiles. With this, the phone knows roughly what processing it will apply, and because it only has to work on small pieces of the photo, it can preview the result in real time.
This work is mainly done on the GPU, and to compute the curves locally Google uses a neural network called HDRNet, which can take low-resolution input and predict the curves for the high-resolution viewfinder (the preview we see on screen).
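The tile-based approximation can be sketched as follows: derive one tone curve per tile from a downsampled copy of the frame, then apply those curves to the corresponding regions of the full-resolution image. This is a minimal sketch, not HDRNet itself — the per-tile "curve" is a simple gamma derived from a toy brightness rule.

```python
import numpy as np

def predict_tile_curves(low_res, grid=(2, 2)):
    """Stand-in for HDRNet: derive one gamma per tile from a low-res copy.
    Toy rule (not Google's): darker tiles get a lower gamma (brightening)."""
    h, w = low_res.shape
    th, tw = h // grid[0], w // grid[1]
    gammas = np.empty(grid)
    for i in range(grid[0]):
        for j in range(grid[1]):
            tile = low_res[i*th:(i+1)*th, j*tw:(j+1)*tw]
            gammas[i, j] = 0.5 + tile.mean()
    return gammas

def apply_curves(high_res, gammas):
    """Apply each tile's curve to the matching region of the full image."""
    h, w = high_res.shape
    gh, gw = gammas.shape
    th, tw = h // gh, w // gw
    out = np.empty_like(high_res)
    for i in range(gh):
        for j in range(gw):
            region = np.clip(high_res[i*th:(i+1)*th, j*tw:(j+1)*tw], 0.0, 1.0)
            out[i*th:(i+1)*th, j*tw:(j+1)*tw] = region ** gammas[i, j]
    return out

high = np.linspace(0.0, 1.0, 64).reshape(8, 8)  # "full-res" frame
low = high[::4, ::4]                            # cheap downsampled copy
preview = apply_curves(high, predict_tile_curves(low))
```

The expensive analysis runs only on the tiny downsampled copy, while the full-resolution frame merely has the predicted curves applied to it, which is cheap enough for a live viewfinder.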
The icing on the cake is a feature that only Google has managed to implement to date: dual exposure controls. The Pixel 4 not only displays HDR in the preview, it also allows real-time adjustment. Instead of the classic single slider that raises or lowers the brightness of the whole picture, two small independent sliders let you adjust the highlights and the shadows of the image separately.
Example of the different looks that can be achieved by playing with double exposure.
Source: Frandroid