With Real Tone, Pixel 6 aims to improve your portraits, whatever your skin tone

It makes sense that phone manufacturers are paying extra attention to how faces show up in photos, and the new Pixel 6, announced by Google today, introduces a suite of new AI-powered tools to make humans show up better than ever. The two highlights are Face Unblur — which helps reduce blur on moving faces — and Real Tone. The latter is AI-powered post-processing magic, running on Google’s brand-new Tensor chip, that aims to make faces of all skin tones show up as well as possible.

Whether you’re taking selfies or someone-elsies, the vast majority of photos taken with a smartphone are of human beings. Traditionally, it has been extremely hard to get the exposure right when multiple faces appear in the same photo — especially if those faces have different skin tones. The new Pixel 6 brings a layer of computational photography to the mix to ensure that everyone in the photo looks as good as they can. The Pixel team worked with a diverse set of expert image-makers and photographers to tune the camera’s white balance, exposure and processing algorithms, and Google claims this ensures the photos work for everyone, of every skin tone.

Google highlights that it sees Real Tone as a mission and an improvement on its camera systems, rather than a conclusive solution to the challenges facing photographers. The company has invested substantial resources into ensuring that all people — and particularly people of color — are better represented in the way cameras capture their faces.

“My mother is a dark-skinned Black woman, my father is a white German. My whole life there’s been a question: How do we get one picture where everyone looks good,” said Florian Koenigsberger, Advanced Photography product marketing manager for the Android team, in a briefing interview ahead of the release of the new phones. “The new camera is a step along the journey. Google’s diversity numbers are not a mystery to the world, and we knew we definitely had some shortcomings in terms of lived experience and who could speak authentically to this.”

The camera team worked with photographers, colorists, cinematographers, directors of photography and directors to get a deeper understanding of the challenges in lighting and capturing a diverse set of skin tones — and in particular people with darker skin tones. The team leaned on the experience of a broad spectrum of professionals, including Insecure’s director of photography Ava Berkofsky, photographer Joshua Kissi, and cinematographer Kira Kelly.

“We focused on bringing this really diverse set of perspectives, not just in terms of ethnicity and skin tones, but also a variety of practices,” said Koenigsberger. “The colorists were actually some of the most interesting people to talk to because they think of this as a science that happens in the process of creating images.”

The Google product team gave these imaging experts cameras and challenged them to shoot in extremely difficult imaging situations: mixed light sources, backlighting, interiors, multiple skin tones in one image and so on.

“We had to learn where things fall apart, especially for these communities, and from there we can figure out what direction we can take from there,” Koenigsberger explains. “The imaging professionals were very frank, and they were directly in the room with our engineers. I helped facilitate these conversations, and it was fascinating to see not just the technical learnings, but also the cultural learning that happened in this space. I am talking about ashiness, darker skin tones, textures. The nuances for mid-tones can vary.”

The process starts with the camera’s facial-detection algorithms. Once the camera knows it is looking at a face, it can start figuring out how to render the image in a way that works well. In testing across devices, Google’s team found that the Pixel 6 consistently performed better than phones from competing manufacturers, and even than older-generation Pixel phones.
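To make that pattern concrete, here is a minimal illustrative sketch (not Google’s actual Real Tone pipeline, whose internals are not public) of what “detect faces first, then expose for them” can look like. The OpenCV Haar-cascade detector, the mid-tone target and the gamma adjustment are all assumptions made for the sake of the example.

```python
# Illustrative only: detect faces, then expose the frame for the face regions
# rather than for the scene average. This is not Google's implementation.
import cv2
import numpy as np

def expose_for_faces(image_bgr: np.ndarray) -> np.ndarray:
    """Gamma-correct the whole frame so detected faces land near a mid-tone target."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return image_bgr  # no faces found: leave the frame untouched

    # Average luminance over the detected face regions only.
    face_luma = np.mean([gray[y:y + h, x:x + w].mean() for (x, y, w, h) in faces])
    face_luma = float(np.clip(face_luma, 1.0, 254.0))

    # Pick a gamma that moves face luminance toward a mid-tone (an assumed target).
    target = 118.0
    gamma = np.log(target / 255.0) / np.log(face_luma / 255.0)
    lut = np.array([(i / 255.0) ** gamma * 255 for i in range(256)], dtype=np.uint8)
    return cv2.LUT(image_bgr, lut)
```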

It isn’t immediately clear how the feature works in practice — whether it does global edits (i.e. applies the same filter across the entire image) or whether the AI edits individual faces as part of its editing pass. We are hoping to take a deeper look at this specific aspect of the camera soon.
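For a sense of what that distinction would mean in code, here is a hypothetical sketch of the two approaches: a global edit applies one transform to every pixel, while a per-face edit only touches detected face regions and blends the result back in with a soft mask. Both `adjust` and `face_boxes` are placeholders, not anything Google has described.

```python
# Hypothetical contrast between a global edit and a per-face (local) edit.
# 'adjust' stands in for whatever tone/color transform the pipeline applies.
import numpy as np

def global_edit(image: np.ndarray, adjust) -> np.ndarray:
    # One transform, applied uniformly to every pixel in the frame.
    return adjust(image)

def per_face_edit(image: np.ndarray, face_boxes, adjust) -> np.ndarray:
    # The transform only touches pixels inside each face box, blended back
    # with a feathered mask so there are no visible seams.
    out = image.astype(np.float32)
    for (x, y, w, h) in face_boxes:
        region = out[y:y + h, x:x + w]
        edited = adjust(region)
        mask = np.hanning(h)[:, None] * np.hanning(w)[None, :]  # soft weights, 0 at edges
        out[y:y + h, x:x + w] = region * (1 - mask[..., None]) + edited * mask[..., None]
    return np.clip(out, 0, 255).astype(np.uint8)
```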

The camera team highlights that the work done in this space made the training sets used to create the camera algorithms 25 times more diverse. Real Tone is a core part of the camera algorithms, and it cannot be turned off or disabled.