During today’s Google I/O keynote livestream, the company announced some of the changes coming to Google Maps, Google Photos, and Wear OS. In the camera department, Google announced that it is working to make its Pixel cameras more inclusive of darker skin tones by improving their color accuracy and exposure.
“For people of color, photography has not always seen us as we want to be seen, even on some of our own Google products. To make smartphone photography truly for everyone, we’ve been working with a group of industry experts to build a more accurate and inclusive camera,” said Sameer Samat, Vice President of Android and Google Play.
Google’s presentation explained that it has worked with “expert image makers who have taken thousands of images to diversify” Google’s datasets. It’s also working to improve the accuracy of its auto white balance and auto-exposure algorithms. The goal, as photographer and director Micaiah Carter puts it, is to create “almost like a guidebook to capture skin tone.” This means being able to reduce stray light and bring out more natural brown tones, thus preventing the over-brightening and desaturation of darker skin tones.
Google is also working on its algorithms to capture curly and wavy hair more accurately. This means that cameras will be better equipped to separate any person’s hair from the background in a portrait photo.
Google says these new changes, along with more that weren’t shown at Google I/O, will be coming to Pixel smartphones in the fall. In addition, Google wants to share its findings across the entire Android ecosystem so more OEMs can better represent people of all skin tones.