Google launched the Arts and Culture App as a means to connect people with a vast amount of art work:

The Google Arts & Culture platform hosts millions of artifacts and pieces of art, ranging from prehistory to the contemporary, shared by museums across the world. But the prospect of exploring all that art can be daunting. To make it easier, we dreamt up a fun solution: connect people to art by way of a fundamental artistic pursuit, the search for the self … or, in this case, the selfie.

The idea is that you take a selfie with the app and let its experimental algorithm use facial recognition technology to try to match it to art in the collection.
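We don't know exactly how Google's matching works, but systems like this typically convert a face into a numeric "embedding" vector and then look for the artwork whose embedding is most similar. Here is a minimal sketch of that idea using toy, made-up vectors and cosine similarity; the `best_match` function and the tiny three-dimensional "embeddings" are assumptions for illustration, not Google's actual method.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(selfie_embedding, collection):
    """Return (title, score) for the artwork most similar to the selfie."""
    return max(
        ((title, cosine_similarity(selfie_embedding, emb))
         for title, emb in collection.items()),
        key=lambda pair: pair[1],
    )

# Toy 3-dimensional "embeddings" -- real systems use hundreds of dimensions
# learned by a neural network, not hand-written numbers like these.
collection = {
    "Portrait A": [0.9, 0.1, 0.2],
    "Portrait B": [0.2, 0.8, 0.5],
}
title, score = best_match([0.85, 0.15, 0.25], collection)
```

In this toy example the selfie vector sits closest to "Portrait A", so that is the match returned. The point is that "your match" is just the nearest neighbor in whatever collection and representation the system happens to use.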

Rather fun, right? Perhaps you have already tried it?

But as savvy digital alchemists who have looked at data tracking, hopefully you might be curious about what happens to your face when its image goes to Google. They do say:

(By the way, Google doesn’t use your selfie for anything else and only keeps it for the time it takes to search for matches.)

And it does come up before you use the app:

Do we know that? Hmmmm.

But there’s more to worry about. Many have noticed that the results of the match are much less successful if you don’t have a white face, suggesting there is a racial bias in this app. And it’s more than an art app: facial recognition has much wider uses, and researchers have shown its accuracy is much higher if you are whiter.
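One way researchers surface this kind of bias is to tally the system's accuracy separately for each demographic group rather than as a single overall number. The sketch below shows that bookkeeping with invented records; the group labels and results are hypothetical data for illustration, not findings from any real audit.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, correct) pairs, where correct is a bool.
    Returns a dict mapping each group to its fraction of correct matches."""
    totals = defaultdict(int)
    hits = defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        hits[group] += int(correct)
    return {group: hits[group] / totals[group] for group in totals}

# Hypothetical match records: (group, was the match judged correct?)
records = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]
rates = accuracy_by_group(records)
```

A single overall accuracy of 50% here would hide the fact that one group sees 75% and the other 25%. That gap is exactly what audits of real facial recognition systems have measured.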

We will run our own experiments with the app, which you can get for Android devices and for Apple devices.

By no means are you required to post your own face online to test it; you can flip the camera around to face away from you and try using images of faces. Or even dogs.

No facial match for dogs. Sorry, Felix.


Here I tried using a framed photo of my Mom, who is deceased. Sorry Mom, they made you Pierre.



One of my childhood photos got mapped to a girl.

You may want to think about what is being matched here. And the implications for cameras in public places to do this. Give the app a try on your own face (only if you are willing to) or on some other imagery. Beyond whether the match is “good” or “bad” how do the results show bias in the algorithm or gaps in the data? Beyond the “neatness” of this, what does it mean to match faces?

Respond to This Make

After you respond to this make please share a link to it and a description so it can be added to the ones listed below. You can add it directly to this site.


Guides for this Make

Have you created a helpful guide or do you know of one that might help others with this make? You can share a guide if it is available at a public URL.


Creative Commons License
This work by Alan Levine is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
