Google launched the Arts & Culture app as a means of connecting people with a vast amount of artwork:
The Google Arts & Culture platform hosts millions of artifacts and pieces of art, ranging from prehistory to the contemporary, shared by museums across the world. But the prospect of exploring all that art can be daunting. To make it easier, we dreamt up a fun solution: connect people to art by way of a fundamental artistic pursuit, the search for the self … or, in this case, the selfie.
The idea is that you take a selfie with the app and let its experimental algorithm use facial recognition technology to try to match it to art in the collection.
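Google has not published the details of its matching algorithm, but face matchers of this kind commonly convert a face image into a numeric "embedding" vector and then look for the artwork whose embedding is most similar. The following is a minimal sketch of that idea using cosine similarity; the artwork names and four-dimensional vectors are made up for illustration, and a real system would produce embeddings with a trained neural network:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(selfie_embedding, artwork_embeddings):
    """Return (name, scores): the most similar artwork and all scores."""
    scores = {name: cosine_similarity(selfie_embedding, emb)
              for name, emb in artwork_embeddings.items()}
    return max(scores, key=scores.get), scores

# Made-up embeddings for two hypothetical portraits.
artworks = {
    "Portrait A": [0.9, 0.1, 0.0, 0.2],
    "Portrait B": [0.1, 0.8, 0.3, 0.0],
}
selfie = [0.85, 0.15, 0.05, 0.25]

name, scores = best_match(selfie, artworks)
print(name)  # prints "Portrait A", the closest artwork by cosine similarity
```

Note that a nearest-neighbor search like this always returns *some* match, however poor, and its quality depends entirely on what faces the underlying model was trained on and what artworks are in the collection. That is where the bias questions below come in.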
Rather fun, right? Perhaps you have already tried it?
(By the way, Google doesn’t use your selfie for anything else and only keeps it for the time it takes to search for matches.)
And this notice does come up before you use the app:
Do we know that? Hmmmm.
But there’s more to worry about. Many have noticed that the matches are much less successful if you don’t have a white face, suggesting a racial bias in this app. And it’s more than an art app: facial recognition has much wider uses, and researchers have shown that accuracy is significantly higher for whiter faces.
By no means are you required to post your own face online to test it: you can flip the camera around to face outward and try it on images of faces. Or even dogs.
You may want to think about what is being matched here, and about the implications of cameras in public places doing the same thing. Give the app a try on your own face (only if you are willing to) or on some other imagery. Beyond whether the match is “good” or “bad,” how do the results reveal bias in the algorithm or gaps in the data? Beyond the “neatness” of this, what does it mean to match faces?
Example for "Face Matching to Art Works":