- Wu and Zhang get their own result exactly wrong when they write,
“Unlike a human examiner/judge, a computer vision algorithm or classifier has absolutely no subjective baggages, having no emotions, no biases whatsoever due to past experience, race, religion, political doctrine, gender, age, etc., no mental fatigue, no preconditioning of a bad sleep or meal. The automated inference on criminality eliminates the variable of meta-accuracy (the competence of the human judge/examiner) all together.”
This kind of rhetoric advocates replacing biased human judgment with a machine learning technique that embeds the same bias, and applies it more consistently. Worse, it argues that introducing machine learning into an environment where it can augment or scale up human judgment of criminality can help to make things fairer. In fact it will do the opposite, because humans will assume that the machine’s “judgment” is not only consistently fair on average but independent of their personal biases. They will thus read agreement between its conclusions and their intuition as independent corroboration. Over time it will train the human judges who use it to gain confidence in their own ability to recognize criminality in the same manner.
Our existing implicit biases will be legitimized, normalized, and amplified. We can even imagine a runaway effect if subsequent versions of the machine learning algorithm are trained with criminal convictions in which the algorithm itself played a causal role.
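The runaway effect can be made concrete with a toy simulation. Everything here is an illustrative assumption — the group names, the numbers, and the feedback model itself: two groups share the same true offense rate, an initial human bias in scrutiny enters the training data, and each retrained model's score drives the next round of scrutiny.

```python
# Toy model of the runaway effect: a classifier trained on past
# convictions feeds back into who gets scrutinized, and its output
# re-enters the next round of training data.
# All numbers are invented for illustration, not real data.

TRUE_OFFENSE_RATE = 0.10  # identical for both groups by construction


def retrain(conviction_rates, k=40.0, gamma=1.5):
    """One retraining generation.

    The model's score for a group is simply its observed conviction
    rate. Scrutiny scales superlinearly with the score (gamma > 1),
    standing in for the score influencing several stages at once,
    e.g. both policing and sentencing.
    """
    return {
        group: TRUE_OFFENSE_RATE * min(1.0, k * rate**gamma)
        for group, rate in conviction_rates.items()
    }


# Initial convictions reflect a modest human bias in scrutiny,
# not any difference in underlying behaviour.
rates = {"group_a": 0.055, "group_b": 0.045}

disparity = [rates["group_a"] / rates["group_b"]]
for _ in range(5):
    rates = retrain(rates)
    disparity.append(rates["group_a"] / rates["group_b"])

print([round(d, 2) for d in disparity])  # disparity grows each generation
```

With linear feedback (gamma = 1) the initial disparity merely persists; the superlinear term is what produces the runaway, which is why a score that touches multiple decision points at once is the dangerous case.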
What's the worst that could happen? One of the Italian Renaissance masters said that to truly capture someone's personality you must illustrate them as they're about to speak. I've used that to good effect in portraiture. At the same time, if you told me that what they were about to say mattered more than anything else, I'd believe you. The Canon video is worth watching, if you haven't seen it.

The Faception team are not shy about promoting applications of their technology, offering specialized engines for recognizing “High IQ”, “White-Collar Offender”, “Pedophile”, and “Terrorist” from a face image. [16] Their main clients are in homeland security and public safety. Faception is betting that once again governments will be keen to “judge a book by its cover”.
Oh wow, that's pretty amazing. Come to think of it, I often see news media do this with the photos they choose to accompany an interview. It's almost always something that reinforces or emphasizes the impression people have of whomever the article is about. Two years ago we did a family shoot. I did wonder why the photographer was talking more than he was shooting... it makes sense now.
My wedding spiel was pretty much "I'm going to be shooting a lot of this from a ladder because everybody looks better slightly from above. I'm also going to be yammering like a moron and trying to crack you up because you will never look as great as you do when you're laughing. By all means talk back to me because you don't always look like a moron with your mouth open; on the other hand, from this point forth know that the worst thing you can do when there's a camera pointed at you is eat so you can rest assured I'm going to be guns down during the meal. There will be plenty of shots where you don't see me; I've learned that treating these things like a wildlife photographer on a long lens makes people less nervous because none of you believe your friends and family when they tell you you look spectacular but sometimes I'm gonna be up in your face and dramatic because the person on the other side of the photo album wants to see your pretty face reacting to them. Just remember - I get paid for making you look good, not making you look bad, and I like to think I've earned that money so relax, have fun and remember this isn't about me."
A podcast episode on a related topic is Cathy O'Neil on Weapons of Math Destruction, on EconTalk. It includes discussion of the recidivism rate — sorry, the projected likelihood of returning to prison — how the use of zip codes is a proxy for race, and how recidivism projections punish most those who would have the hardest time reintegrating into society even after a short sentence. They also note how the misuse of data can cause self-reinforcing effects and increasing damage in the future.

From the show notes: Cathy O'Neil, data scientist and author of Weapons of Math Destruction, talks with EconTalk host Russ Roberts about the ideas in her book. O'Neil argues that the commercial application of big data often harms individuals in unknown ways. She argues that the poor are particularly vulnerable to exploitation. Examples discussed include prison sentencing, college rankings, evaluations of teachers, and targeted advertising. O'Neil argues for more transparency and ethical standards when using data.
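The zip-code-as-proxy point can be sketched in a few lines. The zip codes, group sizes, and label rates below are all made up: the protected attribute is dropped from the features, yet a model keyed only on zip code still reproduces the group disparity, because residence is segregated and the labels carry zip-level bias.

```python
# Minimal sketch of a proxy variable: "group" is never shown to the
# model, but zip code correlates with it, so predictions keyed on zip
# alone reproduce the group disparity. All data here is invented.
from collections import defaultdict

# (zip_code, group, labeled_high_risk) -- group is NOT a model feature.
records = (
    [("10001", "A", 1)] * 30 + [("10001", "A", 0)] * 70 +
    [("10001", "B", 1)] * 3  + [("10001", "B", 0)] * 7 +
    [("20002", "B", 1)] * 10 + [("20002", "B", 0)] * 80 +
    [("20002", "A", 1)] * 1  + [("20002", "A", 0)] * 9
)

# "Model": predicted risk is just the observed positive rate per zip.
counts = defaultdict(lambda: [0, 0])  # zip -> [positives, total]
for zip_code, _group, label in records:
    counts[zip_code][0] += label
    counts[zip_code][1] += 1
risk = {z: pos / total for z, (pos, total) in counts.items()}

# Average predicted risk per group, though group was never a feature.
group_scores = defaultdict(list)
for zip_code, group, _label in records:
    group_scores[group].append(risk[zip_code])
avg = {g: sum(scores) / len(scores) for g, scores in group_scores.items()}

print(avg)  # group A scores much higher than group B, via zip alone
```

Within each zip code the two groups are labeled at nearly the same rate; the gap in average scores comes entirely from where each group lives, which is exactly what makes a proxy hard to see from the feature list.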
Just this Friday I was considering buying her book as my next read, but I ended up going for a fantastic book on LIGO and gravitational waves instead. It's only 6 hours on Audible so when I finish my current books I'll give it a listen.