Facial-recognition software might have a racial bias problem http://www.nextgov.com/big-data/201...ftware-might-have-racial-bias-problem/127327/
Time to whip out my mirror glasses and sombrero. Of course, apart from the varying levels of recognition accuracy by race and demographic, the overwhelming problem is that of false positives and absence of redress. As with any biometric, there's a difficult problem of avoidance or repudiation. And you'll get the usual pseudo-junk-science from the lavishly paid forensic "science" companies wanting to deploy this stuff everywhere, and making grand sales pitches to LE.
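To put a number on why false positives are the overwhelming problem: when watchlist targets are rare in the scanned population, even a very accurate matcher flags mostly innocent people. A back-of-the-envelope sketch (all figures illustrative assumptions, not from the article):

```python
# Illustrative base-rate calculation: a face matcher with high accuracy
# still produces mostly false alarms when real targets are rare.
population = 1_000_000   # faces scanned (assumed)
targets = 100            # actual watchlist members in that crowd (assumed)
tpr = 0.99               # true-positive rate (assumed)
fpr = 0.001              # false-positive rate, i.e. 99.9% specificity (assumed)

true_hits = targets * tpr                  # real matches flagged
false_hits = (population - targets) * fpr  # innocent people flagged

precision = true_hits / (true_hits + false_hits)
print(f"Flagged: {true_hits + false_hits:.0f}, of which real: {true_hits:.0f}")
print(f"Chance a flagged person is actually on the list: {precision:.1%}")
```

With those assumed numbers, roughly ten innocents get flagged for every real match, which is exactly where the absence of redress bites.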
Setting aside the technology's capabilities, you also need to consider racial bias in the data: the pool of criminal records used to populate the database will be disproportionately minorities (based on the prison population). This will further increase the degree to which minorities are watched during their daily activities, and further skew the pool of faces used to populate the database.
I can't wait until it's implemented in payment systems: no need for a credit card or a sticky label to pay with, just smile at the camera.
For a little bit of fun, see this for face recognition self-defence: https://www.theguardian.com/technol...illance-clothing-facial-recognition-hyperface
Why not just use camouflage based on collages of actual (or at least, more life-like) faces? That would be harder to train against, I think.
I think the idea is swamping the records with chaff, so it's harder for them to set appropriate thresholds. Maybe a whole range of face-like things, in varying degrees from cartoon to uncanny valley, would be the thing. Add in some orang-utans and chimpanzees for fun. Adding a fake eye or two to one's own physog, plus some make-up, might complete the picture, along with a sombrero and mirror sunglasses.