In a bizarre experiment, the US civil rights watchdog American Civil Liberties Union (ACLU) built a dataset from 25,000 publicly available arrest mugshots and then matched them against photos of members of Congress. They were shocked to see that the 'Rekognition' app falsely matched 28 of them to arrest mugshots.
The ACLU, which runs on an annual sum of around one million dollars according to its official statements, ran the experiment using an out-of-the-box version of the 'Rekognition' software with default settings. The total cost was $12.33, according to the brief story they published on their site. The ACLU points out that people of color were falsely identified at nearly twice the rate of their white counterparts. All told, their standpoint is that using such technologies could easily lead to heavy-handed responses from law enforcement if someone is falsely matched by a facial recognition system before officers act.
The technology delivered by Amazon is – in their own words – "a deep learning powered image recognition service that detects objects, scenes, and faces; extracts text; recognizes celebrities; and identifies inappropriate content in images. It also allows you to search and compare faces". So it is a scalable application for a wide range of visual recognition tasks, priced on a per-use basis (with some features even free). At most, this particular app can identify and track people in a video or image, identify the background scenery (and possibly even locate it), and pick out text, objects, and so on. It is designed to be an integrated part of the buyer's surveillance system, and to this end it is largely optimized to be programmable through an application programming interface (API).
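To give a concrete sense of how little code such an integration takes, here is a minimal sketch of comparing two face photos through Rekognition's CompareFaces operation via the boto3 Python client. The CompareFaces call and its parameters are real; the wrapper function name `face_matches` and the file names are our own illustrative choices, and the snippet assumes the caller supplies a configured Rekognition client.

```python
def face_matches(client, source_bytes, target_bytes, threshold=80.0):
    """Return similarity scores of faces in the target image that match
    the face in the source image.

    `threshold` maps to Rekognition's SimilarityThreshold parameter; the
    service default is 80%, the kind of out-of-the-box setting the ACLU
    test reportedly used. Raising it reduces false matches at the cost
    of more misses.
    """
    response = client.compare_faces(
        SourceImage={"Bytes": source_bytes},
        TargetImage={"Bytes": target_bytes},
        SimilarityThreshold=threshold,
    )
    return [match["Similarity"] for match in response.get("FaceMatches", [])]

# Typical usage (requires the boto3 package and AWS credentials):
# import boto3
# client = boto3.client("rekognition", region_name="us-east-1")
# with open("portrait.jpg", "rb") as src, open("mugshot.jpg", "rb") as tgt:
#     scores = face_matches(client, src.read(), tgt.read(), threshold=99.0)
```

A dozen lines like these, billed per call, are all a customer needs to wire face matching into an existing camera pipeline.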
Yet the technology is freely available in its basic form, while some versions and forks are accessible only to governmental customers (a euphemism for police, intelligence, and the military).
To put this in a broader context, we are talking about two mammoth players battling each other here. One is Amazon, keeper of Amazon SageMaker, a benchmark machine learning and AI platform closely linked with most TensorFlow projects (TensorFlow itself being a brainchild of Google), and a player that has made astronomical sums deploying AI in online projects.
The other is the ACLU, the key player in the USA when it comes to legal battles fought in the name of social justice. Its history is dotted with large-scale victories spanning more than seven decades. It is quite clear that they won't back down anytime soon, because they are well accustomed to protracted legal processes. And a mini-storm is already brewing: according to news reports, nearly 20 groups of Amazon shareholders are pressuring the firm in a letter to Jeff Bezos, the CEO of Amazon, demanding the company stop selling the 'Rekognition' real-time face tracking software to police and law enforcement.
So, the battleground is set.
There are at least two significant issues here for those of us not associated with either side.
First, one has to understand that facial recognition software is flawed, even when it is built by the most advanced players in the field. This means that, on the one hand, we could be identified falsely, and on the other, we could deliberately fool these systems into misidentifying us (or failing to match us against a dataset). So it gives some headway to those with a dislike of public cameras, but at the same time it creates a proportional problem for certain governmental organizations: they have already purchased these faulty systems and will most likely stick to using them anyway, even if sometimes the wrong person gets caught and sentenced.
The second thing to consider is how far companies have gone to be able to identify a human being. Our human weaknesses are certainly being exploited on a grand scale, in this case the unalterability of our physical appearance, such as our body mass and facial features. While this particular technology is so far used only by the regular police forces in Washington County, Oregon, and Orlando, Florida, a myriad of competing products are known to be vying for market share. In an urban setting, chances are you are identified by optical recognition systems at least 20 times a day. And who knows what they think of you?
Sources (for the site’s Sufficient Source Policy, read HERE):