Automatic Face Recognition and Surveillance

ID checks were a common response to the terrorist attacks of 9/11, but they’ll soon be obsolete. You won’t have to show your ID, because you’ll be identified automatically. A security camera will capture your face, and it’ll be matched with your name and a whole lot of other information besides. Welcome to the world of automatic facial recognition. Those who have access to databases of identified photos will have the power to identify us. Yes, it’ll enable some amazing personalized services; but it’ll also enable whole new levels of surveillance. The underlying technologies are being developed today, and there are currently no rules limiting their use.

Walk into a store, and the salesclerks will know your name. The store’s cameras and computers will have figured out your identity, and looked you up in both their store database and a commercial marketing database they’ve subscribed to. They’ll know your name, salary, interests, what sort of sales pitches you’re most vulnerable to, and how profitable a customer you are. Maybe they’ll have read a profile based on your tweets and know what sort of mood you’re in. Maybe they’ll know your political affiliation or sexual identity, both predictable by your social media activity. And they’re going to engage with you accordingly, perhaps by making sure you’re well taken care of or possibly by trying to make you so uncomfortable that you’ll leave.

Walk by a policeman, and she will know your name, address, criminal record, and with whom you routinely are seen. The potential for discrimination is enormous, especially in low-income communities where people are routinely harassed for things like unpaid parking tickets and other minor violations. And in a country where people are arrested for their political views, the use of this technology quickly turns into a nightmare scenario.

The critical technology here is computer face recognition. Traditionally it has been pretty poor, but it’s slowly improving. A computer is now as good as a person. Already Google’s algorithms can accurately match child and adult photos of the same person, and Facebook has an algorithm that works by recognizing hair style, body shape, and body language—and works even when it can’t see faces. And while we humans are pretty much as good at this as we’re ever going to get, computers will continue to improve. Over the next years, they’ll continue to get more accurate, making better matches using even worse photos.

Matching photos with names also requires a database of identified photos, and we have plenty of those too. Driver’s license databases are a gold mine: all shot face forward, in good focus and even light, with accurate identity information attached to each photo. The enormous photo collections of social media and photo archiving sites are another. They contain photos of us from all sorts of angles and in all sorts of lighting conditions, and we helpfully do the identifying step for the companies by tagging ourselves and our friends. Maybe this data will appear on handheld screens. Maybe it’ll be automatically displayed on computer-enhanced glasses. Imagine salesclerks—or politicians—being able to scan a room and instantly see wealthy customers highlighted in green, or policemen seeing people with criminal records highlighted in red.

Science fiction writers have been exploring this future in both books and movies for decades. Ads followed people from billboard to billboard in the movie Minority Report. In John Scalzi’s recent novel Lock In, characters scan each other like the salesclerks I described above.

This is no longer fiction. High-tech billboards can target ads based on the gender of whoever is standing in front of them. In 2011, researchers at Carnegie Mellon pointed a camera at a public area on campus and were able to match live video footage with a public database of tagged photos in real time. Already government and commercial authorities have set up facial recognition systems to identify and monitor people at sporting events, music festivals, and even churches. The Dubai police are working on integrating facial recognition into Google Glass, and more US local police forces are using the technology.

Facebook, Google, Twitter, and other companies with large databases of tagged photos know how valuable their archives are. They see all kinds of services powered by their technologies—services they can sell to businesses like the stores you walk into and the governments you might interact with.

Other companies will spring up whose business models depend on capturing our images in public and selling them to whoever has use for them. If you think this is farfetched, consider a related technology that’s already far down that path: license-plate capture.

Today in the US there’s a massive but invisible industry that records the movements of cars around the country. Cameras mounted on cars and tow trucks capture license plates along with date/time/location information, and companies use that data to find cars that are scheduled for repossession. One company, Vigilant Solutions, claims to collect 70 million scans in the US every month. The companies that engage in this business routinely share that data with the police, giving the police a steady stream of surveillance information on innocent people that they could not legally collect on their own. And the companies are already looking for other profit streams, selling that surveillance data to anyone else who thinks they have a need for it.

This could easily happen with face recognition. Finding bail jumpers could even be the initial driving force, just as finding cars to repossess was for license plate capture.

Already the FBI has a database of 52 million faces, and describes its integration of facial recognition software with that database as “fully operational.” In 2014, FBI Director James Comey told Congress that the database would not include photos of ordinary citizens, although the FBI’s own documents indicate otherwise. And just last month, we learned that the FBI is looking to buy a system that will collect facial images of anyone an officer stops on the street.

In 2013, Facebook had a quarter of a trillion user photos in its database. There’s currently a class-action lawsuit in Illinois alleging that the company has over a billion “face templates” of people, collected without their knowledge or consent.

Last year, the US Department of Commerce tried to prevail upon industry representatives and privacy organizations to write a voluntary code of conduct for companies using facial recognition technologies. After 16 months of negotiations, all of the consumer-focused privacy organizations pulled out of the process because industry representatives were unable to agree on any limitations on something as basic as nonconsensual facial recognition.

When we talk about surveillance, we tend to concentrate on the problems of data collection: CCTV cameras, tagged photos, purchasing habits, our writings on sites like Facebook and Twitter. We think much less about data analysis. But effective and pervasive surveillance is just as much about analysis. It’s sustained by a combination of cheap and ubiquitous cameras, tagged photo databases, commercial databases of our actions that reveal our habits and personalities, and—most of all—fast and accurate face recognition software.

Don’t expect to have access to this technology for yourself anytime soon. This is not facial recognition for all. It’s just for those who can either demand or pay for access to the required technologies—most importantly, the tagged photo databases. And while we can easily imagine how this might be misused in a totalitarian country, there are dangers in free societies as well. Without meaningful regulation, we’re moving into a world where governments and corporations will be able to identify people both in real time and backwards in time, remotely and in secret, without consent or recourse.

Despite protests from industry, we need to regulate this budding industry. We need limitations on how our images can be collected without our knowledge or consent, and on how they can be used. The technologies aren’t going away, and we can’t uninvent these capabilities. But we can ensure that they’re used ethically and responsibly, and not just as a mechanism to increase police and corporate power over us.

This essay previously appeared on Forbes.com.

EDITED TO ADD: Two articles that say much the same thing.

Posted on October 5, 2015 at 6:11 AM

Comments

name October 5, 2015 6:35 AM

“The underlying technologies are being developed today, and there are currently no rules limiting their use.”

Neither are there rules limiting the use of blockers of those underlying technologies, such as scarves, as far as I know.

ianf October 5, 2015 6:58 AM

From what you’re saying, I can only surmise that all the FiveEyes’ spook agencies, the Eastern etc dictators, and those private enterprises that can afford the back-ends needed for massive data mining, continuously & cumulatively download tagged/named pictures from all the accessible “free” services like the Instagram, Flickr, Fuckfacebook & Picasa, (and countless others) in order to build up databases for facial recognition in the future, in as yet not invented product- & usage case scenarios. Is that concise summary by and large correct?

Clive Robinson October 5, 2015 7:38 AM

@ Bruce,

“The companies that engage in this business routinely share that data with the police, giving the police a steady stream of surveillance information on innocent people that they could not legally collect on their own.”

This is the real danger to society of any new technology or methodology.

If there is not strong regulation of companies then abuses will without doubt occur; it’s something history tells us over and over, yet it’s a lesson we don’t want to acknowledge as it gets in the way of carefully constructed myths such as the “American Dream”.

We can see in the US the libertarian argument for “freedom” only it’s not the freedom most would want, in that it’s the freedom to abuse and repress those weaker and set up faux markets to rob or rent people into poverty.

The idea that a Government is benign towards its citizens because of preventative legislation is a sick joke unless the same constraints are applied to all. Because if Peter can not do something but Paul can, what is to stop Paul passing the results back to Peter, and worse, what is there to stop Peter putting Paul into business for just that purpose?

The US has the notion of “fruit of the poisonous tree”, where even though there is evidence of a crime, it can not be used if it was obtained by unlawful means. Because the trust is broken, the evidence itself becomes untrustworthy and subject to more than reasonable doubt.

The idea of “outsourcing” activities to get around legislation meant to keep the Government and its agents in check is more abhorrent than that of unlawfully constructed evidence, because of the openness of the process to abuse.

Further, such “outsourcing” always has an abuse process built in, which is the result of a faux market being created to divert tax income into profit for a select few. Such things occur because a fraction of that diverted tax is spent on elected and other government officials to ensure the process continues for their mutual benefit, not that of the general taxpayer, who just sees an increased tax burden.

Daril Eldrige October 5, 2015 7:56 AM

“Don’t expect to have access to this technology for yourself anytime soon.”

OpenCV (FLOSS) gives you a basic but functional form of facial recognition. I rigged one up to a Raspberry Pi for fun and it works great. You can stream YouTube videos or CCTV footage and watch the software draw little squares around people’s faces, trying to recognize them. Naturally, it’s nowhere near as accurate as the cutting-edge solutions used by big brother companies and TLAs, but the basic technology is very affordable (for better or worse).

http://docs.opencv.org/modules/contrib/doc/facerec/facerec_tutorial.html
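
For anyone who wants to replicate this, here is a minimal sketch of the kind of rig Daril describes, in Python with OpenCV. It only detects faces (recognizing them would additionally require a trained recognizer and a labeled gallery of known people), and the cv2.data.haarcascades path is an assumption that holds for recent opencv-python builds; older installs need the cascade XML path spelled out by hand.

    # Minimal face *detection* with OpenCV's bundled Haar cascade.
    # Detection only: it draws boxes around faces, it does not identify them.
    import cv2

    # Assumes the cascade file shipped with opencv-python; adjust otherwise.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # webcam; a CCTV stream URL works the same way
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("faces", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()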

William Payne October 5, 2015 8:01 AM

The fact is that this technology is already in place. Protesting won’t do much good. Indeed, it may do significant harm. The best, wisest, and sanest course of action is to keep calm and try not to trip any behavioral triggers: Don’t make any uncharacteristic journeys; Don’t make any unnecessary journeys. Stick as close as possible to your daily routine. When in public, don’t sit near to (or make eye contact with) any strangers. Maintain a neutral expression and passive bodily posture at all times. Avoid looking up, and try to keep your breathing, pulse and emotional state calm and even.

NotYouAgain October 5, 2015 8:02 AM

“Despite protests from industry, we need to regulate this budding industry.”

I have never agreed more with you, Bruce.

loco October 5, 2015 8:10 AM

“The underlying technologies are being developed today, and there are currently no rules limiting their use.”

While this statement certainly holds true for the U.S., more civilized countries do have rules governing these cases, e.g. the European Union with its purpose-based processing principle.

loutish tights October 5, 2015 8:15 AM

@name
“Neither are there rules limiting the use of blockers of those underlying technologies, such as scarves, as far as I know.”

Scarves covering the lower half of your face might help moderately, but they are certainly not the most efficient countermeasure. What you really want to address is the “golden triangle,” the diagnostic nodal points between your pupils and the bridge of your nose (that’s where the business end of facial recognition happens). Large sunglasses and/or a baseball cap with a wide visor will kill even the most sophisticated algorithm.

It’s a sad world when your own face becomes an attack vector.

Peter A. October 5, 2015 8:18 AM

“Walk by a policeman, and she will know your name…”

Is it political correctness raised to the level of grammatical incorrectness?

Hopefully, just a typo 🙂

Dr. I. Needtob Athe October 5, 2015 8:37 AM

From the 1985 Terry Gilliam movie, Brazil:

Sam Lowry: My name’s Lowry. Sam Lowry. I’ve been told to report to Mr. Warrenn.

Porter – Information Retrieval: Thirtieth floor, sir. You’re expected.

Sam Lowry: Um… don’t you want to search me?

Porter – Information Retrieval: No sir.

Sam Lowry: Do you want to see my ID?

Porter – Information Retrieval: No need, sir.

Sam Lowry: But I could be anybody.

Porter – Information Retrieval: No you couldn’t sir. This is Information Retrieval.

braff October 5, 2015 8:48 AM

@name & loutish tights:

There are such laws &/or rules, they just haven’t been spread to all places and situations yet. At this point in time, you cannot expect to be allowed to enter a bank with anything that obscures your face enough to make identification hard. And that seems pretty rational and acceptable to most people, including me. However. Expand this to other situations. Maybe you will not be able to buy something in a shop with a card if you are “hiding your face” from cameras. You might be asked to show your face by the ATM when you want to withdraw cash. Maybe wearing a hoodie and a baseball cap will be reason enough for security officers to throw you out from the subway station.
Also, many countries have laws against hiding your face when participating in a public protest, since it is seen as preparing for law-breaking. Which again might seem reasonable, but again can be seen as opening the crack in the door to a police state a bit wider.

Michele October 5, 2015 8:48 AM

“Large sunglasses and a wide visor”. When these become common countermeasures to surveillance, will they be banned? Thinking about the burqa ban in France…

ianf October 5, 2015 9:12 AM

@ William Payne […] “When in public, don’t sit near to (or make eye contact with) any strangers. Maintain a neutral expression and passive bodily posture at all times. Avoid looking up, and try to keep your breathing, pulse and emotional state calm and even.”

That should work. Also a GREAT opportunity for a Prescribed Public Posture app/startup [concise app name REDACTED]. From this post to IPO in 10 months. Parallel income stream from [PAID OPTION] kinetic sensor inputs hidden in the end user’s clothing alerting her/him of improper position. Another [PAID OPTION] on-demand analysis of others’ postures by visual clues.

Rusty October 5, 2015 9:23 AM

It is time we all studied Lon Chaney (senior)’s makeup techniques. For those who don’t know, he was a great silent movie actor and was known as “The Man of 1,000 Faces” as he would radically change his appearance from movie to movie. Sometimes, when watching his movies, even if you know a particular character is Chaney, it sure doesn’t look like him!

So, it’s time to break out the false eyebrows, cheek implants, teeth overlays, body padding or corsets, etc.

Rusty

Chuckly October 5, 2015 9:33 AM

@name
@loutish tights
@braff

In New York City it’s illegal for more than two people to wear masks in public. It’s an arcane law that goes back to 1845, and was updated as recently as 1965. Over the years it has been used against groups ranging from the KKK to Occupy protesters.

But, naturally, it’s just fine for the police to wear balaclavas to conceal their faces when they swarm protesters.

FNK October 5, 2015 10:16 AM

My home was invaded without a warrant because someone who had previously rented the same address was accused of a car theft.
SWAT teams invade wrong addresses and kill people, but are let off with a “whoops, my bad” excuse.

At the very least, the value placed on this information should be considered third rate and not actionable, and a stiff penalty, such as being permanently barred from law enforcement employment, should apply, along with damages times ten, for holding false information and acting on it.
The information should be considered false until proven reliable, meaning it should be treated as third rate until it is upgraded by proof to second rate. It should never hold first class status.

Christophe October 5, 2015 10:59 AM

Dear people,

It happens that I have some experience (about 9 years) with building management system platforms, and a good idea of how systems should communicate with third parties and with their receivers (BMS/AMS/central platforms). The IoT, as proposed today by telecom operators, is not a good solution; let’s say it is not a safe or stable solution at all. It is purely commercially driven.
In fact I’m an electronic security specialist (alarm panels, CCTV, access control, fire detection, …), but for my main project I had to handle everything as project manager/engineer, to be integrated into one single software platform for several control rooms. Too much to tell here.
My conclusion is that the IoT is not safe and not accurate, since it depends on the security, communication and higher-level monitoring platform. But again, too difficult to tell in one message.

I’m actively looking for partners and lobbyists to improve and change the way of thinking that some telecom companies try to broadcast, and also to improve security, etc.

Please contact me on skype: genitronicscg

Marshall October 5, 2015 11:06 AM

Having trouble imagining what a meaningful regulation of this sort of thing would be. In the capitalist world, “meaningful regulation” is something of an oxymoron: you get self-interested compliance or none at all and lie about it. It seems more productive to work for social justice, such that eg poor people are not harassed for parking tickets they are not given the opportunity (income) to pay. Such that I, merely a typical member of society, don’t need to hide who I am and what I do. … Dream on, young hippie.

Daniel October 5, 2015 11:10 AM

@name

just plain wrong. This has been discussed on this blog many times in the past. Even in the USA there are laws banning defense mechanisms.

In any event, can we stop sugar coating this with sweet talk like “personal services”. This is identity slavery. That’s what it is, pure and simple. It is slavery being presented to us as “for our own good”.

Daniel October 5, 2015 11:19 AM

@Marshall

What does meaningful regulation look like? I’d suggest beginning with two parameters. First, a limitation on collection–license plates are a good example. There is no reason why a private entity gets to rape the public space in such a manner. Data that is never collected can never be used. Second, serious limitations with serious penalties regarding content retention. For example, if a private entity needs a security camera, they are allowed to keep that data for any length of time necessary to identify a crime and no more, say 30 days. Then it is deleted forever.

It really isn’t hard. The problem is that technology continues to outstrip the ability of the social organization to adapt.

Lisa October 5, 2015 11:34 AM

Why do I have the feeling that in the near future, burkas will become the preferred choice for both men and women who are privacy conscious?

I hate burkas and other types of clothing associated with gender repression. But I cannot think of a better choice, especially compared to alternatives like wearing masks, which are much more uncomfortable and less accepted by most societies.

On the other hand, if a critical mass of people use burkas or other religious head coverings to hide their identities, some governments might ban them, which may help with gender equality, even if overall human rights are decreased due to diminished privacy.

Orwell Squared October 5, 2015 11:47 AM

Once again the security mob assume no counter-measures will be used to fuck with their technology.

Simplistic in the extreme.

Already the Japanese plan on releasing sunglasses next year that have prevented around 90% of facial recognition matches in tests so far. Therefore, just like the crypto wars where offense and defense are always battling for domination, the same will happen with biometrics and suitable countermeasures.

The same is true for some continual forms of open-source encryption and other platforms arising to screw with the Borg complex they have in train for the internet.

In the end, even if they achieved 100% tracking on-line and real time, all the time, you’ll still find the Minority Report concept is a fantasy. Why? Because of quantum uncertainty at the most fundamental levels of the universe.

I expect they may get to the ‘probability wave’ function re: criminogenic behaviour, but uncertainty will always be present, making their goal an impossible task.

Therefore, it can be concluded that people would still throw Stasi out of 15th story windows in the future, despite their best efforts.

This will probably coincide with the collapse of the US global empire – interestingly enough, it is already in its death throes based on studies of earlier hegemons.

Alien Jerky October 5, 2015 11:59 AM

A few scenarios:

1) Steroidal medication, or some medicine that causes inflammation, face characteristics can change drastically due to the inflammation so would not match the scan.

2) Severe injury that causes scars, such as a broken nose or facial skeleton damage, would not match the scan, and would change over months during healing, with possibly permanent changes. Bandages are not easily removed simply to check identity, and even with the bandages removed the face would not match the database.

3) Hollywood latex makeup can drastically change appearance and, if done properly, is not noticeable. Hollywood has done a good job developing that technology.

4) some people simply look very similar to other people even without being related causing false positives.

paul October 5, 2015 12:10 PM

Not only will this information be collected, it will also be disseminated to the public. Breaches are inevitable.

Wait for the app that hooks into the API of whichever official face-recognition server you want…

Hawkshaven October 5, 2015 12:25 PM

More worrying is the new technology within computers that lets you log into them using facial recognition.

This option was offered to me when I purchased my latest laptop, and I have no idea whether it’s Lenovo or Microsoft that is storing the images.

Personally I turned it down, and opted out of all the sharing settings I could, but it took days, and it was only because I subscribe to blogs such as this that I found out about background torrenting for updates etc.

The average user is giving away their biometric data for free without ever understanding the risks.

ianf October 5, 2015 12:33 PM

Westerners, whether men or women, cloaked in abayas or burkas, will still inadvertently disclose their non-Oriental origin by their gait. And gait research has come at least as far as facial feature recognition, if not further (sports analysis). So, effectively, we’re doomed.

Or maybe not. Clearly, one solution may be everybody who is security conscious wearing extreme cosplay costumes with no less extreme makeup. Then we could have periodic e.g. “KISS-lookalike” days, to confuse and wreak havoc in the data-mining mainframes until they overheat & die. Also, I am told, the KISS makeup is a veritable bimbo magnet (for both sexes).

Alien Jerky October 5, 2015 12:39 PM

@Hawkshaven

“The average user is giving away their biometric data for free without ever understanding the risks.”

I do not use Facebook, twitter,… and I avoid all I can, and opt-out of everything I can.

However, if I go to a friend’s house for dinner, someone takes a picture and posts it to Facebook (or similar), where I am identified in the photo. My face is now entered into the system even if I have no knowledge of the photo being taken. All because someone else is into the narcissistic emptiness that propels them to need others’ recognition of their insecurity to be noticed and liked.

Consider The Source October 5, 2015 12:45 PM

As Bruce mentioned, the driver’s license (and passport) databases provide an excellent source to match live images against. What can we do to degrade the quality of those enrollment images vis-à-vis biometric matching, knowing those images must remain valid for face-to-face encounters?

Are there subtle changes that a human would not necessarily notice, but provide an extreme mathematical variance in biometric analysis?

This is a page from a firm that provides facial recognition software explaining their constraints.

http://www.neurotechnology.com/face-image-recommendations-constraints.html

Bob S. October 5, 2015 12:46 PM

“we need to regulate this budding industry”

…but, that’s not going to happen in the USA, or most of the world.

Indeed, as with most electronic technologies in play these days, it’s like the lawless wild west… for governments and corporations. Regulations, if any, are promulgated to force compliance by targets/adversaries.

That’s in large part due to targets/adversaries accepting their fate in return for the fun and convenience of electronic communication. I suppose one could imagine that one day “they” will go too far, which will prompt mass demand for change.

In my view, “they” already have gone too far time and again, however.

Anura October 5, 2015 1:28 PM

Unless there exists a way to make your facial structure look different, then this is a foolproof way to protect the country from terrorism, and thus we should all submit.

Now, if you’ll excuse me, I’m going to spend the rest of my life at sci-fi conventions.

ratnerstar October 5, 2015 2:18 PM

Everyone take a deep breath and repeat after me: Face recognition from uncontrolled video does not work very well. Face recognition from uncontrolled video does not work very well. Face recognition from uncontrolled video does not work very well.

This is not to say you shouldn’t be thinking about the privacy issues or working to develop meaningful regulations. Those are all good things. But the brave new FR world isn’t here yet, and probably won’t be here for some time.

Taz October 5, 2015 2:58 PM

We can’t change any of this. At least not until the entire order is turned upside down (which may or may not happen).

So I’d sure like to see “actionable” steps we can take to nullify constant surveillance: things we can do to “stick it to the man”.

Collective action rarely works because one can rarely gain agreement on anything. But people finding ways around such systems? That spreads quickly.

An example:

This is probably snake oil, but the idea is attractive: carve out a “members only” section of the internet. Butt sniffers not welcome.

https://www.youtube.com/watch?v=o4AduHdBu28

Perhaps this is only CJDNS?

In any case, I think many of us would like to learn of ways to hide. Perhaps while painting graffiti to thumb our noses at government creeps.

The objective? To render such government efforts so ineffective that all of their perps get fired. That’s what we want as payment. Utter destruction of the butt sniffers’ livelihood.

Anyone with a homeland security history should pay for it for the rest of their lives.

Ever wonder what happened to Stasi? That’s a good subject for a book.

albert October 5, 2015 3:20 PM

FR tech needs to advance a lot more before we throw away our ID cards. If everyone had implanted ID chips*, LE wouldn’t need FR at all.
.
I expect the demand for clandestine plastic surgery will increase dramatically, as well as regulation for said surgery.
.
The problem with surveillance is that those who need it most are the ones conducting it.
…………
. .. . .. _ _ _
* anyone here disagree?

Eric October 5, 2015 3:46 PM

I would be curious to know the extent to which simple things like a hat and sunglasses would throw off the facial recognition software.

JdL October 5, 2015 3:59 PM

“Despite protests from industry, we need to regulate this budding industry.”

Pipe dream. First of all, government is never going to regulate itself, so the most that new legislation/regulation would do is affect private businesses. And how will government enforce these new rules? More Big Brother, that’s how, just like the disastrous drug war. Selective prosecutions, payoffs under the table: if you love these, you’ll love Bruce’s idea.

d33t October 5, 2015 4:13 PM

Where did I put my CMOS/CCD, pixel-burning, medium-powered RGB laser rifle with night vision scope? <– Joke … It’s illegal to be funny (or stupid) now, right?

I’ve never tried this at home or office BTW and I don’t suggest doing so yourself, probably wouldn’t work at all anyway 🙂 Maybe it would?

Björn Persson October 5, 2015 4:21 PM

Alien Jerky:

some people simply look very similar to other people even without being related causing false positives.

And in a database of hundreds of millions of people, doubles are probably more common than one might think. And many users of the system won’t care about false positives as long as they’re relatively rare.

Once in a while the computer will make a mistake. A low-quality picture makes the matching uncertain, but the computer still selects one identity as the most likely match. People tend to trust that what’s written on a computer screen is the Truth, and suddenly you’re arrested for something you haven’t done. If you’re lucky the mistake will get cleared up the next day, after you’ve only been abused a little.

Of course humans also make mistakes, but humans instinctively know when they are uncertain. They’ll proceed with caution and try to verify their suspicions. But if a computer says that the guy over there is a criminal, then the guy over there IS a criminal.
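
Björn’s scale point can be made concrete with a back-of-envelope calculation; every number below is invented purely for illustration.

    # Illustrative base-rate arithmetic; all figures are assumptions.
    gallery_size = 100_000_000   # identified photos to search against
    false_match_rate = 0.001     # i.e. "99.9% accurate" per comparison

    # One probe face searched against the whole gallery:
    print(gallery_size * false_match_rate)   # 100000.0 expected false matches
    # Even a thousandfold better rate leaves mistakes routine at this scale:
    print(gallery_size * 0.000001)           # 100.0 expected false matches

At that scale the one true match, if it exists at all, sits in a pile of lookalikes, which is exactly why what the human downstream of the screen does with an uncertain match matters so much.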

PQ October 5, 2015 4:58 PM

A family member had her license revoked, so I was driving her around in my car. She was still a registered driver of my car for insurance purposes while we fought for the renewal of her license. One night, driving home on Rt 2 from a concert in western Massachusetts, I was pulled over by a state trooper who had scanned my license plate and wanted to check that the other registered person wasn’t driving my car.

As soon as he got to the car window, he had to realize that I was driving, and she was the passenger. He told me there was nothing wrong, and that he was checking to make sure she wasn’t driving. We both had to present our licenses. However, he went on to demand information about where we had been, what the concert was, where we were going, etc. It was a very disturbing encounter, especially when I realized this could happen to me repeatedly anytime I was driving my car or her car in the state.

I got the impression this trooper routinely used his license plate scanner to “hunt” for victims who could contribute to the state’s coffers, or who might add to his arrest tally.

Alice October 5, 2015 5:44 PM

@Bob

it’s like the lawless wild west

Yes indeed. Much of this was created by Google, which introduced the “opt-in by default” policy to global companies. Now most companies are eager to track their customers and collect more information for profit.

@ALL
Have you ever played “Watch_Dogs” (the game)? It has automatic face recognition on smartphones. If you haven’t played it yet, I recommend it. That is the future of the USA, one which most citizens can’t resist.

Government will use the driver’s license database AND the DV database(*) to track USA citizens and non-USA people (such as illegal immigrants).

Jon October 5, 2015 5:58 PM

SOLUTION:

Poison the databases.

Fill them with as much garbage as you can. Fill out all the forms with plausible but wrong information. Every chance you get, give a different, common, name.

Go get rewards cards – As many as you can, in as many different names and addresses as you can think up.

Use canary traps – A slight mis-spelling of your name, and anyone who then uses it, well, you know where they got it from.

I’m seeing a next generation of criminal hacking not so much copying the data and putting it out there in public, but flooding the database with very similar but defective records. Tamper with a few legitimate records, wipe the logs, and the entire database is worthless.

Not entirely unlike the current U.S. ‘No-fly List’.

Have fun,

Jon

paranoia destroys ya October 5, 2015 6:00 PM

Five years ago, Apple’s iPhoto was able to correctly match my 80-year-old mom based upon the photos of her as a child that were the first ones we added to it.

Clive Robinson October 5, 2015 6:01 PM

@ d33t,

“I’ve never tried this at home or office BTW and I don’t suggest doing so yourself, probably wouldn’t work at all anyway 🙂”

Want to guess again?

It can work at two levels. The first is that it just overloads the sensor and causes temporary flare. The second is that it actually burns holes in the sensor, causing permanent damage.

The amount of power required depends on a number of things, such as the frequency/colour, how coherent and how tightly collimated the beam of the laser is, the absorption and reflection effects of lenses and filters in front of the sensor causing attenuation of the beam at the sensor, and finally the characteristics of the sensor itself.

If all is in your favour, a 3mW laser will be more than sufficient to cause flare but borderline for permanent damage. A 50mW laser of the type that produces green light by crystal conversion will easily set a match head on fire at a couple of meters and burst darker coloured balloons at similar distances, and will fairly quickly damage not only the CCTV sensor but your retina as well. Higher-powered pulse lasers will damage the sensor so fast that there is a good chance it will not be visible as an initial flare on the video signal.

If you want to maximise the blocking effect, then three laser diodes of near IR, red and green have been found to render useless the police surveillance cameras of the type mounted on street lights to watch people’s houses.

One way to increase the effect of a laser pointer is to first focus a telescope on the camera and then shine the pointer into the eyepiece…

Black and white CCTV sensors, whilst more sensitive in low light conditions than colour sensors, are also much more prone to damage by coherent collimated light sources.

Oh, and if you can get an industrial CO2 laser, then you don’t need to worry too much about getting it to shine into the camera lens, just cut the whole camera off of its mount 😉

albert October 5, 2015 6:25 PM

@Björn Persson,

“…And many users of the system won’t care about false positives as long as they’re relatively rare. Once in a while the computer will make a mistake….”

LE very often doesn’t care about anything beyond arresting a suspect and closing a case.

Computers don’t make mistakes; programmers do. Garbage In, Garbage Out. Even ‘good’ data can be processed by bad algorithms.

“Ladies and gentlemen of the jury, I ask for a conviction of Jane Doe, because our FR software is 95% accurate, whereas her alibi is only 85% believable.”

Statistics is useful, but not for determining guilt or innocence.

. .. . .. _ _ _

Elliot October 5, 2015 6:34 PM

Facial recognition is included in software surreptitiously. I have a phone with a camera that supposedly doesn’t have facial recognition, nor ‘face unlock’, but after a few months, if I took a photo of a person, the editing routine would pop up a face recognition square over the eyes when cropping the photos. That feature is not in any of the documentation on the phone (and I have never updated the firmware, so it’s the same version).

Then I took a photo of a colander, and the cropping routine “saw” several pairs of eyes in the pic. I was then unable to crop it at all unless I chose one of the “faces” to start from. I don’t use Picasa or anything that backs up the phone to the cloud, but if FR thinks colanders are people, there are many photogenic colanders and polka-dot fabrics people might like to add to their galleries 😉

d33t October 5, 2015 6:34 PM

@Clive Robinson

I was being a little catty / cautious … for certain it would work, at least well enough. One could bring along a selection of lenses to try locally, to help offset the power lost to lenses at the other end too.

A high powered spitball gun with automatic toilet paper / glue feed would be good for a follow up to the laser shot too.

LessThanObvious October 5, 2015 7:00 PM

I fear the public is too ill informed to even begin to raise the attention level to the point required in order for legislators to take any action. Not only is it possible for this technology to be abused, it is virtually guaranteed to be abused. I have only marginal fear at the present state of technology as they don’t seem to be able to match people at times they would clearly want to, like when an unknown person robs a store or even a bank. I wish I could be optimistic, but I just don’t see the legal changes required being even a possibility at present. All the money and power in the public and private sector is on the side of facial recognition without limitation. The fact that ubiquitous facial recognition and the evil and control that come with it is inhuman and against all principles of freedom and American values will not stop them, or even so much as give them pause.

CyEnt October 5, 2015 7:30 PM

Blocking technologies like scarves (and Hoodies) declare your political allegiance out loud.

Does a TV show want to portray a Baddie Stereotype? What does he wear? Baseball hat and hoodie.

Walk into a shop wearing that attire…. the floor walkers watch you double close.

Meet a policeman whilst wearing that attire… He looks double hard for reasons to not like your face (or race) so he may apply a “don’t like your face” harassment law.

Alien Jerky October 5, 2015 7:40 PM

Hack the system and re-reference every face of a baddie to be a senator or congressman

ianf October 5, 2015 11:58 PM

@ Alien Jerky indexed […] “because someone else is into the narcissistic emptiness that propels them to need others’ recognition of their insecurity to be noticed and liked.”

Whoa! BIG WORDS… just looking for some tush. However, there’s no other strategy to counter a repeat of such than by essentially severing the connection to said people. Start by disowning the worst offenders, and learn to detect the presence of cameras in your line of sight. An auntie I’ve known disliked being shot in profile so much that she developed an instinct to turn towards any pointing camera with a me-babe smile on her face. If she could do that, so can you in reverse (or obscure the image in some “natural” fashion).

@ ratnerstar […] “Face recognition from uncontrolled video does not work very well.”

Depends on how you define “doesn’t.”
Depends on how you define “work.”
Depends on how you define “for whom.”
Depends on how you define “well.”

@ PQ […] “got the impression this trooper routinely used his license plate scanner to “hunt” for victims who could contribute to the state’s coffers, or who might add to his arrest tally.”

YOU DON’T SAY! But, since that’s what essentially his job amounts to, would you, the enemy of ordinary hardworking people, rather see him draw unemployment, or turn to crime—illegal I mean.

@ Charlie […] “This is why wearing a hoodie is integral to OPSEC.”

Rrrrright, provided you wear it in front to occlude your face. A hoodie SCREAMS… something. Obviously I don’t get it, because the main principle of any OPSEC is NOT TO DRAW attention to oneself. If that means sporting a 6-gal white Stetson hat to blend in with the natives, so be it. At least you’ll be protected from FR-drones AND melanoma.

Moreover, a hoodie is a stock Hollywood semaphore that someone is a blue-collar hood, ergo unable to think beyond the most obvious visage camouflage (and that meme has already spread all the way to Paris, France!). This is criminal Bruno following a wounded policewoman moments after he stabbed her—she was corrupt, so he has our sympathy (“Ne Le Dis À Personne”). Plus, it didn’t prevent him from being photographed, so there!

tyr October 6, 2015 12:54 AM

One big fad in the USA is to mount video cameras on traffic signals, ostensibly to monitor traffic flows. However, once installed, bureaucrats think up all kinds of ways to use and improve the technology. Most people are completely out of the loop on what governments have been doing for years. You only need to upgrade one camera on a busy arterial street and feed it into a recognition system and it will collect huge numbers of faces with matching license plates. Freeways have had this kind of camera system in place for years. There wasn’t any vote to decide before the stuff was installed either.

I saw a camera with multiple-focus capability yesterday on someone’s smart phone, so you can take pictures of multiple faces and have all of them in focus. That looked ideal for the kind of facial recognition needed to scan a busy area.

thevoid October 6, 2015 1:33 AM

@Alien Jerky, Björn Persson

“4) some people simply look very similar to other people even without being related causing false positives.”

there is a saying that if you travel far enough, you will meet your double. it is true. i know of three cases personally.

the first i did not witness myself, but people had been telling my mother there was a security guard at a local coliseum who looked exactly like her brother. after years of people telling her this, she witnessed it herself, and was quite surprised how true it was.

the other cases i did witness personally. one is a bus driver who looks almost identical to a good friend of mine, down to hairstyle/facial hair, build, and gestures. only a few facial features were (slightly) different. perhaps a computer might recognize those differences, but i bet most humans can’t.

the last case i saw just a few weeks ago, was someone who looked exactly like ME. my mother called to tell me about it, and i turned on the tv, to see someone who looks pretty much exactly like me. the guy had the same build, the same hair (color, length, curl), the same beard, etc. my mother said his gestures were like mine as well. looking straight on, he didn’t look exactly like me, but that was about the only angle he didn’t look like my identical twin, and still most of the features were the same. once again, most humans at least probably wouldn’t be able to tell the difference.

having doppelgangers is nothing new though. many dictators use them (famously Saddam). Hitler is said to have had a few as well.

ianf October 6, 2015 2:37 AM

@ tyr […] “Most people are completely out of the loop on what [local/ regional transit authorities, rather than] governments have been doing for years. […] Freeways have had this kind of camera systems in place for ages. There wasn’t any vote to decide before the stuff was installed either.”

They were put up for traffic flow/congestion controls—the greater common good, so didn’t require debating. Back then, the resolution of the original cameras wasn’t up to FR tasks.

Now, on the other hand, the summons for a speeding fine includes a photo of the car taken from a portal some 20m ahead, with the quite-recognizable face of the driver (~1cm² in size), and with that of any passenger (machine-detected, then pixelated). This is presumably to assure the payee that the authority RESPECTS THE PRIVACY of his bit on the side—even if the summons is challenged and winds up in court!

@ thevoid doppelgangers

Not to mention the hordes upon hordes of obviously cloned Elvis impersonators!

ianf October 6, 2015 4:52 AM

@ remo

That wasn’t @Bruce’s cameo, but a major Major, if not THE key speaking part!

What I don’t get is that (no doubt unbeknownst to canned Bruce in advance) the filmmakers taped a tablet to a scooter rather than rent a proper telepresence robot (or at least tape it to an upmarket Segway model), because they thought nobody would notice. Well… I noticed. It’s a thankless job but somebody’s got to do it.

Björn Persson October 6, 2015 5:51 AM

albert:

Computers don’t make mistakes; programmers do.

That’s true in one sense, but it’s applicable here only if there is a known algorithm that, if correctly implemented, will identify people with 100% accuracy and not a single false positive, even in the presence of identical twins, lookalikes and less than perfect pictures. I dare say that there is no known such algorithm.

spotty nail October 6, 2015 5:55 AM

@Consider the Source
“Are there subtle changes that a human would not necessarily notice, but provide an extreme mathematical variance in biometric analysis?”

What could work:

-Splitting the photo in two and creating a mirror image of one side of the face
-Moving the eyes closer together / further apart by a bit
-Pushing your nose up or down slightly

What doesn’t work that well:

-Facial hair
-Clear-lens glasses
-Different hair styles
-Hats or other headgear
-Traditional (moderate) makeup

John Campbell October 6, 2015 6:11 AM

I am not so sure this will work across the board. There are people who seem to be from clone-lines.

I (and my wife) have seen others who “look more like me than I do”, and I have been asked “How long have you been playing the violin?” by a woman holding up a photo of “me” playing a violin in the local paper. Even I couldn’t tell us apart from the photo. One of “me” was a toll collector on the Pennsylvania Turnpike, too.

I have commented that the whole “God broke the mold” remark should be seen as more of an insult, ‘cuz, when He gets it right, He stamps out as many copies as He can… and I know that I’m not from very early in the manufactured series.

(chuckles)

[HUMOR MODE=”Tin-Foil Hat”]
Facial recognition is just a scam to cover up either the RFID chips placed in our bodies or the advanced “remote” DNA scanners that have been built into cameras.
[/HUMOR]

Yvonne Kroner October 6, 2015 6:13 AM

@ratnerstar:
“Everyone take a deep breath and repeat after me: Face recognition from uncontrolled video does not work very well.”

Thanks for the therapeutic mantra. Alas, the sad truth is that facial recognition works actually rather well. So much so that local law enforcement is using run-of-the-mill CCTV cameras (we’re not talking national intelligence in lab conditions here) to automatically identify individuals in the street. Guess who’s at the forefront rolling these out in a friendly street near you? Yes, it’s the UK again. These are not experiments, this is real and it’s happening now.

Here’s a company that advertises automated FR specifically for run-of-the-mill CCTV cameras:
http://www.nec.com/en/global/solutions/safety/face_recognition/NeoFaceWatch.html

Here’s the scheme in practice (last year):
http://www.engadget.com/2014/07/16/uk-police-first-trial-face-recognition-tech/

The fact that the technology works well enough with standard CCTV equipment sends a shiver down my spine when I see long range lenses (the type surveyors use on total stations) mounted on the wall in train stations like Paddington. If they can get FR to work with run-of-the-mill cameras, imagine what they’re doing with big telescopic lenses.

Jan Clouden October 6, 2015 6:37 AM

Re. “some people simply look very similar to other people”

This approach is misguided. Computer FR works very differently from human FR. People who look very similar to us can be easily distinguished by the algorithm. Similarly, people who are easy for us to distinguish can lead to a false positive in computer FR.

The reason is that FR algorithms don’t “look” at faces. They detect diagnostic points in the face (the golden triangle being key, as mentioned by someone else above). The algorithm then measures the distance between these diagnostic points and mathematically finds the nearest match from a database, based on geometric calculations.

Therefore, the most efficient way to fight back against automated FR is to prevent the algorithm from finding these diagnostic points at all, so the calculation cannot be initiated. Big shades and caps are indeed your best bet when in public.
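
To make the geometric idea concrete, here is a toy sketch in Python. The three landmark points, their coordinates, and the identities are all hypothetical, and real systems use many more points plus normalization for scale, pose and lighting, which this deliberately omits.

    # Toy geometric matching over facial landmark points (hypothetical data).
    import math

    def signature(landmarks):
        # All pairwise distances between landmarks, as a feature vector.
        pts = list(landmarks)
        return [math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:]]

    def nearest(probe, gallery):
        # Gallery identity whose signature is closest to the probe's.
        sig = signature(probe)
        def score(item):
            return sum((x - y) ** 2 for x, y in zip(sig, signature(item[1])))
        return min(gallery.items(), key=score)[0]

    # Made-up (x, y) positions of left pupil, right pupil, nose bridge:
    gallery = {
        "alice": [(30, 40), (70, 40), (50, 55)],
        "bob":   [(28, 42), (74, 42), (51, 60)],
    }
    print(nearest([(29, 41), (73, 41), (51, 59)], gallery))  # -> bob

Note that this nearest-match step always returns somebody: without a distance threshold, a probe face that is not in the gallery at all still gets “matched” to its closest lookalike, which connects directly to the false-positive worries raised above.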

Jonathan Wilson October 6, 2015 7:03 AM

One area I see this being even worse for the general population is with the fact that more and more private and municipal security cameras are being made available to law enforcement.

So cameras in places like retail shops, transit stations, fast food joints, shopping centers, airports, movie theaters, banks, government offices, parking garages, office buildings, schools and everywhere else are being combined with cameras out on the street into one giant camera feed that law enforcement can use whenever they need to.

Think of that scene in that one Batman movie where they have every camera in Gotham visible on a massive bank of monitors in the Batcave. Now imagine that combined with automated facial recognition. Imagine a facial recognition computer system scanning every camera looking for matches to a list of anyone law enforcement (or entities law enforcement is acting on behalf of) has a reason to want to find.

Oh, and don’t assume that everyone on that watch list is “bad” or should be in jail either. It won’t just be drug dealers or mass murderers or terrorists or other bad people they are looking out for.

George Orwell got it right when he spoke of “big brother” knowing everywhere we go and looking for anyone doing something that could be considered “thoughtcrime”. He just got the date about 20 years too early…

Curious October 6, 2015 8:26 AM

I am inclined to think that the marvels of facial/body recognition would sink even lower in people’s eyes if it turns out that such things are best used in combination with other identifying features. Someone tracking you might not be happy simply knowing where your mobile phone is, nor with having you more or less identified via facial recognition, but only when both, or more, systems identify you at once.

albert October 6, 2015 12:24 PM

Photo/video surveillance certainly has some beneficial uses, and even in those situations it can be abused. (‘Red light’ cameras are almost always abused when first installed, by drastically reducing the yellow light duration. Once folks learn ‘DON’T GO THROUGH RED LIGHTS!’ [and the yellows go back to normal] things get better.)
.
The worst abuse of surveillance is the suppression of criticism and protest. Propaganda in the form of censorship, marginalization, and ridicule is working right now, but as a population reaches the boiling point, martial law ‘needs’ to be invoked. It is then that critics become ‘instigators’ and protesters become ‘terrorists’.
.
All of a sudden, you’re living in a fascist state, and you didn’t even see it coming!
.
. .. . .. _ _ _

AJWM October 6, 2015 2:11 PM

@Dr. I. Needtob Athe

Hah, I’d forgotten that Brazil scene when I wrote the following. Carson has just been asked to do the equivalent of signing The Official Secrets Act, or to turn around and leave.

Carson was tempted to do just that. This smelled too much like he was about to get drafted for something. On the other hand, he was really curious now, and other than teaching at the university, he had no other commitments. “All right, ‘I understand and so agree’. Don’t you need my fingerprint or something?”
“Oh, I got that already from the scanner in the doorbell. If you hadn’t been you, you wouldn’t have gotten in here in the first place.”

(From The Chara Talisman, which is sci-fi. The sequel (The Reticuli Deception) includes a (non-trivial, but theoretically possible with current technology) method to evade routine DNA ID checks, and retina and iris scans.)

ratnerstar October 6, 2015 2:33 PM

“Thanks for the therapeutic mantra. Alas, the sad truth is that facial recognition works actually rather well.”

FR works rather well under certain conditions.

“So much so that local law enforcement is using run-of-the-mill CCTV cameras (we’re not talking national intelligence in lab conditions here) to automatically identify individuals in the street.”

Lots of local law enforcement agencies try lots of things, because they get sold a bunch of BS from vendors.

“Guess who’s at the forefront rolling these out in a friendly street near you? Yes, it’s the UK again. These are not experiments, this is real and it’s happening now.”

There are definitely many, many deployed instances of surreptitious FR around the world. The question is: do they work well? Or, more precisely, are they capable of doing the things people here are worried about? The answer is no.

“Here’s a company that advertises automated FR specifically for run-of-the-mill CCTV cameras:”
http://www.nec.com/en/global/solutions/safety/face_recognition/NeoFaceWatch.html

I’d advise everyone here to take all FR vendor claims with a very large shaker of salt.

“Here’s the scheme in practice (last year):”
http://www.engadget.com/2014/07/16/uk-police-first-trial-face-recognition-tech/

The article provides absolutely no information about how well the system works (or what it is precisely intended to do).

“The fact that the technology works well enough with standard CCTV equipment sends a shiver down my spine when I see long range lenses (the type surveyors use on total stations) mounted on the wall in train stations like Paddington. If they can get FR to work with run-of-the-mill cameras, imagine what they’re doing with big telescopic lenses.”

The issue with accurately performing FR from video is generally not one of resolution (you can get a perfectly usable image from webcam or cheap CCTV camera). It’s one of control over the conditions of capture. It’s also a question of computational power, manpower, and (crucially) the gallery to be matched against.

FR works well under certain circumstances. Roughly from more to less effective:
– 1:1 matching, generally between a live face and a high quality still image. This works well enough that a number of countries have e-gate systems to automate border controls. The system compares the live capture of the face to the one stored on an e-passport; if it matches, it lets you through.
– 1:N/N:N matching of high quality still images. For instance, people applying for identity documents such as driver’s licenses and passports. The systems take the images submitted in the application and search the database to ensure individuals don’t receive multiple documents (“deduplication”). These aren’t 100% accurate and generally require some human review, but the systems work well enough to eliminate most fraud.
– 3D FR. There are a lot of cool things going on with three dimensional FR. It’s not a field I know as much about, but from what I’ve seen 3D systems are very accurate. Unfortunately, they generally are also a) very expensive, b) require special cameras, and/or c) can’t use legacy photographs. This negates most of the various advantages of FR … why not just use iris? Most 3D face systems I’ve seen are used for facility access control.

Those are your big three good FR use cases. From here, things get a lot more iffy.
– Surreptitious 1:W, that is, identifying “bad guys” from surveillance video. Yes, you can do it … sort of. If you’re willing to accept fairly high error rates. And if your gallery of bad guys is fairly limited. And if you have good-quality still images of the bad guys to match against. No matter what, you’re going to get a lot of errors … and you need human beings to manually review the results you get. The bigger your target group (both in terms of the individuals you’re surveilling and the database of bad guys), the more people and time you’re going to need to handle exceptions. Oh, and computing power. You’re going to need a lot of that. It very rapidly becomes infeasible (the toy sketch below shows why).
– Surreptitious 1:N, that is, identifying “everyone” from surveillance video. This is simply not possible right now, unless N is very small (in which case we’re basically just talking about 1:W).
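
To put the 1:W scaling problem in numbers, here is a back-of-the-envelope sketch; the per-comparison false-positive rate, watchlist size, and daily traffic are all hypothetical, chosen only to show how fast human review becomes the bottleneck:

```python
# Toy numbers only: why surreptitious watchlist matching drowns
# reviewers in false positives as the deployment scales up.

def expected_false_alarms(faces_per_day, watchlist_size, fp_rate):
    """Every captured face is compared against every watchlist entry;
    each comparison carries a small chance of a spurious match."""
    return faces_per_day * watchlist_size * fp_rate

# Assume a 0.01% false-positive rate per comparison, a 1,000-person
# watchlist, and a busy station seeing 50,000 faces a day:
alarms = expected_false_alarms(50_000, 1_000, 1e-4)
print(f"~{alarms:,.0f} false alarms per day to review by hand")  # ~5,000
```

A seemingly tiny per-comparison error rate multiplies out to thousands of exceptions a day, each needing a human to look at it.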

You guys don’t have to believe me, I’m just some dude from the Internet. But that’s basically the lay of the land.

Susan Richart October 6, 2015 5:45 PM

NJ DMV is using FRT. I had to renew my license in person despite the fact that NJ allows 2 renewals using the same photo; I have had only one such renewal.

Why did I have to do my renewal in person? Because in my prior picture I was wearing glasses. NJ DMV now requires that your photo be taken without your glasses on, so that they can run that photo through FRT.

LessThanObvious October 6, 2015 5:50 PM

@Jan Clouden

“The reason is that FR algorithms don’t “look” at faces. They detect diagnostic points in the face (the golden triangle being key, as mentioned by someone else above). The algorithm then measures the distance between these diagnostic points and mathematically finds the nearest match from a database, based on geometric calculations.”

I wonder if there would be any value in posting tagged pictures of yourself in which you have manually altered the facial geometry.
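
If the quoted description is roughly right, the matching step amounts to nearest-neighbour search over vectors of inter-landmark distances. A minimal sketch of that idea, with invented landmarks, values, and threshold (real systems are far more sophisticated):

```python
# Minimal sketch: reduce each face to distances between landmark points
# (e.g. eye-to-eye, eye-to-nose, nose-to-mouth), then return the nearest
# enrolled identity. Gallery values and threshold are made up.
import numpy as np

gallery = {
    "alice": np.array([6.2, 4.1, 3.3]),
    "bob":   np.array([5.8, 4.6, 3.0]),
}

def match(probe, gallery, threshold=0.5):
    """Nearest gallery identity, or None if nothing is close enough."""
    name, dist = min(
        ((n, float(np.linalg.norm(probe - v))) for n, v in gallery.items()),
        key=lambda pair: pair[1],
    )
    return name if dist <= threshold else None

print(match(np.array([6.1, 4.0, 3.4]), gallery))  # -> alice
```

Geometrically altered photos would shift those distance vectors, which is why poisoning the tagged-photo gallery could at least degrade this kind of matcher.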

albert October 6, 2015 6:15 PM

@Susan,
Try tinted contacts next time 🙂

Seems like your glasses would have to be of the coke-bottle variety to fool FR systems.

. .. . .. _ _ _

tyr October 6, 2015 9:24 PM

I keep remembering the guy from Oregon who was positively IDed by fingerprints at the Madrid train bombing. It didn’t matter that he was never anywhere near Spain; they just knew it was him, right up until they captured the real bomber. Fingerprints were sold as infallible, and FR is being hyped the same way. Most forensics is exactly the same: none of it will stand up to scientific verification. For most people it doesn’t matter, until they decide you are the one they are looking for. No amount of giant databases or computer algorithms is going to fix this until we get some science into the justice system.

@ianf

They were put in place by the same bunch who think that static loading is the way to test a structure. Dynamic load testing is done by the real engineers (it’s too hard for civil engineers) because math. There’s a lot of infrastructure in the modern world that just happened this way. It is one reason for a lot of cruel jokes about engineers involved in public works.

@albert

If you can document that (short yellows), you can own the agency with a lawsuit. I had to force an engineer to stop using a short yellow once. Her reason for it was “because I like it”. I explained that when she went to court, that wasn’t going to be a good enough reason. If you standardize, you train drivers to know what to expect, and you have a reasonable defense against tort liability. Drivers have enough to worry about without random inputs from nitwits.

John October 7, 2015 1:20 AM

You act like people want to protect their identity; sorry, but those are edge cases. The next generation will love the convenience of the next real-life Facebook.
Walk into a shop, bam, we’ll serve your favorite coffee and play music you like. We’ll direct you towards people with common interests, and you’ll be able to share your latest life entry with friends with a simple nod of the head*. *Telemetry data will be collected and shared among our subsidiaries.™

Clive Robinson October 7, 2015 3:42 AM

@ John,

“Walk into a shop, bam, we’ll serve your favorite coffee and play music you like”

I was once treated to a little piece of that “hell on earth” you describe by a hotel chain…

They assumed that what you did on your first visit was what you would want from all visits, right down to which jam/jelly you put on your toast in the morning… aghhhhh….

Perhaps I’m unusual in that my favourite in most of the minor things in life is variety. The one thing it taught me is that I could not live as the Super Rich are portrayed; such blandness would lead to madness.

Curious October 7, 2015 4:06 AM

@John

As for “will be loving the convenience of”: that sounds pretty much like loving convenience as such, which would be trivially true in all cases. And the notion of people acting as if they want to protect identity stops making sense once one incorporates the idea of privacy, and of concern for privacy needs, into it, I would argue. The purported joy of living a life in which telemetry data drives common services reminds me of some kind of fallacy: the error of simply assuming that something is true in all or most cases. Heh, I tried to quickly look that particular fallacy up on Wikipedia, but couldn’t find it this time around.

greg October 7, 2015 6:03 AM

It is time we just accept that information technology eventually leads to, well, lots and lots of searchable personal information. No amount of do-not-track flags or extra legislation will change that fundamental fact.

We need to consider equal access to all that information, because that information is always going to be there now.

Curious October 7, 2015 6:20 AM

@Greg

I’ve pointed out in some other thread that recycled personal data (recurrence) could, and would, be a consequence if data about people is just there to be grabbed, because of how businesses don’t respect people’s privacy.

I don’t think your comment above is really asking people to accept a future in which there’s lots and lots of searchable personal information, as if asking for acceptance were a meaningful thing to ask before the fact, or as if people had a say in how today’s practices on the internet work. Instead, it seems to me that you are trying to trivialize the importance of privacy issues on the basis of a fatalistic view, or an idealized view of a future in which corporations can toy around with personal data as they see fit.

I can’t help but wonder if some US court will eventually be so brazen as to put forth an argument in which any privacy-right issue brought before it is brushed off as “unrealistic”, due to the amount of personal data already sloshing around on the internet. Of course, dumbing down a huge issue about the practices of storing personal information that way would be to deny future people a right to privacy.

Joe October 7, 2015 9:05 AM

Clive Robinson: “Oh and if you can get an industrial CO2 laser, then you don’t need to worry too much about getting it to shine in the camera lens, just cut the whole camera off of its mount ;-)”

Some pretty bad science/technology advice there. You need to be able to actually focus the beam at that distance, which is not trivial. Granted, even a white light source of the right intensity in one spot will be severely effective at this sort of mischief. There are ways to automatically detect camera lenses, too, BTW. The scary part is making sure it’s an actual camera. Only a nut would actually combine these ideas. A telescope will likely diverge the beam, so unless the output is measured in watts you’re going to have a low watts/m² figure. It could probably be adjusted for with the right lens.

Only a worse nut would want to arrest people for ‘giving people ideas we don’t like’. I’m sure in the UK everyone describing this technology would automatically be a ‘cyberarms smuggler’/‘terrorist’, given what they do if you write a book on how to make firearms (and are stupid enough to show said firearm in the UK? Not exactly clear on that case…), or make bad (and obvious to anyone over 50 IQ) jokes on Twitter.
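
To put the divergence point in rough numbers, a back-of-the-envelope sketch assuming a simple linear divergence model (the power, exit radius, and divergence angle are all hypothetical):

```python
# Beam power spread over the spot area at a given distance, assuming
# the spot grows linearly with range. Numbers are hypothetical.
import math

def irradiance(power_w, exit_radius_m, divergence_rad, distance_m):
    """Average irradiance (W/m^2) over the beam spot at distance."""
    spot_radius = exit_radius_m + distance_m * math.tan(divergence_rad)
    return power_w / (math.pi * spot_radius ** 2)

# A 40 W beam with a 5 mm exit radius and 2 mrad divergence, at 50 m:
print(f"{irradiance(40, 0.005, 2e-3, 50):.0f} W/m^2")  # prints ~1155
```

Tens of watts at the aperture end up spread over a spot roughly 20 cm across, which is why focusing at distance is the hard part.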

ianf: There’s a machine that was useless for blocking ALPR technology (due to, you know, being illegal and damn obvious when used), but you can automatically disrupt cameras with flare via your own flash bulbs worn near the face. If people don’t take a hint, tell them it’s a religious thing. It might not even be totally a lie someday. Of course, then they’ll think you’re a technophile Amish, which will blow their minds. 😛

sam: Continuum and the device the ‘time police’ used that made someone look like a white blur to cameras. Funny how SciFi is always ahead of the curve here!

tyr: Some FBI agent got in trouble for attempting a collision-statistics analysis on fingerprint databases. The excuse was “unauthorized investigation” or some such nonsense. Apparently, investigating the quality of the evidence is itself against the rules.

An irony is that the best eye protection against lasers is to completely block the eyes and use cameras. They’re automatically safe, by design. The worst you can do is output all-white pixels, which is of course within the limits of the screen.

Grammar Pedant October 7, 2015 11:32 AM

@Bruce and many other technical writers:
Walk by a policeman, and she will know your name, …

May I suggest “they”, as explained in https://en.wikipedia.org/wiki/Singular_they. No need for positive discrimination. 🙂

I.e.: Walk by a policeman, and they will know your name, …

Similarly in the article before:
… if your HR or legal person needs to get involved, she has to be able to use it without any training.

could instead be:
… if your HR or legal person needs to get involved, they have to be able to use it without any training.

Wael October 7, 2015 12:06 PM

@Grammar Pedant,

Walk by a policeman, and they will know your name, …

Two things:
1- “Policeman” is masculine, regardless of generalizations.
2- A “policeman” is singular; “they” isn’t appropriate.

The pronoun should match its antecedent! Thus, the sentence should be: Walk by police and they will…

ianf October 7, 2015 4:21 PM

@ Chris, this is one daft move:
“Mastercard is going to use FR for payment authorization”

If I decoded this corporate puff piece correctly, MC aims to simplify the validation process in order to lower the current >50% transaction abandonment rate, due to online shoppers forgetting passwords & PIN codes, which leads to “huge dropout for merchants” (while creating cash-cow opportunities for others, but never mind ;-))

So they expect people to selfie themselves into their smartphone app within perhaps 30 seconds of the BUY button being clicked. This is supposed to be 100% dependable and work right every single time. That’ll be the day. It also sounds like an opt-in choice, and they’ll still have to support the “legacy” passworded method. Ergo #fuggedaboutit. Next invention, please.

ADMINISTRIVIA @ tyr

Misaddressed, happens to the best of people.

@ianf
“They were put in place by the same bunch who think that static loading is the way to test a structure. Dynamic load testing is done by the real engineers”

JakeL October 8, 2015 10:17 AM

Without meaningful regulation, we’re moving into a world where governments and corporations will be able to identify people both in real time and backwards in time, remotely and in secret, without consent or recourse.

Not only that, they will also be able to frame individuals for crimes never committed, should the need arise (e.g. for a false-flag attack, if nothing else).

Michael October 9, 2015 1:14 AM

“Because if Peter can not do something but Paul can what is to stop Paul passing the results back to Peter”

It’s what is called deterrence. There is no stopping a narc playing the role of Peter from passing Pamela off as Paul, thereby recollecting her unauthorized act.

Chris Pugson October 9, 2015 5:31 AM

This sounds like a gift to criminals. It raises possibilities for social engineering to new and unimagined levels.

PJ October 10, 2015 11:08 PM

Good luck getting the players in this to be responsible and discreet.

I suspect this will all be washed away when the revolution happens. Coming soon to a country near you…

TRX October 12, 2015 5:53 PM

The problem with using facial recognition as an ID is that there are fewer individual faces than there are people.

I have an identical twin. We share no genetic heritage I know of, but friends can’t tell pictures of us apart. We even have the same hairstyle. The twin? Hassan Nasrullah, leader of the Hezbollah terrorist group.

Visiting airports and Federal buildings is already miserable; getting a false positive as a terrorist wouldn’t improve things any.
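
The look-alike problem is basically the birthday paradox. A rough sketch, with a deliberately generous hypothetical count of faces a matcher could ever tell apart:

```python
# Birthday-problem arithmetic behind "fewer faces than people": if a
# matcher can only distinguish F templates, collisions are near-certain
# at scale. F here is a hypothetical, generous guess.
import math

def p_collision(n_people, f_templates):
    """Approximate P(at least two people share a face template)."""
    return 1 - math.exp(-n_people * (n_people - 1) / (2 * f_templates))

# Even granting a billion distinguishable templates, a town of 100,000
# almost surely contains at least one colliding pair:
print(f"{p_collision(100_000, 1_000_000_000):.3f}")  # ~0.993
```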

thevoid October 15, 2015 7:46 AM

@TRX

“The problem with using facial recognition as an ID is that there are fewer individual faces than there are people.”

This is perhaps not surprising, given the expansion of the human population in the last 10,000 years. There are really no new genes, but there are MANY more people with those same genes. For instance, the population of Britain was approximately 3 million 500 years ago; now it’s 60 million. So there are 20x as many people with the same genes.

“I have an identical twin. We share no genetic heritage I know of, but friends can’t tell pictures of us apart. We even have the same hairstyle. The twin? Hassan Nasrullah, leader of the Hezbollah terrorist group.”

Ha! Not surprising, though. There are many European genes floating around the Middle East. Today when people think “Mediterranean” they think swarthy, but that was not the case 2,000 years ago; there was lots of red hair there then. But there is also the fact of the medieval slave (that is, Slav) trade; even some Irish captured by Vikings ended up there. Those facts being more recent, Clive summed up more ancient gene flow recently:

“But it’s even broader than you might think from that, since before 4000 BC the Middle East has been the hub of trading and thus a melting pot of ethnicity, so it includes just about every ethnicity outside of the indigenous North and South American continents and those of the Pacific and Australasian areas.”

I recently saw a picture of Mullah Omar (leader of the Taliban). It was black and white, but fairly detailed. He also looked like me! Not quite a twin (as in my anecdote above), but maybe a brother. Without beards maybe we would look more different, but with them… And most people would call me German.
