When you go to your local DMV office and stand in front of the camera to take as decent a photo as possible, you probably aren't thinking about the possibility that said photo will come up in a facial-recognition search used by government agencies. If this is a concern you never had before, you should definitely have it now.
The Washington Post was provided "thousands of facial-recognition requests, internal documents and emails over the past five years, obtained through public-records requests" from researchers at Georgetown Law's Center on Privacy and Technology, and what they reveal is stunning.
Agents from the FBI as well as Immigration and Customs Enforcement have used state driver's license databases to scan through "millions of Americans' photos without their knowledge or consent."
From the Post:
Police have long had access to fingerprints, DNA and other "biometric data" taken from criminal suspects. But the DMV records contain the photos of a vast majority of a state's residents, most of whom have never been charged with a crime.
And that is the crux of the problem. This particular use of driver's license photos for facial-recognition searches, which can lead to dangerous and damaging errors, is being done without the consent of the people. Members of Congress have spoken out about this, including the two highest-ranking members of the House Oversight Committee.
The Post reports that Rep. Jim Jordan (R-Ohio) had concerns that this is happening without any type of consent or buy-in from state lawmakers or the individuals whose license pictures are being used.
During a hearing last month, Jordan said, "They've just given access to that to the FBI. No individual signed off on that when they renewed their driver's license, got their driver's licenses. They didn't sign any waiver saying, 'Oh, it's okay to turn my information, my photo, over to the FBI.' No elected officials voted for that to happen."
Rep. Elijah Cummings (D-Md.), chairman of the House Oversight Committee, expressed similar concerns about consent.
In a statement to the Post, Cummings said "law enforcement's access of state databases," particularly DMV databases, is "often done in the shadows with no consent."
But it really is about more than consent.
As reported in a July 8 New York Times article, the city of Detroit started an initiative called Project Green Light in 2016 with the aim of curbing crime in the city. Thousands of cameras monitor "gas stations, restaurants, mini-marts, apartment buildings, churches and schools" 24 hours a day and stream those images directly to police department headquarters. Detroit Mayor Mike Duggan has promised to expand the network "to include several hundred traffic light cameras," which he says would allow the police to "track any shooter or carjacker across the city."
Detroit has received pushback from the public recently because Project Green Light includes a software tool that can suggest the identity of the people captured on its cameras.
From the Times:
The facial recognition program matches the faces picked up across the city against 50 million driver's license photographs and mug shots contained in a Michigan police database. The practice has attracted public attention recently as the department seeks approval for a formal policy governing its use from a civilian oversight board.
"Please, facial recognition software, that's too far," pleaded one resident at a recent meeting of the board.
The problem?
Facial-recognition software tends to be less accurate at identifying people of color.
"Facial recognition software proves to be less accurate at identifying people with darker pigmentation," George Byers II, a black software engineer, told Detroit's police board last month. "We live in a major black city. That's a problem."
From the Times:
Researchers at the Massachusetts Institute of Technology reported in January that facial recognition software marketed by Amazon misidentified darker-skinned women as men 31 percent of the time. Others have shown that algorithms used in facial recognition return false matches at a higher rate for African-Americans than white people unless explicitly recalibrated for a black population, in which case their failure rate at finding positive matches for white people climbs. That study, posted in May by computer scientists at the Florida Institute of Technology and the University of Notre Dame, suggests that a single algorithm cannot be applied to both groups with equal accuracy.
False identifications can lead to troubling results, including innocent people being misidentified and arrested, and that can create a domino effect of additional problems.
According to the Post, the Government Accountability Office said last month that the FBI has done more than 390,000 facial recognition searches since 2011, including searches through federal, local and DMV databases.
It's all a creepy sign that we are moving closer and closer to the Big Brother brand of government surveillance that Orwell warned us about. That weird future we were worried about is already here, and it's not going anywhere any time soon.
Jake Laperruque, a senior counsel at the watchdog group Project on Government Oversight, told the Post: "It's really a surveillance-first, ask-permission-later system. People think this is something coming way off in the future, but these [facial-recognition] searches are happening very frequently today. The FBI alone does 4,000 searches every month, and a lot of them go through state DMVs."
From the Post:
The FBI's facial-recognition search has access to local, state and federal databases containing more than 641 million face photos, a GAO director said last month. But the agency provides little information about when the searches are used, who is targeted and how often searches return false matches.
The FBI said its system is 86 percent accurate at finding the right person if a search is able to generate a list of 50 possible matches, according to the GAO. But the FBI has not tested its system's accuracy under conditions that are closer to normal, such as when a facial search returns only a few possible matches.
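To see why searches at this scale produce false leads even when an algorithm is individually quite accurate, consider a rough back-of-envelope calculation. The gallery size below comes from the GAO figure cited above; the per-comparison false-match rate is a purely hypothetical number chosen for illustration, not a statistic from any report:

```python
# Illustrative sketch only: the false-match rate here is hypothetical.
gallery_size = 641_000_000   # face photos the FBI can search, per the GAO
false_match_rate = 1e-7      # assumed: 1 false match per 10 million comparisons

# Each search compares one probe photo against every photo in the gallery,
# so even a tiny per-comparison error rate multiplies into many false hits.
expected_false_matches = gallery_size * false_match_rate
print(f"Expected false candidates per search: {expected_false_matches:.0f}")
# ~64 innocent people could surface as "matches" in a single search
```

The point is not the specific numbers but the arithmetic: multiplying a very small error rate by a very large database still yields a meaningful number of innocent candidates per search.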
As private citizens, we don't have a way to control the information the government keeps on us.
But now they have a problematic way of tracking our every move and tying us to events and situations they think we are involved in.
Even if it might not necessarily be "us."