What Most Alarms Portland City Officials About Facial Recognition Software Are the Faces It Can’t Recognize

The city’s top reason for issuing the ban: The software is racist.

A racial justice rally on the Portland waterfront in July 2020. (Joseph Blake, Jr.)

WW presents "Distant Voices," a daily video interview for the era of social distancing. Our reporters are asking Portlanders what they're doing during quarantine.

In Portland, Big Brother can't recognize you. It's illegal.

On Sept. 9, Portland became the first city in the nation to ban corporate use of facial recognition software. The Portland City Council passed two measures: one stops city government from using the technology, and the other bars private companies from scanning faces in public places.

The city's top reason for issuing the ban: The software is racist.

As Hector Dominguez explains, the programs that analyze faces haven't been trained on enough examples of women and people of color to distinguish them reliably. In effect, the algorithm behaves like a white security guard who thinks Black people look alike. The racial bias was built into the software by the gaps in its training data.

Hector Dominguez. (City of Portland)

Dominguez, the city's open data coordinator, says that problem—which would lead to false identifications of people of color—alarmed city officials enough to trigger an outright ban.
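To make that failure mode concrete, here is a minimal, hypothetical sketch of the kind of audit researchers use to surface it: run a face matcher against a labeled benchmark and compare its error rates across demographic groups. Nothing below comes from any system the city evaluated; the records, group names, and helper function are invented for illustration.

```python
from collections import defaultdict

# Hypothetical audit records: (demographic_group, true_match, predicted_match).
# In a real audit these would come from running a face matcher over a labeled
# benchmark; the values here are made up purely to show the bookkeeping.
records = [
    ("group_a", True, True), ("group_a", True, True),
    ("group_a", False, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, True),
    ("group_b", False, True), ("group_b", False, False),
]

def error_rates_by_group(records):
    """Return false match rate and false non-match rate for each group."""
    counts = defaultdict(lambda: {"fm": 0, "fnm": 0, "pos": 0, "neg": 0})
    for group, truth, pred in records:
        c = counts[group]
        if truth:
            c["pos"] += 1
            if not pred:
                c["fnm"] += 1   # missed a true match (false non-match)
        else:
            c["neg"] += 1
            if pred:
                c["fm"] += 1    # matched two different people (false match)
    return {
        g: {
            "false_match_rate": c["fm"] / c["neg"] if c["neg"] else 0.0,
            "false_non_match_rate": c["fnm"] / c["pos"] if c["pos"] else 0.0,
        }
        for g, c in counts.items()
    }

if __name__ == "__main__":
    for group, rates in error_rates_by_group(records).items():
        print(group, rates)
```

When one group is underrepresented in the training data, an audit like this tends to show its false match rate climbing well above the others', which is exactly the disparity that worried city officials.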

In an interview with WW editor and publisher Mark Zusman, Dominguez discusses the nature of that concern, how the ban will be enforced, and why Amazon stopped by City Hall to argue for less regulation.
