Human rights organizations are expressing support for the U.S. government’s plans to introduce new export controls on facial recognition systems and recommending additional controls on remote biometric identification. The recommendations were submitted this week by ten groups, including Freedom House, Access Now, Amnesty International and Human Rights Watch.
In July, the U.S. Department of Commerce’s Bureau of Industry and Security (BIS) proposed amendments introducing two new item controls for facial recognition to the Commerce Control List (CCL). The new rules would create controls for facial recognition systems that are designed for mass surveillance and crowd scanning, which can give foreign governments the ability to monitor, track or detain people unlawfully.
In their comments to the government agency, the human rights groups note that the proposed rules are “significant for their ability to better protect human rights around the world.” The groups argue, however, that the U.S. government should go beyond these controls and restrict technologies that track individuals by their eyes, gait, voice, personal appearance, or any other biometric identifier that can be used for mass surveillance.
Another recommendation submitted by the groups is ensuring that certain countries do not fall through the cracks. Kyrgyzstan, for instance, is not among the countries identified on the designated list for export controls, even though its government has recently signed a data-sharing agreement with the Russian government. The agreement could help Moscow identify Russian conscripts who escaped military service or anti-war activists.
Amnesty: Camera surveillance is having a chilling effect on protesters
The Netherlands’ use of digital surveillance during demonstrations is having a chilling effect on protesters, while the use of the technology lacks transparency, according to a new report released by human rights organization Amnesty International.
The Dutch police have been using drones, video cars and bodycams to surveil protests. At the same time, however, little is known about what is happening to the images that are being stored in police databases, says Dagmar Oudshoorn, the organization’s director for the Netherlands.
“Camera surveillance is being deployed because protests are being perceived as a security risk rather than a fundamental right and a vital part of a healthy society,” says Oudshoorn. “All facial recognition technology for identification purposes should be banned and clear rules for police surveillance at protests must be established.”
The group also notes that Dutch regulations lack detail on what police officers may or may not do when surveilling protests.
In September, the Dutch data watchdog slapped a fine on U.S.-based facial recognition provider Clearview AI for illegally collecting and processing the data of Dutch people. The Dutch Data Protection Authority (DPA) chair Aleid Wolfsen, however, expressed support for police use of facial recognition as they “have to manage the software and database themselves in that case, subject to strict conditions and under the watchful eye of the Dutch DPA and other supervisory authorities.”