Europe makes the case to ban biometric surveillance

Your body is a data gold mine. From how you look to how you think and feel, companies in the booming biometrics industry are developing new and unsettling ways to track everything you do. And, in many cases, you may not even know you are being tracked.

But the biometrics business is on a collision course with Europe’s leading data protection regulators. The European Data Protection Supervisor (EDPS), the EU’s independent data authority, and the European Data Protection Board (EDPB), which helps member states apply the GDPR consistently, have called for a total ban on the use of artificial intelligence to automatically identify people.

“The deployment of remote biometric identification in publicly accessible spaces means the end of anonymity in those places,” Andrea Jelinek and Wojciech Wiewiórowski, the heads of the two bodies, wrote in a joint statement at the end of June. They argued that AI should not be used in public spaces for facial recognition or for recognition based on gait, fingerprints, DNA, voice, keystrokes, and other biometrics, and that attempts to use AI to predict people’s ethnicity, gender, political views, or sexual orientation should also be banned.

But such calls run counter to the EU’s proposed artificial intelligence regulations. Those rules, announced in April, classify “remote biometric identification” as high risk, meaning it would be permitted but subject to stricter controls than other uses of AI. EU politicians will spend years debating the rules, and biometric surveillance has become one of the most contentious issues. Once the regulations are passed, they will shape how hundreds of millions of people are monitored in the decades to come. And the debate starts now.

Facial recognition has been controversial for years, but the real biometrics boom is targeting other parts of your body. Across the EU’s 27 member states, companies have been developing and deploying biometric technologies that, in some cases, aim to predict people’s gender and ethnicity and to recognize their emotions. In many cases, the technology is already being used in the real world. Yet using AI to make these classifications is scientifically and ethically dubious: such technologies risk invading people’s privacy or automatically discriminating against them.

Take Herta Security and VisionLabs. Both companies have developed facial recognition technology for a range of uses and say it can be deployed by law enforcement and by the retail and transportation industries. Documents from Barcelona-based Herta Security claim its customers include police forces in Germany, Spain, Uruguay, and Colombia, as well as airports, casinos, stadiums, shopping centers, and hotel chains such as Marriott and Holiday Inn.

Critics point out that both Herta Security and VisionLabs claim parts of their systems can be used to track sensitive attributes. “Many systems, even those used to identify people, rely on these potentially very harmful classifications and categorizations as their underlying logic,” says Ella Jakubowska, a policy adviser at the advocacy group European Digital Rights (EDRi) who focuses on biometrics. The organization is campaigning to ban biometric surveillance across Europe.

Herta Security’s facial analysis tool, BioMarketing, is marketed as a way for stores and advertisers to understand their customers. The company says it can “extract” everything from a person’s age and gender to whether they wear glasses, and can even track their facial expressions. Herta Security says the technology is “ideal” for developing targeted advertising or helping companies understand their customers, and it claims the tool can also classify people by “ethnicity.” Under the GDPR, personal data revealing “racial or ethnic origin” is considered sensitive, and its use is tightly restricted.

Jakubowska says she questioned Herta Security’s CEO about its use of race last year and that the company has since removed the claim from its marketing materials. It is not clear whether the feature has been removed from the tool itself. Company documents hosted by third parties still list race as one of the characteristics that can be identified using BioMarketing, and documents dating back to 2013 referred to the tool detecting “races” before later materials updated the term to ethnicity. Herta Security, which has received more than 500,000 euros in EU funding and holds an EU Seal of Excellence, did not respond to a request for comment.
