To sum it up
Despite emerging technologies that make face detection and recognition ever more sophisticated, issues with security, privacy, and accuracy remain. Refining this technology and introducing it into new domains will require targeted interdisciplinary effort among developers, researchers, and policymakers.
In 2020, the European Commission considered a temporary ban (3-5 years) on facial recognition technologies in specific contexts, such as public places, to allow time to develop legislation preventing privacy and ethical violations.
Today, however, that restriction appears to have been set aside: on the 40th anniversary of Convention 108 (for the Protection of Individuals with regard to Automatic Processing of Personal Data), the Council of Europe published Guidelines setting out the principles, guarantees, and protections that signatory states must observe when drafting national legislation on this subject. The Council of Europe also drew a parallel between the content of the Guidelines and the principles of the GDPR (General Data Protection Regulation), which prohibits the use of such technologies where it is unjustified and excessive.
FaceID: Friend or foe during the Covid-19 pandemic
When the pandemic broke out, facial recognition technologies saw wider use; they are now being used to track COVID-positive people who must stay home. A mobile app with facial recognition features prompts a quarantining individual to take a selfie, then verifies their identity to confirm they are following the self-isolation rules. Despite rising privacy concerns, the South Korean government plans to take facial recognition-enabled tracking further by launching a pilot aimed at tracking infected people, pinpointing whom they have been in contact with, and determining whether they take precautions to prevent the virus from spreading.
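The selfie check described above typically reduces to comparing face embeddings: a vector computed from the enrolment photo against one computed from the new selfie. The sketch below illustrates the idea with cosine similarity; the function names, toy vectors, and threshold are all illustrative assumptions, not any vendor's actual API.

```python
# Hypothetical sketch of a quarantine app's selfie check: compare the
# embedding of a new selfie against the embedding stored at enrolment.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_selfie(enrolled_embedding, selfie_embedding, threshold=0.8):
    """Accept the selfie only if it is close enough to the enrolled face."""
    return cosine_similarity(enrolled_embedding, selfie_embedding) >= threshold

# Toy vectors standing in for the output of a real face-embedding model.
enrolled = [0.1, 0.9, 0.3]
same_person = [0.12, 0.88, 0.31]
print(verify_selfie(enrolled, same_person))  # similar vectors pass the check
```

In a real system the embeddings would come from a trained face-recognition model, and the acceptance threshold would be tuned against measured false-match and false-non-match rates.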
In addition, since the outbreak of COVID-19, facial recognition software providers have upgraded their algorithms with new features, such as the ability to recognise faces wearing masks.
However, when most of the face is covered (only the forehead and eyes are visible), the risk of error increases.
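One common way to cope with that higher error rate is to demand a stricter match score when a mask is detected, deferring borderline cases to human review rather than guessing. The decision function below is a minimal sketch under that assumption; the thresholds and labels are invented for illustration.

```python
# Illustrative policy (not a vendor API): with a mask hiding most of the
# face, the matcher sees fewer features, so require a stricter score
# before declaring a match, and defer borderline masked cases to review.
def decide_match(score, masked, threshold=0.70, masked_threshold=0.85):
    """Return 'match', 'no_match', or 'review' for a similarity score."""
    required = masked_threshold if masked else threshold
    if score >= required:
        return "match"
    # Borderline masked scores are the error-prone zone: defer, don't guess.
    if masked and score >= threshold:
        return "review"
    return "no_match"

print(decide_match(0.75, masked=False))  # clears the unmasked threshold
print(decide_match(0.75, masked=True))   # same score, deferred to review
```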
One of the most serious dangers is that facial recognition can be used to infer the personality traits, feelings, mental state, and work attitudes of people without their knowledge. These technologies, known as "emotion recognition", could be misused for both political and economic purposes.
It should be noted that some companies also use facial recognition to decide which job applicants to hire, to grant access to insurance services, or as a selection criterion in educational settings.
Emotion recognition technologies have recently been adopted in China as well, as a method of keeping the population under surveillance. Having made facial recognition mandatory for anyone with a smartphone, Chinese authorities have also introduced emotion recognition, with the aim of monitoring and controlling people's moods as well.
The next disadvantage of face recognition is linked to a topic close to the Black community's heart: techno-racism.
“It’s not just the physical streets. Black folks now have to fight the civil rights fight on the virtual streets, in those algorithmic streets, in those internet streets,” says W. Kamau Bell, host of the CNN original series “United Shades of America”.
Experts warn that government and private sector digital tools might unwittingly discriminate against people of color, making techno-racism a new and vital aspect of the civil rights battle.
This is demonstrated by the fact that Black Americans are more likely than white Americans to be arrested and imprisoned for minor crimes. As a result, they are overrepresented in the mug shot data that face recognition software uses to identify suspects accused of crimes. This is anything but good news.
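The overrepresentation problem can be made concrete by comparing each group's share of a dataset against its share of the general population. The sketch below does exactly that; the group names and counts are made up for the example, not real statistics.

```python
# Hypothetical illustration of dataset skew: a ratio above 1 means a group
# appears in the dataset more often than its population share predicts.
def representation_ratio(dataset_counts, population_shares):
    """Map each group to (dataset share) / (population share)."""
    total = sum(dataset_counts.values())
    return {
        group: (count / total) / population_shares[group]
        for group, count in dataset_counts.items()
    }

ratios = representation_ratio(
    dataset_counts={"group_a": 400, "group_b": 600},  # invented numbers
    population_shares={"group_a": 0.6, "group_b": 0.4},
)
print(ratios)  # group_b appears at roughly 1.5x its population share
```

A model trained on such a skewed dataset will see one group far more often than demographics warrant, which is one mechanism behind the disparate error rates discussed here.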
Furthermore, because the face recognition tools used by law enforcement in the United States are less accurate at identifying Black faces, particularly in low light, their use has a racially disparate impact. Critics have compared this to the colonial-era "lantern laws" that required Black people to carry lanterns at night in order to be identifiable.