Scientist Says Unsettling Things Can Be Learnt About You from Your Face Alone
17/8/2024
According to Business Insider, Michal Kosinski, a psychologist at Stanford University, has developed artificial intelligence (AI) that, by simply analysing a person’s face, can accurately determine their IQ, sexual preferences, and political inclinations.
Kosinski’s work raises serious ethical questions. Is this kind of facial recognition research simply a high-tech version of phrenology, the pseudoscience that gained popularity in the 18th and 19th centuries by looking for connections between a person’s mental characteristics and their facial features?
Without a doubt, Kosinski told Business Insider. If anything, he says, his facial recognition work serves as a warning to legislators about the potential risks posed by his and other researchers’ similar studies.
In a paper published in 2021, for instance, Kosinski developed a facial recognition algorithm that could identify a person’s political views with 72 per cent accuracy simply by scanning a photograph of their face, compared with a human accuracy rate of 55 per cent.
“Given the widespread use of facial recognition, our findings have critical implications for the protection of privacy and civil liberties,” the researcher stated in the paper.
Kosinski maintains that his study should be read as a warning, but his creations can feel like a Pandora’s box. His research has many troubling potential applications, and merely publicising them could inspire the creation of new discriminatory tools. On top of that, the models’ imperfect accuracy means individuals could be wrongly targeted.
In 2017, for example, Kosinski co-published a paper on a facial recognition model that could predict sexual orientation with 91 per cent accuracy; the Human Rights Campaign and GLAAD called the research “dangerous and flawed” because it could be used to discriminate against queer people.
Combined with heated cultural conflicts, such as the misgendering of Olympic athletes this summer, that kind of technology could be disastrous.
There are already many real-world instances of facial recognition being used to violate people’s rights and upend their lives. Rite Aid’s systems falsely accused minorities of shoplifting, for example, and Macy’s falsely accused a man of a violent robbery he did not commit.
So while Kosinski’s research may be published as a warning, it can also feel like handing detailed instructions to would-be home invaders.