Faception facial personality profiling
Founded in 2014, Faception is a Tel Aviv-based company that uses computer vision and machine learning to analyse facial images and make real-time predictions about an individual's personality, behaviour, and character traits.
The technology has been touted as a revolutionary tool for applications including security, marketing, and human resources.
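Faception has not disclosed how its system works (see Transparency and accountability below), so any concrete description is speculative. The sketch below is a minimal, hypothetical Python example of the general class of system at issue: a small convolutional network that maps a face image to per-trait confidence scores. The model, trait labels, and input are invented for illustration and do not represent Faception's actual classifiers.

```python
# Hypothetical sketch only: Faception's real pipeline is proprietary and undisclosed.
# Model architecture, trait labels, and input are invented for illustration.
import torch
import torch.nn as nn

TRAITS = ["extrovert", "high_iq", "professional_gambler"]  # invented labels

class TraitClassifier(nn.Module):
    """Toy CNN mapping a normalised face crop to per-trait confidence scores."""
    def __init__(self, num_traits: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dims to 1x1
        )
        self.head = nn.Linear(32, num_traits)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.head(h))  # independent per-trait confidences

model = TraitClassifier(len(TRAITS))
face = torch.rand(1, 3, 128, 128)  # stand-in for a detected, aligned face crop
scores = model(face)
for trait, score in zip(TRAITS, scores[0].tolist()):
    print(f"{trait}: {score:.2f}")
```

Whatever the architecture, such a model can only learn correlations present in its training data; the criticisms below concern whether facial features carry any valid signal about personality or behaviour at all.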
System 🤖
System info 🔢
Operator: Faception
Developer: Faception
Country: Israel
Sector: Business/professional services; Banking/financial services; Govt - police
Purpose: Identify personality type; Predict behaviour
Technology: Computer vision; Behavioural analysis; Emotion recognition; Facial recognition; Personality analysis; Machine learning
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity, gender; Ethics
Transparency: Governance; Black box
Risks and harms 🛑
Faception has been criticised over potential inaccuracy, bias, and ethical concerns. It claims to predict complex personality traits and behaviours, including identifying terrorists and paedophiles, claims that invite false positives and unjust profiling.
The company's technology is grounded in Western cultural and social norms that may not apply in other cultural contexts, and the company has not demonstrated an understanding of these cultural and social implications, raising the risk that its results will be misinterpreted or misapplied.
Faception's claims about the accuracy of its technology rest on largely unproven theories linking facial features to personality traits. The scientific community has questioned the validity of these theories, and there is little research to support the idea that facial features can be used to accurately predict personality or behaviour.
Transparency and accountability 🙈
Faception suffers from several significant transparency and accountability issues:
Algorithmic decision-making processes. Faception's algorithm is proprietary, and the company has not disclosed the methods it uses to analyse facial features and make predictions. This opacity makes it difficult to understand how the technology reaches its conclusions, or to detect biased or discriminatory outcomes.
Errors and misclassifications. Faception's technology is liable to errors and misclassifications. However, the company has not established clear procedures for addressing these errors or for providing recourse to individuals who are misclassified.
Protection of sensitive information. Faception collects and analyses sensitive biometric data, including facial images. However, the company's data protection policies and procedures are not transparent, and it is unclear how this sensitive information is stored, shared, or protected.
Regulation and oversight. Faception operates in a regulatory grey area, and there is limited oversight of its activities. This lack of regulation can lead to unchecked use of the technology, potentially resulting in harm to individuals or groups.
Deployment. Faception has not been transparent about how its technology is used in contexts such as law enforcement, border control, and employment screening, making it difficult to assess the technology's risks and benefits.
Incidents and issues 🔥
Research, advocacy 👩🏼‍⚖️
Page info
Type: System
Published: March 2023
Last updated: August 2024