AIAAIC's "External Harm Taxonomy" describes harms (definition) as negative impacts on third parties caused by the use, misuse, or poor governance of AI and automated systems.
The taxonomy employs a user/victim-centric approach to categorising and defining harms. It is intended to help researchers, civil society organisations, journalists, students, users, and others identify and understand the harms caused by AI, algorithmic and automation systems and, where warranted, to resist the individuals and/or organisations responsible and hold them accountable.
The taxonomy is applied to the AIAAIC Repository (web, sheet) and aims to present harms to third parties in a clear, succinct, understandable and usable manner.
It is available for third parties to download, comment on, update, and re-use in line with AIAAIC's terms of use.
🚩 AIAAIC's External Harm Taxonomy is being updated. Researchers, civil society organisations and others interested in participating in its development should contact us.
Individual harms
The deliberate or negligent impact(s) of a system on individuals or small groups of people using it or exposed to its misuse, including:
Anxiety/distress. Anxiety or distress as a result of negative online experiences, social interactions, etc.
Autonomy/agency loss. Loss of an individual, group or organisation’s ability to make informed decisions or pursue goals.
Benefits/entitlements loss. Denial or loss of access to welfare benefits, pensions, housing, etc. due to the malfunction, use or misuse of a technology system.
Bodily injury. Physical pain, injury, illness, or disease suffered by an individual or group due to the malfunction, use or misuse of a technology system.
Cheating/plagiarism. Use of another person’s or group’s words or ideas without consent and/or acknowledgement.
Chilling effect. The creation of a climate of self-censorship that deters democratic actors such as journalists, advocates and judges from speaking out.
Confidentiality loss. Unauthorised sharing of sensitive, confidential information and documents, such as corporate strategy and financial plans, with third parties.
Creativity loss. Devaluation and/or deterioration of human creativity, artistic expression, imagination, critical thinking or problem-solving skills.
Deception/manipulation. The use of a technology system to deliberately manipulate, mislead or induce people into altering their thinking and behaviour.
Defamation. The use of a technology system to create, facilitate or amplify false perception(s) about an individual, group or organisation.
Dehumanisation/objectification. Use or misuse of a technology system to depict and/or treat people as not human, less than human, or as objects.
Desensitisation. The psychological process by which a person’s emotional, physiological, or behavioral response to a stimulus decreases after repeated exposure to it.
Dignity loss. Perceived loss of value experienced by or disrespect shown to an individual or group, resulting in self-sheltering, loss of connections and relationships, and public stigmatisation.
Discrimination. Unfair or inadequate treatment or arbitrary distinction based on a person's race, ethnicity, age, gender, sexual preference, religion, national origin, marital status, disability, language, or other protected characteristics.
Financial loss. The loss of money, income or value due to the use or misuse of a technology system.
Fraud. Intentional deception to gain an advantage or cause a loss to another, violating the victim's autonomy and right to informed consent.
Harassment. Online behaviour such as sexual harassment that makes an individual or group feel alarmed or threatened.
Health deterioration. Physical deterioration of an individual or animal over time, in the form of disease, organ failure, prolonged hospital stays, death, etc.
Identity theft. The unauthorised use of another person's personal or financial information, such as their name, Social Security number, credit card details, or address, to commit fraud or other crimes, often for financial gain.
Increased workload. A situation in which the volume, complexity, or pace of tasks and responsibilities assigned to an individual or team increases.
IP/copyright loss. The misuse of an individual's or organisation's intellectual property, including copyright, trademarks and patents.
Intimidation. The act of making someone fearful or timid through threats, coercion, or aggressive behavior, often to compel compliance or deter action.
Isolation. The state of being separated from others or the act of separating something from its surroundings.
Job loss/losses. Replacement/displacement of human jobs by a technology system or set of systems, leading to increased unemployment, inequality, reduced consumer spending and social friction.
Loss of rights/freedoms. The legal or practical deprivation of inherent human entitlements, such as liberty, privacy, voting, or property ownership due to the use or misuse of a technology system.
Loss of life. Accidental or deliberate loss of life, including suicide, extinction or cessation, due to the use or misuse of a technology system.
Marginalisation. The process of pushing individuals or groups to the edges of society, limiting their access to resources, power, and opportunities due to systemic discrimination or exclusion.
Misrepresentation. The act of giving false, incorrect, or misleading information about something or someone, often to deceive or influence another's decision.
Opportunity loss. Loss of the ability to take advantage of a financial or other opportunity, such as education, immigration, or employability/securing a job.
Personality rights loss. Loss of or restrictions to the rights of an individual to control the commercial use of their identity, such as name, image, likeness, or other unequivocal identifiers.
Privacy loss. Unwarranted exposure of an individual's private information or unwarranted processing of personal data.
Property damage. Action(s) that lead directly or indirectly to the damage or destruction of tangible property, e.g. buildings, possessions, vehicles, robots.
Radicalisation. The process through which an individual or group adopts extreme political, social, or religious beliefs and attitudes that challenge or reject the status quo, often leading to support for or engagement in actions that may be harmful, disruptive, or violent.
Reputational damage. The use or misuse of a technology system that leads directly or indirectly to the loss of confidence in or trust in a third party.
Service quality erosion. The interruption, suspension, degradation, or significant delay in the provision of services due to technical system failures.
Sexualisation. The non-consensual sexualisation of an individual or group using a technology or application.
Stereotyping. Derogatory or otherwise harmful stereotyping or homogenisation of individuals, groups, societies or cultures due to the mis-representation, over-representation, under-representation, or non-representation of specific identities, groups or perspectives.
Stigmatisation. The process of labeling individuals or groups with negative stereotypes, leading to their social disapproval, exclusion, or marginalisation.
Trauma. Severe and lasting emotional shock and pain caused by an extremely upsetting experience involving a technology system or application.
Trust loss. The erosion or complete loss of confidence in, belief in, or reliance on a person or organisation.
Societal harms
The impact of a system on an organisation, physical community, or society, including those detailed under Individual harms (above) as well as:
Critical infrastructure damage. Damage, disruption to or destruction of systems essential to the functioning and safety of a nation or state, including energy, transport, health, finance and communication systems.
Damage to national security. Harm to the national defence or foreign relations of a country resulting from the unauthorised disclosure of classified information.
Damage to public health. Adverse impacts on the health of groups, communities or societies, including malnutrition, disease and infection.
Historical revisionism. Reinterpretation of established/orthodox accounts of historical events held by societies, communities or academics, prompting controversy.
Increased utility costs. Higher costs for electricity, water and other utilities as a result of the use or misuse of a system, or of the hardware used to power a system.
Information ecosystem degradation. The systemic decline in the quality, reliability, and trustworthiness of the collective information environment, making it harder to distinguish truth from falsehood.
Institutional trust loss. The erosion of trust in public institutions and systems and weakened checks and balances due to mis/disinformation, influence operations, overdependence on technology, etc.
Loss of community wellbeing/cohesion. The erosion of social bonds, trust, and mutual support within a community, leading to diminished collective health, resilience, and shared identity.
Market distortion/monopolisation. Abuse of market power through the control of prices, thereby limiting competition and creating unfair barriers to entry.
Market value loss. The reduction in the amount an asset could be sold for on the open market compared with its previous or undamaged value.
Political instability. Political unrest caused directly or indirectly by the use or misuse of a technology system.
Operational disruption. An unforeseen event or condition that interrupts, impairs, or halts normal business, industrial, or organisational processes, leading to reduced performance, downtime, or failure to deliver services.
Productivity loss. A reduction in the output or performance of a worker, organisation, system, or economy relative to its potential or expected level.
Resource diversion. The diversion of public, private, or commercial money away from the people who need it.
Societal destabilisation. Societal instability in the form of strikes, demonstrations and other types of civil unrest caused by loss of jobs to technology, unfair algorithmic outcomes, disinformation, etc.
Environmental harms
The impact of a system on the environment, including:
Air pollution. The contamination of the air by harmful substances that change its natural composition and make it unsafe or unpleasant for living beings and the environment.
Ecological/biodiversity loss. Deforestation, habitat destruction and the fragmentation and loss of biodiversity due to the over-expansion of technology infrastructure, or inadequate alignment of technology with sustainable practices.
Excessive carbon emissions. The release of carbon dioxide, nitrous oxide and other greenhouse gases, exacerbating climate change and negatively impacting local communities.
Excessive energy shortages. Excessive energy use resulting in energy bottlenecks and shortages for communities, organisations and businesses.
Excessive water shortages. Excessive use of water to cool data centres and for other purposes, leading to water restrictions or shortages for local communities or businesses.
Ground pollution. The contamination of soil and subsurface by harmful substances that damage ecosystems and human health.
Noise pollution. An unwanted or unwarranted sound that can harm human and animal health and wellbeing.
Water pollution. The contamination of natural water bodies by substances or forms of energy that make the water unsafe for living organisms or human use.
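Because the taxonomy is intended to be downloaded and re-used, for example to tag entries pulled from the AIAAIC Repository sheet, it can help to encode it as a simple data structure. The sketch below is a minimal, unofficial illustration in Python: the category and harm labels mirror the definitions above (only a subset is shown), while the incident record format and its "harms" field are hypothetical assumptions, not part of any AIAAIC schema.

```python
# Minimal, illustrative encoding of part of the AIAAIC External Harm Taxonomy.
# Labels mirror the definitions above (subset only); the incident record
# format and its "harms" field are hypothetical, not an official AIAAIC schema.

EXTERNAL_HARM_TAXONOMY = {
    "Individual harms": {
        "Anxiety/distress",
        "Discrimination",
        "Financial loss",
        "Privacy loss",
        # ...remaining individual harms defined above
    },
    "Societal harms": {
        "Information ecosystem degradation",
        "Institutional trust loss",
        "Political instability",
        # ...remaining societal harms defined above
    },
    "Environmental harms": {
        "Air pollution",
        "Excessive carbon emissions",
        "Water pollution",
        # ...remaining environmental harms defined above
    },
}

# Flat set of all harm labels, useful for validating tags on incident records.
ALL_HARM_LABELS = {label for labels in EXTERNAL_HARM_TAXONOMY.values() for label in labels}


def unknown_harm_tags(record: dict) -> list[str]:
    """Return any harm tags on a record that are not defined in the taxonomy."""
    return [tag for tag in record.get("harms", []) if tag not in ALL_HARM_LABELS]


if __name__ == "__main__":
    incident = {"title": "Example incident", "harms": ["Privacy loss", "Panic"]}
    print(unknown_harm_tags(incident))  # -> ['Panic'] (not a taxonomy label)
```

Keeping the labels in a single structure like this makes it easier to keep locally tagged data consistent with the published taxonomy as it is updated.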