Glossary
Algorithm Review
The process of systematically examining the design, logic, and performance of an algorithm to identify and correct potential biases or unfair outcomes.
Example:
Before deploying a new credit scoring system, a team conducts an algorithm review to ensure it doesn't unfairly penalize applicants from certain zip codes, even if those zip codes historically have lower credit scores.
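As a minimal sketch of what one review step might look like, the hypothetical Python snippet below compares the model's approval rate in each zip code against the overall rate; the decisions, zip codes, and 20-point review trigger are all invented for illustration.

```python
# A minimal sketch of one algorithm-review check: comparing approval
# rates across zip codes. All data and thresholds are hypothetical.
from collections import defaultdict

# (zip code, model decision) pairs collected while testing the model
decisions = [
    ("10001", True), ("10001", True), ("10001", False),
    ("60629", False), ("60629", False), ("60629", False),
]

approved = defaultdict(int)
total = defaultdict(int)
for zip_code, ok in decisions:
    total[zip_code] += 1
    approved[zip_code] += ok  # True counts as 1, False as 0

overall = sum(approved.values()) / sum(total.values())
for zip_code in total:
    rate = approved[zip_code] / total[zip_code]
    # Flag zip codes approved far less often than average; the 20-point
    # gap is an arbitrary review trigger, not a standard.
    if rate < overall - 0.20:
        print(f"Review: zip {zip_code} approved {rate:.0%} vs {overall:.0%} overall")
```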
Bias (in computing)
A tendency or inclination, often unfair or prejudicial, that can be reflected in computing innovations due to biased data or design choices.
Example:
An image search engine that returns only pictures of men for the gender-neutral query 'engineers' demonstrates a potential bias in its underlying data or algorithm.
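A toy sketch of the mechanism: the hypothetical image index below over-represents men, so the top results skew male even though nothing in the query or the retrieval logic mentions gender.

```python
# A toy image index in which most photos tagged 'engineer' depict men.
# File names and data are hypothetical.
image_index = {
    "engineer": ["man_01.jpg", "man_02.jpg", "man_03.jpg",
                 "man_04.jpg", "woman_01.jpg"],
}

def search(query, k=3):
    """Return the top-k indexed images for a query, in stored order."""
    return image_index.get(query, [])[:k]

# The top results are all male -- not because of an explicit rule, but
# because the underlying collection over-represents men.
print(search("engineer"))  # ['man_01.jpg', 'man_02.jpg', 'man_03.jpg']
```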
Criminal Risk Assessment Tools
Software applications used in the justice system to predict the likelihood of a defendant re-offending, often based on historical crime data.
Example:
Because of historical biases in the training data, a criminal risk assessment tool might show a judge a higher predicted re-offense rate for a defendant from a certain neighborhood, even when the defendant's individual history doesn't warrant it.
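The sketch below illustrates that mechanism with a toy weighted risk score; the feature names and weights are invented, not drawn from any real tool.

```python
# A toy weighted risk score; the features and weights are hypothetical,
# chosen to show how a neighborhood feature can dominate the output.
weights = {"prior_offenses": 0.3, "high_arrest_neighborhood": 0.5}

def risk_score(features):
    """Weighted sum of the defendant's features; higher reads as 'riskier'."""
    return sum(weights[name] * value for name, value in features.items())

# Two defendants with identical records but different neighborhoods
defendant_a = {"prior_offenses": 1, "high_arrest_neighborhood": 0}
defendant_b = {"prior_offenses": 1, "high_arrest_neighborhood": 1}

print(f"defendant_a: {risk_score(defendant_a):.1f}")  # 0.3
print(f"defendant_b: {risk_score(defendant_b):.1f}")  # 0.8, from neighborhood alone
```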
Diverse Data Sets
Collections of information, used to train computing models, that accurately reflect the variety of characteristics, demographics, and experiences present in the real world.
Example:
To ensure a new language translation app works well for everyone, developers should use diverse data sets that include speech patterns and accents from many different regions and age groups.
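One way developers might audit this, sketched below with invented data: count how many training samples come from each region and flag groups that fall below an arbitrary representation floor.

```python
# A minimal sketch of a diversity audit on training data; the region
# labels and the 10% floor are hypothetical, for illustration only.
from collections import Counter

# Region label attached to each recorded speech sample
training_samples = ["north"] * 12 + ["south"] * 2 + ["east"] * 1

counts = Counter(training_samples)
total = len(training_samples)

for region, n in counts.items():
    share = n / total
    # Flag any region supplying under 10% of the data.
    marker = "  <- underrepresented" if share < 0.10 else ""
    print(f"{region}: {n}/{total} ({share:.0%}){marker}")
```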
Facial Recognition Systems
Technology that identifies or verifies a person from a digital image or a video frame by analyzing patterns based on the person's facial features.
Example:
A facial recognition system that fails to accurately identify people with darker skin tones, because its training data consisted primarily of images of people with lighter skin tones, is a clear example of algorithmic bias.
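A hedged sketch of how such a gap could be measured, using invented test results rather than any real system's output:

```python
# Hypothetical (group, correctly identified?) results from a test set;
# not output from any real facial recognition system.
results = [
    ("lighter", True), ("lighter", True), ("lighter", True), ("lighter", True),
    ("darker", True), ("darker", False), ("darker", False), ("darker", False),
]

for group in ("lighter", "darker"):
    outcomes = [ok for g, ok in results if g == group]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"{group}: {accuracy:.0%} accurate")
# A gap this large (100% vs 25%) points back to unbalanced training data.
```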
Fairness Metrics
Quantitative measures used to evaluate whether an algorithm's outcomes are equitable across different demographic groups, helping to ensure non-discriminatory results.
Example:
Developers might use fairness metrics to check if their loan approval algorithm grants loans at a similar rate to qualified applicants regardless of their gender or ethnicity.
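The snippet below sketches one widely used fairness metric, the demographic parity difference: the gap between two groups' approval rates, where zero means parity. The applicant data is hypothetical.

```python
# Demographic parity difference on hypothetical loan decisions.
# Each record is a (group label, loan approved?) pair.
applicants = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

def approval_rate(records, group):
    """Fraction of applicants in `group` whose loans were approved."""
    outcomes = [approved for g, approved in records if g == group]
    return sum(outcomes) / len(outcomes)

rate_a = approval_rate(applicants, "group_a")
rate_b = approval_rate(applicants, "group_b")

# 0.0 means both groups are approved at the same rate; larger gaps
# suggest potentially inequitable outcomes worth investigating.
gap = abs(rate_a - rate_b)
print(f"group_a: {rate_a:.0%}, group_b: {rate_b:.0%}, gap: {gap:.0%}")
```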
Human Bias
Preconceived notions, tendencies, or inclinations held by individuals that can unintentionally influence the design, development, or interpretation of computing systems.
Example:
A programmer, due to their own human bias, might unconsciously design a user interface that is less intuitive for older adults because they primarily test it with younger users.
Recruiting Algorithms
Automated tools that analyze job applications and candidate profiles to identify suitable candidates, often based on patterns from past successful hires.
Example:
An AI-powered recruiting algorithm might inadvertently filter out resumes containing words associated with traditionally female roles, even if the skills are transferable, because its training data favored male candidates.
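A toy sketch of how this filtering can happen: the hypothetical keyword list below stands in for patterns a model might learn from past hires, and a candidate with transferable skills scores zero simply for using different words.

```python
# A toy keyword filter; the favored keywords stand in for patterns a
# model might learn from past, mostly male, hires. All data hypothetical.
favored_keywords = {"executed", "captained", "competitive"}

resumes = {
    "candidate_1": "captained the robotics team and executed the launch plan",
    "candidate_2": "coordinated volunteers and managed the community outreach program",
}

for name, text in resumes.items():
    score = len(set(text.lower().split()) & favored_keywords)
    # Both resumes show leadership, but only one matches the learned
    # keywords, so the other is screened out despite transferable skills.
    print(f"{name}: score {score} -> {'advance' if score else 'filtered out'}")
```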
Tech Diversity
The presence of a wide range of demographic characteristics, experiences, and perspectives within the technology workforce.
Example:
A company actively promoting tech diversity by hiring engineers from various backgrounds and cultures is more likely to build products that are inclusive and less prone to bias.