Glossary
Adding or Subtracting a Constant
A linear transformation where a constant value is added to or subtracted from every value of a random variable, shifting its center but not its spread or shape.
Example:
If a teacher adds 5 bonus points to everyone's test score (X), the new score Y = X + 5 is an example of adding a constant.
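A minimal sketch (hypothetical scores, using NumPy) showing the effect: the mean shifts up by the constant while the standard deviation is unchanged.

```python
import numpy as np

scores = np.array([72, 85, 90, 65, 78])  # hypothetical test scores X
curved = scores + 5                      # Y = X + 5 (5 bonus points each)

print(np.mean(scores), np.mean(curved))  # 78.0 vs 83.0: center shifts by 5
print(np.std(scores), np.std(curved))    # identical: spread is unchanged
```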
Center
A measure of the typical or central value of a distribution, often represented by the mean or median.
Example:
The mean number of hours students study per week is a measure of the center of the study time distribution.
Combining Random Variables
The process of creating a new random variable by adding or subtracting two or more existing random variables.
Example:
If X is the time to complete task A and Y is the time to complete task B, then X + Y represents combining random variables to find the total time.
Expected Value of the Sum/Difference of Two Random Variables
The rule stating that the mean of a sum or difference of random variables is simply the sum or difference of their individual means; this holds whether or not the variables are independent.
Example:
If the average time to bake a cake is 45 minutes and the average time to frost it is 15 minutes, the expected value of the sum (total time) is 45 + 15 = 60 minutes.
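A quick check of the rule in code (the 45- and 15-minute figures are from the example above):

```python
mean_bake = 45   # E(X): average minutes to bake the cake
mean_frost = 15  # E(Y): average minutes to frost it

# E(X + Y) = E(X) + E(Y); no independence assumption is needed for means
print(mean_bake + mean_frost)  # 60
```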
Independence (of random variables)
A condition where the outcome of one random variable does not affect the outcome of another random variable.
Example:
The number of heads you get on one coin flip is independent of the number of heads you get on a second coin flip.
Linear Transformations of a Random Variable
Operations that change a random variable by adding/subtracting a constant or multiplying/dividing by a constant, affecting its center and/or spread.
Example:
If the temperature in Celsius (C) is a random variable, converting it to Fahrenheit (F = 1.8C + 32) is a linear transformation.
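A short sketch (hypothetical Celsius readings) showing how F = 1.8C + 32 acts on summary statistics: the whole formula transforms the mean, but only the multiplier 1.8 affects the spread.

```python
import numpy as np

celsius = np.array([18.0, 21.5, 25.0, 19.5, 23.0])  # hypothetical readings
fahrenheit = 1.8 * celsius + 32

print(np.mean(fahrenheit), 1.8 * np.mean(celsius) + 32)  # equal: mean_F = 1.8*mean_C + 32
print(np.std(fahrenheit), 1.8 * np.std(celsius))         # equal: sd_F = 1.8*sd_C (the +32 drops out)
```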
Location
Where a distribution is positioned on the number line, determined by measures of center such as the mean or median.
Example:
If all test scores are shifted up by 10 points, the location of the entire distribution of scores moves to the right.
Mean (Expected Value)
The average value of a random variable over many trials, calculated as the sum of each possible value multiplied by its probability.
Example:
The mean number of defective items produced per hour in a factory might be 3.5, the long-run average over many hours of production.
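A minimal sketch of the definition E(X) = Σ x·p(x), using a made-up probability distribution for defective items per hour:

```python
# Hypothetical distribution: number of defective items per hour
values = [2, 3, 4, 5]
probs  = [0.25, 0.30, 0.25, 0.20]  # must sum to 1

# E(X) = sum of each value times its probability
expected = sum(x * p for x, p in zip(values, probs))
print(expected)  # 3.4 for this made-up distribution
```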
Multiplying or Dividing by a Constant
A linear transformation where every value of a random variable is multiplied or divided by a constant, affecting both its center and spread but not its shape.
Example:
Converting money between currencies, where a foreign currency amount (X) is multiplied by an exchange rate (e.g., Y = 0.85X), is an example of multiplying by a constant.
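A sketch (hypothetical dollar amounts) showing that both the mean and the standard deviation are scaled by the same constant:

```python
import numpy as np

dollars = np.array([100.0, 250.0, 75.0, 180.0])  # hypothetical amounts X
euros = 0.85 * dollars                           # Y = 0.85X

print(np.mean(euros), 0.85 * np.mean(dollars))  # equal: center scales by 0.85
print(np.std(euros), 0.85 * np.std(dollars))    # equal: spread scales by 0.85
```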
Shape
Describes the overall form of a distribution, such as symmetric, skewed, or bimodal.
Example:
A distribution of human weights might be slightly right-skewed; "right-skewed" is a description of its shape.
Spread
A measure of the variability or dispersion of data points in a distribution, often represented by standard deviation or IQR.
Example:
A large spread in student heights means there's a wide range from the shortest to the tallest student.
Standard Deviation
A measure of the typical distance of values in a distribution from the mean, indicating the variability or spread.
Example:
If the standard deviation of daily temperatures is 2 degrees, it means temperatures typically vary by about 2 degrees from the average.
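The definition can be computed directly from data; a sketch with hypothetical daily temperatures:

```python
import numpy as np

temps = np.array([20.0, 22.0, 18.0, 21.0, 19.0])  # hypothetical daily temperatures

# Root of the average squared distance from the mean
sd = np.sqrt(np.mean((temps - temps.mean()) ** 2))
print(sd, np.std(temps))  # same value; np.std uses this (population) formula by default
```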
Standard Deviation of the Sum/Difference of Two Random Variables
The rule for calculating the standard deviation of a sum or difference of independent random variables: add their variances (variances add even for a difference) and take the square root.
Example:
If the standard deviation of commute time by car is 5 minutes and by train is 3 minutes, and the two times are independent, the standard deviation of the difference in commute times is √(5² + 3²) = √34 ≈ 5.83 minutes.
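A sketch of the rule with a simulation check (the means of 30 and 40 minutes are made up; only the standard deviations come from the example):

```python
import numpy as np

sd_car, sd_train = 5, 3

# Variances add for both sums and differences of independent variables
print(np.sqrt(sd_car**2 + sd_train**2))  # sqrt(34) ≈ 5.83

# Simulation check: independent normal commute times (hypothetical means)
rng = np.random.default_rng(0)
car = rng.normal(30, sd_car, 100_000)
train = rng.normal(40, sd_train, 100_000)
print(np.std(car - train))  # ≈ 5.83, matching the rule
```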
Variance
The square of the standard deviation, representing the average squared distance of each data point from the mean.
Example:
If the standard deviation of a stock's daily price change is 4, its variance is 4² = 16, representing the squared variability.
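A one-line verification with hypothetical daily price changes:

```python
import numpy as np

changes = np.array([1.0, -3.0, 5.0, -2.0, 4.0])  # hypothetical daily price changes
print(np.var(changes), np.std(changes) ** 2)     # equal: variance = (standard deviation)²
```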