AP Pre Calculus

Glossary

C

Contextual Scenarios

Criticality: 2

Real-world situations or problems where mathematical models, like matrices, are used to represent and analyze relationships or changes over time.

Example:

Modeling the flow of traffic between different intersections in a city to optimize signal timing is a contextual scenario for matrices.

F

Future Probabilities

Criticality: 3

The likelihood of a system being in a particular state after a specified number of time intervals, calculated by raising the transition matrix to that power and applying the result to the current state vector.

Example:

Calculating the future probabilities of a stock being 'up' or 'down' after three trading days involves raising the daily transition matrix to the third power.
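
A minimal sketch of this calculation in Python, assuming made-up 'up'/'down' probabilities for the daily transition matrix:

```python
import numpy as np

# Hypothetical column-stochastic transition matrix: column j holds the
# probabilities of moving from state j ('up' or 'down') to each state.
T = np.array([[0.7, 0.4],   # P(up tomorrow | up today),   P(up | down)
              [0.3, 0.6]])  # P(down tomorrow | up today), P(down | down)

# Current state vector: the stock is 'up' today with certainty.
s0 = np.array([1.0, 0.0])

# Probabilities after three trading days: T^3 applied to the current state.
s3 = np.linalg.matrix_power(T, 3) @ s0
print(s3)  # [P(up after 3 days), P(down after 3 days)]
```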

I

Identity Matrix

Criticality: 1

A square matrix with ones on the main diagonal and zeros elsewhere, which acts like the number '1' in matrix multiplication (A * I = A).

Example:

Multiplying any 2×2 matrix by the 2×2 identity matrix $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$ leaves the original matrix unchanged.
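
A quick check of this property with NumPy (the 2×2 matrix below is arbitrary):

```python
import numpy as np

A = np.array([[2, 5],
              [1, 3]])
I = np.identity(2)               # the 2x2 identity matrix

# Multiplying by I returns A unchanged.
print(np.array_equal(A @ I, A))  # True
```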

Inverse Matrix

Criticality: 2

For a square matrix A, its inverse A⁻¹ is a matrix that, when multiplied by the original, results in the identity matrix.

Example:

If you apply a transformation represented by a matrix to an image, the inverse matrix can be used to perfectly undo that transformation and restore the original image.
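
A minimal sketch, assuming an invertible 2×2 matrix, showing that a matrix times its inverse gives the identity matrix:

```python
import numpy as np

A = np.array([[2.0, 5.0],
              [1.0, 3.0]])       # determinant = 1, so A is invertible
A_inv = np.linalg.inv(A)         # the inverse matrix A^-1

# Multiplying by the inverse undoes the original transformation.
print(np.allclose(A @ A_inv, np.identity(2)))  # True
```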

M

Markov Chains

Criticality: 3

A mathematical model describing a sequence of possible events where the probability of each event depends only on the state attained in the previous event.

Example:

Predicting weather patterns where tomorrow's weather depends only on today's weather (e.g., sunny or cloudy) is a classic Markov Chain application.

Matrices

Criticality: 3

Rectangular arrays of numbers used to organize and manipulate data, often representing systems or transformations in various contexts.

Example:

A matrix could represent the inventory of different shirt sizes and colors in a store, with rows for sizes and columns for colors.
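
For instance, a hypothetical count of shirts with rows for sizes (small, medium, large) and columns for colors (red, blue) could be the matrix $\begin{bmatrix} 12 & 8 \\ 20 & 15 \\ 9 & 11 \end{bmatrix}$, where the entry in row 2, column 1 says the store has 20 medium red shirts.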

P

Predicting Future States

Criticality: 3

The process of determining the distribution of a system's states at a later time by multiplying the transition matrix by the current state vector.

Example:

Using a matrix to predict future states of a disease's spread based on current infection rates and recovery probabilities.
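
A minimal sketch of one step of this process, with made-up infection and recovery probabilities:

```python
import numpy as np

# Hypothetical transition matrix: columns are the current state
# (infected, recovered), rows are next week's state.
T = np.array([[0.55, 0.10],   # stays infected / becomes infected
              [0.45, 0.90]])  # recovers       / stays recovered

# Current state vector: 20% infected, 80% recovered.
current = np.array([0.20, 0.80])

# Distribution of states one time interval later.
next_state = T @ current
print(next_state)  # [0.19, 0.81]
```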

Predicting Past States

Criticality: 2

The process of determining the distribution of a system's states at an earlier time by multiplying the inverse of the transition matrix by the current state vector.

Example:

Using an inverse matrix to predict past states of a population's migration patterns, given their current locations and historical transition probabilities.
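
A minimal sketch, assuming a hypothetical urban/rural transition matrix, recovering the previous distribution with the inverse:

```python
import numpy as np

# Hypothetical column-stochastic transition matrix (urban, rural).
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])

current = np.array([0.7, 0.3])           # today's distribution

# One step back in time: multiply by the inverse of T.
previous = np.linalg.inv(T) @ current
print(previous)

# Sanity check: stepping forward again returns the current state.
print(np.allclose(T @ previous, current))  # True
```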

Predicting Steady States

Criticality: 3

The method of finding the long-term, unchanging distribution of states by repeatedly multiplying the transition matrix by a state vector until the vector converges.

Example:

To predict the steady state of customer preferences for two competing brands, you might simulate their choices over many cycles until the proportions stabilize.
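
A minimal sketch of this iteration, with made-up brand-switching probabilities:

```python
import numpy as np

# Hypothetical transition matrix for two brands: columns are the
# current brand, rows are next month's brand.
T = np.array([[0.8, 0.3],
              [0.2, 0.7]])

state = np.array([0.5, 0.5])      # start with an even split

# Repeatedly apply T until the state vector stops changing.
for _ in range(100):
    new_state = T @ state
    if np.allclose(new_state, state):
        break
    state = new_state

print(state)  # approaches [0.6, 0.4] regardless of the starting split
```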

R

Rates of Change

Criticality: 2

A measure of how one quantity changes in relation to another, often over time, indicating increase, decrease, or transition between states.

Example:

The percentage of students switching from one elective to another each semester represents a rate of change that can be modeled.

S

State Vector

Criticality: 3

A column vector that represents the probabilities or proportions of a system being in each of its defined states at a specific point in time.

Example:

A state vector could show that 70% of a population lives in urban areas and 30% in rural areas at the current moment.
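
Written as a column vector with the entries in a fixed order (urban first, then rural), this split is $\begin{bmatrix} 0.70 \\ 0.30 \end{bmatrix}$; the entries always sum to 1.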

States (in Markov Chains)

Criticality: 3

The distinct conditions or categories a system can be in at any given time within a Markov Chain model.

Example:

In a model of customer loyalty for a coffee shop, 'regular customer' and 'occasional customer' could be two states.

Steady State

Criticality: 3

A stable distribution of states in a system modeled by a Markov Chain, where the probabilities of being in each state no longer change over time.

Example:

The steady state of a rental car company might show the long-term percentage of cars consistently located at each airport, regardless of the initial distribution.
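
In symbols, a steady-state vector $\mathbf{s}$ is one that the transition matrix $T$ leaves unchanged: $T\mathbf{s} = \mathbf{s}$.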

T

Transition Matrix

Criticality: 3

A square matrix where each entry represents the probability of moving from one state to another in a Markov Chain over a single time interval.

Example:

A transition matrix might show the probabilities of a student moving from 'freshman' to 'sophomore' or 'dropping out' each year.
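
As a small illustration with made-up numbers, a two-state weather model ('sunny', 'cloudy') might use $T = \begin{bmatrix} 0.8 & 0.4 \\ 0.2 & 0.6 \end{bmatrix}$, where each column gives the probabilities of tomorrow's weather given today's state, so every column sums to 1 (this column convention is one common choice; some texts arrange the probabilities by rows instead).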