All Flashcards
How do you construct a transition matrix from a given scenario?
Identify the states, determine the probabilities of transitioning between them, and arrange these probabilities in a matrix where columns sum to 1.
How do you predict the state vector after a certain number of transitions?
Raise the transition matrix to the power of the number of transitions, then multiply by the initial state vector.
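The two cards above can be sketched in NumPy. The transition matrix and initial vector below are made-up example values (a hypothetical two-state system), not taken from the deck; columns sum to 1, matching the convention used throughout these cards:

```python
import numpy as np

# Column-stochastic transition matrix for two states A and B:
# entry [row, col] = probability of moving from state `col` to state `row`.
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Initial state vector: 100% probability of being in state A.
X0 = np.array([1.0, 0.0])

# State vector after 3 transitions: T^3 * X0.
X3 = np.linalg.matrix_power(T, 3) @ X0
print(X3)  # approximately [0.781, 0.219]
```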
How do you find the steady-state vector?
Repeatedly multiply the transition matrix by an initial state vector until the state vector converges, or solve the equation TX = X.
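Both approaches from this card can be sketched as follows; the matrix is the same hypothetical example as above, and solving $TX = X$ is done here as $(T - I)s = 0$ with the extra constraint that the entries of $s$ sum to 1:

```python
import numpy as np

T = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Iterative approach: multiply until the state vector stops changing.
x = np.array([1.0, 0.0])
for _ in range(1000):
    x = T @ x

# Direct approach: solve (T - I) s = 0 together with sum(s) = 1.
A = np.vstack([T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
s, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x, s)  # both approximately [2/3, 1/3]
```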
How do you predict a past state vector?
Find the inverse of the transition matrix, then multiply it by the current state vector.
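A minimal sketch of stepping backwards with the inverse, again using the hypothetical matrix from the earlier examples; multiplying forward again should recover the current state:

```python
import numpy as np

T = np.array([[0.9, 0.2],
              [0.1, 0.8]])
X_now = np.array([0.7, 0.3])

# One step back in time: X_prev = T^{-1} * X_now.
X_prev = np.linalg.inv(T) @ X_now

# Sanity check: moving forward from X_prev recovers X_now.
print(T @ X_prev)  # approximately [0.7, 0.3]
```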
How do you determine if a matrix is a valid transition matrix?
Check that all entries are non-negative and that the sum of entries in each column equals 1.
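The validity check on this card can be written as a short helper; `is_transition_matrix` is a name invented here for illustration:

```python
import numpy as np

def is_transition_matrix(M, tol=1e-9):
    """A valid (column-stochastic) transition matrix has non-negative
    entries and columns that each sum to 1."""
    M = np.asarray(M, dtype=float)
    return bool(np.all(M >= 0) and np.allclose(M.sum(axis=0), 1.0, atol=tol))

print(is_transition_matrix([[0.9, 0.2], [0.1, 0.8]]))  # True
print(is_transition_matrix([[0.9, 0.5], [0.2, 0.5]]))  # False: first column sums to 1.1
```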
How do you set up a Markov chain model for a given problem?
Define the states, determine transition probabilities, create the transition matrix, and define the initial state vector.
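The four setup steps on this card can be walked through on a made-up weather scenario (the states and probabilities below are invented for illustration):

```python
import numpy as np

# Step 1: define the states.
states = ["Sunny", "Rainy"]

# Steps 2-3: transition probabilities arranged so columns are "from"
# states and rows are "to" states; each column sums to 1.
#            from Sunny  from Rainy
T = np.array([[0.8,       0.4],   # to Sunny
              [0.2,       0.6]])  # to Rainy

# Step 4: initial state vector -- it is Sunny today.
X0 = np.array([1.0, 0.0])

# Distribution over tomorrow's weather.
print(T @ X0)  # approximately [0.8, 0.2]
```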
How do you find the inverse of a 2x2 matrix?
For a matrix $$\begin{bmatrix} a & b \\ c & d \end{bmatrix}$$, the inverse is $$\frac{1}{ad-bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$$.
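The $\frac{1}{ad-bc}$ formula on this card can be checked directly; `inverse_2x2` is a helper name invented here:

```python
import numpy as np

def inverse_2x2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the ad - bc determinant formula."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return (1 / det) * np.array([[d, -b], [-c, a]])

# Multiplying a matrix by its inverse gives the identity matrix.
M = np.array([[0.9, 0.2], [0.1, 0.8]])
print(inverse_2x2(0.9, 0.2, 0.1, 0.8) @ M)
```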
How do you multiply two matrices?
Multiply rows of the first matrix by columns of the second matrix, summing the products to get each entry in the resulting matrix.
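The row-times-column rule on this card can be spelled out without a library (a plain-Python sketch):

```python
def mat_mul(A, B):
    """Entry (i, j) of the product is the dot product of row i of A
    with column j of B."""
    rows, inner, cols = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

print(mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```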
How do you interpret the entries in a transition matrix?
Each entry represents the probability of transitioning from the state represented by the column to the state represented by the row.
How do you determine the long-term behavior of a system using a Markov chain?
Find the steady-state vector, which represents the distribution of states that the system will eventually converge to.
What is a transition matrix?
A matrix representing probabilities of moving between states in a system.
What is a state vector?
A column vector showing probabilities of being in each state at a specific time.
What is a Markov Chain?
A model that describes transitions between states, where the probability of each transition depends only on the current state.
What is steady state?
The distribution of states in a Markov chain that doesn't change over time.
What does the entry p(A→B) in a transition matrix represent?
The probability of moving from state A to state B.
What is the inverse of a matrix?
A matrix that, when multiplied by the original matrix, results in the identity matrix.
What is the identity matrix?
A square matrix with 1s on the main diagonal and 0s elsewhere.
What is the significance of the columns in a transition matrix?
Each column lists the probabilities of transitioning from one specific state to each possible state, including remaining in the same state.
What is the purpose of using matrices in modeling contexts?
To represent and analyze transitions between different states over time.
What is the difference between a transition matrix and a state vector?
A transition matrix describes how probabilities change, while a state vector describes the probabilities at a specific time.
What is the formula for a general 2x2 transition matrix?
$$\begin{bmatrix} p(A \rightarrow A) & p(B \rightarrow A) \\ p(A \rightarrow B) & p(B \rightarrow B) \end{bmatrix}$$
How do you find the state vector after n transitions?
Raise the transition matrix T to the power n, then multiply by the initial state vector X: $$T^n X$$
How do you find the previous state vector?
Multiply the inverse of the transition matrix by the current state vector: $$T^{-1} X$$
What is the condition for a matrix to be a valid transition matrix?
The sum of entries in each column must equal 1, and all entries must be non-negative.
How do you find the state vector after one transition?
Multiply the transition matrix (T) by the current state vector (X): $$TX$$
What is the formula to find the steady-state vector?
Repeatedly multiply the transition matrix by the state vector until the state vector converges. Alternatively, solve TX = X.
How do you calculate the state vector after two transitions?
Multiply the transition matrix by itself, then multiply the result by the initial state vector: $$T^2 X$$
What is the formula to predict a future state?
Multiply the transition matrix by the current state vector: $$TX$$
What formula do you use to predict a past state?
Multiply the inverse of the transition matrix by the current state vector: $$T^{-1} X$$
What is the formula to find the probability of staying in state A?
$$p(A \rightarrow A)$$