We use k light bulbs to light an outside rink. The person responsible for the lighting of the rink does not keep spare light bulbs. Rather, at the beginning of each week, he orders new light bulbs to replace the ones that burned out during the preceding week. These light bulbs are delivered the following week. Let Xn be the number of light bulbs in operation at the beginning of the nth week, and let Yn be the number of light bulbs that will burn out during this nth week, for n = 0, 1, .... We assume that, given that Xn = i, the random variable Yn has a discrete uniform distribution over the set {0, 1, ..., i}.
(a) Calculate the one-step transition probability matrix of the Markov chain {Xn, n = 0, 1, ...}.
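As a numeric sanity check (not part of the exercise), the process can be simulated directly from the verbal description: an order placed at the beginning of a week, replacing the bulbs that burned out during the preceding week, is delivered at the beginning of the following week. The long-run fraction of weeks spent in each state then estimates the limiting probabilities asked for in (b). The function name and parameters below are illustrative, not from the text.

```python
import random

def rink_lighting_frequencies(k, weeks, seed=0):
    """Simulate the chain {Xn} and return, for j = 0, ..., k, the
    fraction of weeks with exactly j bulbs in operation."""
    rng = random.Random(seed)
    x, pending = k, 0              # week 0: all k bulbs work, none on order
    counts = [0] * (k + 1)
    for _ in range(weeks):
        counts[x] += 1
        # Given Xn = i, Yn is discrete uniform on {0, 1, ..., i}.
        y = rng.randint(0, x)
        # The order placed this week equals last week's burnouts
        # (pending) and arrives at the start of next week.
        x = x - y + pending
        pending = y
    return [c / weeks for c in counts]

# Example: estimate the limiting probabilities for k = 3 bulbs.
print(rink_lighting_frequencies(3, 100_000))
```

Note that x + pending = k at the start of every week (bulbs are either in operation or on order), so the update is equivalent to Xn+1 = k - Yn; this observation is a convenient route to part (a).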
(b) Show that the limiting probabilities of the chain {Xn, n = 0, 1, ...} exist and are given by