Stochastische Processen in de Fysica


Summaries

Click here to view the summaries

General

This course is taught by Professor Maes. Be sure to work through all the exercises in the course notes, since he tends to pick some exam questions from them. The exam is entirely written. Other old exam questions can be found under the course 'Wiskundige methoden in de natuurkunde' (also taught by Professor Maes), of which this course used to be a part.

Exams for Stochastische Processen

2017-2018

26 January 2018

Exam of 26 January

1 February 2018

Exam of 1 February

23 August 2018

Exam of 23 August

2016-2017

27 January 2017

Exam of 27 January

  • Question 1: Imagine that L is the generator of a continuous-time Markov process with a finite state space. Write rho for the stationary distribution and suppose that rho(x) is never zero, for all states x. Show that if the matrix H with elements H_xy = sqrt(rho(x)) L_xy / sqrt(rho(y)) is symmetric, then detailed balance holds.
  • Question 2: Consider a continuous-time Markov process on the state space {a, b, c} with transition rates k(a, b) = 1, k(a, c) = k(c, a) = x, k(b, c) = 4x/3, and with all other transition rates equal to zero. Here x >= 0 is a parameter. Give the stationary distribution in terms of x. Is the stationary process time-reversal invariant? For which x?
  • Question 3: Consider the Markov diffusion process for a position x_t in R: dx_t/dt = -U'(x_t) + sqrt(2T) xi_t, where xi_t is white noise, T > 0 and U(x) = x^2/2. At zero time we have x_0 = 1. Find the time correlation <x_t x_s>. What is the stationary distribution?
  • Question 4: Markov found the following empirical rule for the transition matrix in the vowel-consonant space in Pushkin's novel: see the course notes. Find the vowel versus consonant frequency. Write down the calculations.
  • Question 5: Show that the Ehrenfest model satisfies detailed balance, and find the potential. (You were allowed to simply use the stationary distribution, without showing that it is stationary.)
  • Question 6: Show that all Markov chains with two states, |K| = 2, satisfy detailed balance, at least when all p(x, y) > 0.
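Questions 1 and 2 above can be cross-checked numerically. The sketch below is only an illustration, not a model solution: it assumes NumPy, picks the sample value x = 2, and all helper names are mine. It builds the generator of Question 2, extracts the stationary distribution as the left null vector, and then applies the symmetry criterion of Question 1 as a detailed-balance test.

```python
import numpy as np

def generator(rates, states):
    """Build the generator matrix L from a dict of transition rates k(x, y)."""
    n = len(states)
    idx = {s: i for i, s in enumerate(states)}
    L = np.zeros((n, n))
    for (x, y), k in rates.items():
        L[idx[x], idx[y]] = k
    # Diagonal entries are minus the total escape rate from each state.
    np.fill_diagonal(L, 0.0)
    L -= np.diag(L.sum(axis=1))
    return L

def stationary(L):
    """Stationary distribution: left null vector of L (rho @ L = 0)."""
    w, v = np.linalg.eig(L.T)
    rho = np.real(v[:, np.argmin(np.abs(w))])
    return rho / rho.sum()

# Question 2 rates, for the sample parameter value x = 2 (my choice).
x = 2.0
rates = {('a', 'b'): 1.0, ('a', 'c'): x, ('c', 'a'): x, ('b', 'c'): 4 * x / 3}
L = generator(rates, ['a', 'b', 'c'])
rho = stationary(L)

# Question 1 criterion: H_xy = sqrt(rho_x) L_xy / sqrt(rho_y) is symmetric
# exactly when detailed balance rho_x k(x, y) = rho_y k(y, x) holds.
H = np.sqrt(rho)[:, None] * L / np.sqrt(rho)[None, :]
print("rho =", rho)
print("detailed balance:", np.allclose(H, H.T))
```

Since k(b, a) = 0 while k(a, b) = 1 > 0, detailed balance cannot hold for this x, and the symmetry test reports exactly that.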

2014-2015

17 June 2015

  • Question 1: Consider a continuous-time Markov process with state space K = {1, 2, ..., M} and with transition rates k(x, x+1) = q except for x = M, and k(x, x−1) = p except for x = 1. All other transition rates are zero. Determine the stationary distribution as a function of p, q and M. Is there detailed balance? (Exercise 5 of the Continuous Markov Processes part)

  • Question 2: Lady Ann possesses 3 umbrellas which she employs in going from home to office and back. If she is at home (resp. office) at the beginning (resp. end) of a day and it is raining, then she will take an umbrella with her to the office (resp. home), at least if there is one to be taken. If it is not raining, then she will not take an umbrella. Assuming that, independently of the past, it rains at the beginning (end) of a day with probability 1/3, what fraction of the time does Lady Ann arrive soaked at the office? (Exercise 15 of the Discrete Markov Processes part)

  • Question 3: We consider the overdamped diffusion process dx_t/dt = -V'(x_t) + sqrt(2T) xi_t, with xi_t the standard white noise. Show by calculation that the distribution rho(x) = exp(-V(x)/T)/Z is the only stationary distribution. Explain why a stationary distribution is an equilibrium distribution when it is symmetric under time reversal or satisfies the condition of detailed balance. Determine <x_t>.
  • Question 4: Consider a network with four states (x, v) where x ∈ {0, 1}, v ∈ {−1, +1}. (Imagine x to be a position and v something like a velocity.) We define a Markov process in continuous time via transition rates that depend on a parameter b > 0: k((1, +1), (1, −1)) = k((1, −1), (1, +1)) = k((0, +1), (0, −1)) = k((0, −1), (0, +1)) = 1, k((1, −1), (0, −1)) = k((0, +1), (1, +1)) = b. All other transitions are forbidden. Make a drawing. Determine the stationary distribution on the four states as a function of b. Is there detailed balance? (Exercise 8 of the Continuous Markov Processes part)
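Question 2 (the umbrella problem) can be sanity-checked numerically. A minimal sketch, assuming NumPy and the usual encoding of the state as the number of umbrellas at Lady Ann's current location; the variable names are mine:

```python
import numpy as np

# State i = number of umbrellas at the current location (0..3).
# If i = 0 she must walk without an umbrella; if i >= 1 she carries one
# with probability p (rain) and leaves them all behind otherwise.
N, p = 3, 1 / 3
P = np.zeros((N + 1, N + 1))
P[0, N] = 1.0                      # no umbrella here -> all N wait at the other end
for i in range(1, N + 1):
    P[i, N - i] = 1 - p            # dry walk: umbrellas stay behind
    P[i, N - i + 1] = p            # rainy walk: one umbrella comes along

# Stationary distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# She gets soaked when it rains AND no umbrella is available.
soaked = p * pi[0]
print(f"fraction of trips arriving soaked: {soaked:.4f}")
```

The numerical answer agrees with the closed form p(1 − p)/(N + 1 − p) = 2/33 for N = 3, p = 1/3.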