Forums > Basics > White Noise and Wiener Processes

 IAmEric Phorgy PhynanceBanned Total Posts: 2961 Joined: Oct 2004
 Posted: 2006-02-17 03:36 What is the relation between the two? A quick dose of Googling tells me something along the lines of "white noise is the derivative of a Wiener process". Could someone shed some light and maybe point to some better references? Thanks Eric
 IAmEric Phorgy PhynanceBanned Total Posts: 2961 Joined: Oct 2004
 Posted: 2006-02-17 03:58 Follow up... Consider

dS/S = mu*dt + sigma*dW

with

S(t) = S(0)*exp(mu'*t + sigma*W(t))

(where mu' = mu - sigma^2/2), so that

log[S(t)/S(0)] = mu'*t + sigma*W(t).

Has anyone ever looked at modelling this as

log[S(t)/S(0)] = mu'*t + sigma*W'(t),

where W' is Gaussian white noise? Just curious. Here is kind of a neat thing:

g(t)*W(t) = int_0^t g'(tau)*W(tau) dtau + int_0^t g(tau)*W'(tau) dtau,

where W(t) is a Wiener process and W'(t) is Gaussian white noise. I got that from this paper.
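The lognormal solution above is easy to check numerically: sampling W(T) ~ N(0, T) directly, the log-return should come out N(mu'*T, sigma^2*T). A minimal sketch (the parameter values are arbitrary), assuming only NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, T, n_paths = 0.05, 0.2, 1.0, 200_000

# W(T) ~ N(0, T) for a Wiener process; mu' = mu - sigma^2/2 for GBM.
W_T = rng.standard_normal(n_paths) * np.sqrt(T)
log_ret = (mu - 0.5 * sigma**2) * T + sigma * W_T  # log[S(T)/S(0)]

# Sample moments should match mu'*T = 0.03 and sigma*sqrt(T) = 0.2.
print(log_ret.mean(), log_ret.std())
```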
 Anthis It's all Greek to me Total Posts: 1180 Joined: Jul 2004
 Posted: 2006-02-17 04:23 What is the relation between the two? Just a quick answer before I go to bed... In discrete-time econometrics, white noise is the error term with constant variance. If the variance grows over time we have a random walk, and a random walk is the discrete-time version of a Wiener process. HTH Αίεν Υψικράτειν/Τύχη μη πίστευε/Άνδρα Αρχή Δείκνυσι/Νόησις Αρχή Επιστήμης //Σε ενα κλουβί γραφείο σαν αγρίμι παίζω ατέλειωτο βουβό ταξίμι
 IAmEric Phorgy PhynanceBanned Total Posts: 2961 Joined: Oct 2004
 Posted: 2006-02-17 04:43 Thanks anthis. Not quite sure what you meant, but econometrics is actually the motivation for the question. I am trying to understand the relation, if any, between (vector) autoregression and (multi-factor) stochastic differential equations. If you or anyone has some light to shed on that as well, it would be more than welcome.

Here is another question... We know that, at least mechanically, we can write things like

dW dW = dt

in stochastic calculus. If W' is (Gaussian) white noise, is there some similar nifty expression for

dW' dW' ?

The answer to this and some of my other questions might be found in one of these:

White Noise Calculus for Pure Jump Processes with Application to Mathematical Finance
Incomplete Equilibrium Markets
Some Applications of White Noise Analysis to Mathematical Finance
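The mechanical rule dW dW = dt has a concrete numerical reading: the quadratic variation of a sampled Wiener path over [0, T] converges to T as the grid is refined. A quick sketch (grid size arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
T, n = 1.0, 1_000_000
dt = T / n

# Wiener increments dW ~ N(0, dt); "dW dW = dt" is shorthand for the
# quadratic variation sum_i (dW_i)^2 -> T as the mesh shrinks.
dW = rng.standard_normal(n) * np.sqrt(dt)
qv = np.sum(dW**2)
print(qv)  # close to T = 1
```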
 Johnny Founding Member Total Posts: 4331 Joined: May 2004
 Posted: 2006-02-17 09:12 White light, white heat. White noise is the signal generated by combining sine waves of all frequencies in equal proportions. It is the same as Brownian motion. Consider the defining properties of Brownian motion and check that they also correspond to white noise:

1. Continuity: W_t is continuous in t (check, it's the sum of sine waves) and W_0 = 0 (check, set all sine waves to "start" at t = 0).
2. Normality: W_t is distributed N(0, t).
3. Normally distributed independent increments: W_s - W_t is distributed N(0, s-t) for s > t.

So far, so good. But enquiring minds probably want to think about (a) differentiability and (b) stochastic vs deterministic behaviour. I can never remember the exact definition of a Wiener process, but isn't it (something like) any process of the form

dX_t = f(X_t, t) dt + g(X_t, t) dW_t,

where dW_t is the increment of a Brownian motion? I'm not sure this is what you're looking for. The sound of one bear, uh, in the woods
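The "all frequencies in equal proportions" description applies to the white noise itself and can be checked on discrete Gaussian white noise: its average periodogram is flat. A rough sketch (sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 1024, 2000

# Average periodogram of discrete Gaussian white noise: power should
# be (roughly) equal at every frequency, i.e. a flat spectrum.
psd = np.zeros(n // 2)
for _ in range(trials):
    x = rng.standard_normal(n)
    psd += np.abs(np.fft.rfft(x)[1 : n // 2 + 1]) ** 2
psd /= trials * n

low = psd[: len(psd) // 2].mean()   # average power, lower half of band
high = psd[len(psd) // 2 :].mean()  # average power, upper half of band
print(low, high)  # nearly equal
```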
 Nonius Founding MemberNonius Unbound Total Posts: 11665 Joined: Mar 2004
 Posted: 2006-02-17 10:52 Johnny's first comments are correct.  But, it is not the same as Brownian Motion.  It is the "derivative" of Brownian Motion.  In this sense, Eric's intuition is correct. If you look hard you will see NONIUS in CMB data. Muhauaha.
 Johnny Founding Member Total Posts: 4331 Joined: May 2004
 Posted: 2006-02-17 11:08 Nonius is correct and I was wrong. But let us not say that white noise is the derivative of Brownian motion, but instead say that the integral of white noise is the same as Brownian motion. The integral of white noise is known (for its connection with Brownian motion) as Brown noise. And many thanks to all those people who pointed this out to me in private. The sound of one bear, uh, in the woods
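The "integral of white noise is Brownian motion" statement is easy to illustrate: cumulatively sum independent N(0, dt) shocks and check that the variance of the result grows linearly in t, as a Wiener process requires. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps = 10_000, 200
dt = 1.0 / n_steps  # paths live on [0, 1]

# "Integrate" white noise: cumulative sums of independent N(0, dt) shocks.
e = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
W = np.cumsum(e, axis=1)

# Brownian motion has Var[W(t)] = t; check at t = 0.5 and t = 1.
var_half = W[:, n_steps // 2 - 1].var()
var_full = W[:, -1].var()
print(var_half, var_full)  # roughly 0.5 and 1.0
```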
 Cheng Total Posts: 2585 Joined: Feb 2005
 Posted: 2006-02-17 11:09 Eric, white noise may be viewed as the derivative of a Wiener process. Unfortunately white noise has infinite variance and the nifty techniques one usually uses don't work here. The derivative exists only in the sense of distributions, a kind of generalized derivative. If white noise were an integrator in the Lebesgue-Stieltjes sense, you could rewrite any stochastic Itô integral as

\int_a^b f(s) dW_s = \int_a^b f(s) W'(s) ds,

W'(s) being the white noise. I can dig up my old scripts and books at home if you want to know more. Hope this helps a bit in building intuition. Regards "Don't try to run, don't try to hide. Believe me, the hammer's gonna make it right !"
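The fact that W' is not an ordinary integrator is exactly why the Itô integral is built from left-endpoint sums, and it leaves a trace: the Itô correction term. A numerical sketch using the known identity int_0^T W dW = (W_T^2 - T)/2 (grid and path counts are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
T, n, n_paths = 1.0, 1000, 2000
dt = T / n

dW = rng.standard_normal((n_paths, n)) * np.sqrt(dt)
W = np.cumsum(dW, axis=1)
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])  # left endpoints

# Ito sum with left endpoints: sum_i W(t_i) * dW_i.
ito = np.sum(W_left * dW, axis=1)

# Closed form: int_0^T W dW = (W_T^2 - T)/2 -- note the -T/2
# correction relative to the classical chain rule.
target = 0.5 * (W[:, -1] ** 2 - T)
err = np.abs(ito - target).mean()
print(err)  # small, and shrinks as n grows
```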
 Nonius Founding MemberNonius Unbound Total Posts: 11665 Joined: Mar 2004
 Posted: 2006-02-17 11:15 It is also in the sense of Eric's NCG calculus that one can view white noise as the noncommutative derivative of Brownian motion. If you look hard you will see NONIUS in CMB data. Muhauaha.
 Martingale NP House Mouse Total Posts: 2590 Joined: Jun 2004
 Posted: 2006-02-17 13:52 it is still too early for me to think, but the white noise way of doing things is probably mostly used in engineering-related work, but hey... you never know. here is something that might be of a little interest... http://www.ma.ic.ac.uk/~pavl/ver3.pdf
 IAmEric Phorgy PhynanceBanned Total Posts: 2961 Joined: Oct 2004
 Posted: 2006-02-17 18:38 Nonius saw through my thinly veiled attempt to bring NCG into the picture. It is true: if someone could tell me of any special algebraic relation for

dW' dW',

where W' is white noise, then I could reformulate white noise in terms of NCG, which might be kind of academically interesting. The real point of this is, as I said, to try to understand the link (if any) between vector autoregression and multifactor stochastic differential equations. I'm just now learning the former and it is reminding me a lot of the latter. Looking at VAR reminded me of my days back in grad EE and DSP, e.g. z-transforms, digital filters, impulse responses, etc., which made me think of deconvolutions, which made me think... (ad nauseam)
 hammerbacher Total Posts: 189 Joined: Aug 2005
 Posted: 2006-02-17 20:27 this was precisely how the NYU math in finance "stochastic calculus" course taught by marco avellaneda last semester began--by pounding on the metaphor of white noise as a "derivative" of brownian motion. he's taken the homework files off the website, but i can send them to anyone interested in playing (numerically) with this stuff. it was very good for intuition.disclaimer: i stopped attending this course after about three lectures, as avellaneda is a horrible lecturer. the psets are still decent, however.
 IAmEric Phorgy PhynanceBanned Total Posts: 2961 Joined: Oct 2004
 Posted: 2006-02-17 20:32 hammerbacher, sounds cool. I'd be interested in taking a look at that and any notes you might have. Thanks
 tristanreid Total Posts: 1676 Joined: Aug 2005
 Posted: 2006-02-17 23:08 in regards to vector autoregression, my understanding is this:

AR: y_t = c + phi*y_{t-1} + epsilon_t, where that last epsilon is your white noise. if |phi| < 1, this series is covariance-stationary, so you can take the expected value of both sides. E(epsilon) = 0, so the expected value of the series is mu = c/(1 - phi). In other words, it's a mean-reverting series.

if phi == 1, y_t is a random walk (the discrete analogue of a Wiener process); it has a 'unit root'. you can fix this by first-differencing the series. since the first difference is white noise, you are reduced to the above case.

if phi > 1, this time series is explosive, which is just no good.

VAR: instead of phi, you have a matrix of coefficients, so you take the eigenvalues to see what effect each eigenvector is having. You check the effect in the same way as above: any lambda > 1 is explosive and makes the series worthless. If all the lambdas are < 1 there is a static equilibrium to the system (kind of like mean-reversion). The difference with VAR is that some lambdas could be random walks and others stationary. That's where cointegration comes in (from that other thread) -t. the only reason it would be easier to program in C is that you can't easily express complex problems in C, so you don't. -comp.lang.lisp
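The three AR(1) regimes above are easy to see in simulation; a sketch with arbitrary constants (c = 1, phi = 0.9 for the stationary case):

```python
import numpy as np

rng = np.random.default_rng(5)
n, c = 20_000, 1.0

def ar1(phi: float) -> np.ndarray:
    """Simulate y_t = c + phi * y_{t-1} + eps_t with N(0, 1) shocks."""
    y = np.empty(n)
    y[0] = 0.0
    eps = rng.standard_normal(n)
    for t in range(1, n):
        y[t] = c + phi * y[t - 1] + eps[t]
    return y

# |phi| < 1: stationary, sample mean near mu = c / (1 - phi) = 10.
y_stat = ar1(0.9)
print(y_stat[1000:].mean())  # roughly 10

# phi = 1: a random walk with drift -- no fixed mean to revert to.
# First-differencing recovers c plus white noise.
y_rw = ar1(1.0)
d = np.diff(y_rw)
print(d.mean(), d.std())  # roughly c = 1 and 1
```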
 IAmEric Phorgy PhynanceBanned Total Posts: 2961 Joined: Oct 2004
 Posted: 2006-02-18 02:24 Making some progress... Here is kind of a neat thing:

g(t)*W(t) = int_0^t g'(tau)*W(tau) dtau + int_0^t g(tau)*W'(tau) dtau,

where W(t) is a Wiener process and W'(t) is Gaussian white noise. I got that from this paper. In the above expression, the function g(t) must be smooth and have finite support. If we let g(t) = 1 up to some time T and then smoothly taper to zero beyond, then the above reduces to

W(t) = int_0^t W'(tau) dtau

for t < T. This expresses what others have said about Brownian motion being the integral of white noise. If we approximate this integral as a Riemann sum, we get

W(t) ~= W(t - delt) + delt*W'(t).

Comparing this to what tristan said for c = 0 and phi = 1, we have

y(t) = y(t-1) + e(t).

This seems to support the statement that y(t) is Brownian motion for phi = 1. At least if you were to throw in a delt in there and take a limit as delt -> 0. I can buy that. This is looking like it supports my suspicion that AR could be thought of as a finite-difference approximation to a stochastic DE (I think). Still some mysteries though. If anyone could shed some light, it'd be appreciated. For example, the integral implies

dW = W'*dt.

We know that

dW dW = dt,

so that means that W' ~ 1/sqrt(dt), since

dW dW = (W')^2 dt dt = dt.

If we wave the wand of NCG, we convert this to a commutator:

[dW, W] = dW W - W dW
= W' dt W - W W' dt
= W' W dt - W W' dt (W and dt commute)
= dt.

The only way to satisfy this is if

[W', W] = 1,

which looks like a quantization rule to me.
I'll try not to get pulled off onto that tangent right now. (Note for anyone interested: the commutation relation [dW, W] = i*hbar*dt leads to the Schrodinger equation, and in this case would lead to [W', W] = i*hbar, which looks even MORE like quantization, with white noise and Brownian motion playing the role of conjugate variables like position and momentum.)

Back to something remotely practical... If we want to solve

dy = mu*dt + sigma*dW

approximately, we can rewrite it as

dy = mu*dt + sigma*dW = mu*dt + sigma*W'*dt.

The commutation relations suggest W' ~ 1/sqrt(delt), so this becomes

y(t) - y(t - delt) = mu*delt + sigma*W'(t)*delt = mu*delt + sigma*e(t)*sqrt(delt),

where I've set

W'(t) = e(t)/sqrt(delt),

and I really don't know exactly why, other than to make it look like a simple Monte Carlo expression (and it seems to somehow relate to the dimensional analysis from the commutation relations). Rearranging terms gives

y(t) = mu*delt + y(t - delt) + sigma*e(t)*sqrt(delt).

Setting

c = mu*delt and epsilon(t) = sigma*e(t)*sqrt(delt)

brings us back to something that formally looks like the AR expression. Ok. As you can see, there are still some holes in my arguments, so if anyone can help nail the final missing pieces, I'd appreciate it. That, or show me where I'm totally off the mark. Just to summarize... this whole thing is motivated by trying to understand the relation between ARs and SDEs. My hunch is that AR may be interpreted as an approximate numerical (finite difference/Monte Carlo) solution to a simple SDE. This might be obvious to some, so help me see the light. I'm almost there. Cheers Eric
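The discretization y(t) = y(t - delt) + mu*delt + sigma*e(t)*sqrt(delt) can be sanity-checked: iterating it to time T should reproduce y(T) ~ N(mu*T, sigma^2*T), the exact law for this SDE. A minimal sketch with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(6)
mu, sigma, T = 0.3, 0.5, 2.0
n_steps, n_paths = 400, 10_000
delt = T / n_steps

# The AR-style recursion, vectorized over paths:
# y(t) = y(t - delt) + mu*delt + sigma*e(t)*sqrt(delt), e(t) ~ N(0, 1).
# Starting from y(0) = 0, y(T) is just the sum of the increments.
e = rng.standard_normal((n_paths, n_steps))
y_T = np.sum(mu * delt + sigma * np.sqrt(delt) * e, axis=1)

# For dy = mu*dt + sigma*dW the exact law is y(T) ~ N(mu*T, sigma^2*T).
print(y_T.mean(), y_T.var())  # roughly mu*T = 0.6 and sigma^2*T = 0.5
```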
 Cheng Total Posts: 2585 Joined: Feb 2005
 Posted: 2006-02-18 10:40 It is still a bit early to think and maybe I am babbling big nonsense, but

y(t) = mu*delt + y(t - delt) + sigma*e(t)*sqrt(delt)

boils down to the Euler scheme for the numerical solution of SDEs if e(t) is an N(0,1)-distributed rv. Then you get the increment

sigma*sqrt(delt)*N(0,1),

or alternatively

N(0, sigma^2*delt).

Under some technical assumptions this converges to the solution of the SDE as delt -> 0, with strong order O(delt^(1/2)). If mu and sigma are C^1 you can get convergence of order O(delt) by adding another term involving the first-order derivatives (the Milstein scheme). Maybe this helps; if not, feel free to trash it. Regards "Don't try to run, don't try to hide. Believe me, the hammer's gonna make it right !"
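The Euler vs Milstein comparison can be sketched for geometric Brownian motion, where the exact solution is available for measuring strong (pathwise) error; the coefficients and step count below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(7)
mu, sigma, T, n, n_paths = 0.1, 0.4, 1.0, 100, 20_000
dt = T / n

dW = rng.standard_normal((n_paths, n)) * np.sqrt(dt)
W_T = dW.sum(axis=1)
exact = np.exp((mu - 0.5 * sigma**2) * T + sigma * W_T)  # exact GBM, S0 = 1

euler = np.ones(n_paths)
milstein = np.ones(n_paths)
for i in range(n):
    dw = dW[:, i]
    euler = euler * (1 + mu * dt + sigma * dw)
    # Milstein adds the 0.5*sigma^2*(dW^2 - dt) correction term.
    milstein = milstein * (1 + mu * dt + sigma * dw
                           + 0.5 * sigma**2 * (dw**2 - dt))

err_e = np.abs(euler - exact).mean()      # strong error, Euler
err_m = np.abs(milstein - exact).mean()   # strong error, Milstein
print(err_e, err_m)  # Milstein error is markedly smaller
```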
 kr Founding MemberNP Raider Total Posts: 3561 Joined: Apr 2004
 Posted: 2006-02-18 12:23 eric - haven't read all the details, but my intuition is simply that dW should be thought of as a 'generalized function' - i.e. think Dirac delta. That is, it is defined only to the extent that one can compute <g, dW> = int_0^t g(t) dW_t, and the value of this functional is mostly determined by a few key algebraic relations. In fact, writing dW = W' dt is not the right way to go; much better to have W = <1, dW> be thought of as a random variable, and then use the rules for <g, dW>. Once that's said, it's clear that "dW'_t" does not operate on the correct space... it would have to be integrated against something else in order to produce a generalized function. Somewhere under all this is a deep-discount graded algebra involving functions with existing derivatives. my bank got pwnd
 Cheng Total Posts: 2585 Joined: Feb 2005
 Posted: 2006-02-18 13:14 kr, do you think along the lines of Sobolev spaces? This was basically what I had in mind when I started mumbling about generalized derivatives. Regards "Don't try to run, don't try to hide. Believe me, the hammer's gonna make it right !"
 Nonius Founding MemberNonius Unbound Total Posts: 11665 Joined: Mar 2004
 Posted: 2006-02-18 16:19 you are on the right track If you look hard you will see NONIUS in CMB data. Muhauaha.
 Martingale NP House Mouse Total Posts: 2590 Joined: Jun 2004
 Posted: 2006-02-18 17:44 The original way Wiener constructed the stochastic integral was essentially via white noise as a generalized distribution (through an L^2 isometry), but it proved to be limited; that's why people started to approach it from other directions... I will try to dig up some of the historical literature on this.
 urs Total Posts: 2 Joined: Feb 2006
 Posted: 2006-02-19 10:37 Hi Eric! I thought I might just as well post a reply here. You wrote:

"We know that dW dW = dt"

Ok.

"so that means that W' ~ 1/sqrt(dt) since dW dW = (W')^2 dt dt = dt"

This is making me a little nervous. But I realize my stochastic calculus is rusty.

"If we wave the wand of NCG"

Now I feel more at home...

"[dW, W] = dW W - W dW = W' dt W - W W' dt = W' W dt - W W' dt (W and dt commute) = dt"

I understand [dW, W] = dt. But does it make sense to write dW = W' dt?? Is there any discrete 0-form W' with this property? I wouldn't think so, but it has been a while since I thought about this stuff. But consider a 2D diamond graph. Introduce "lightcone" variables A and B such that

t = A + B
x = A - B.

Then

dt = dA + dB
dx = dA - dB,

and there is manifestly no 0-form f such that dx = f dt. But in your example above W = x, if I understood correctly. Do you agree? Maybe I am missing your point. On the other hand, maybe you are right and it is useful to *introduce* new formal expressions like W' that are defined to satisfy relations like dW = W' dt. Haven't thought enough about that.
 IAmEric Phorgy PhynanceBanned Total Posts: 2961 Joined: Oct 2004