I am constructing a bivariate Farlie-Gumbel-Morgenstern (FGM) distribution with Normal and Exponential marginals as follows:
G = 1 - E^(-s \[Lambda]);  (* Exponential CDF *)
Fe = CDF[NormalDistribution[\[Mu], \[Sigma]], \[Epsilon]];  (* Normal CDF *)
He = Fe G (1 + 3 \[Rho] (1 - Fe) (1 - G));  (* FGM joint CDF with dependence parameter 3 \[Rho] *)
h = FullSimplify[D[He, \[Epsilon], s]];  (* joint density: mixed partial derivative of the joint CDF *)
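As a quick sanity check (my own addition, not needed for the derivation), the joint density h should integrate to 1:

Assuming[{\[Lambda] > 0, Abs[\[Rho]] <= 1/3, \[Sigma] > 0},
 Integrate[h, {s, 0, \[Infinity]}, {\[Epsilon], -\[Infinity], \[Infinity]}]]
(* expected output: 1 *)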
We can calculate the conditional mean of \[Epsilon] given s as follows:
condmean\[Epsilon] =
  Assuming[{\[Lambda] > 0, -1/3 <= \[Rho] <= 1/3, \[Sigma] > 0},
   Integrate[\[Epsilon] (h/
       Assuming[{\[Lambda] > 0, Abs[\[Rho]] <= 1/3, \[Sigma] > 0},
        Integrate[h, {\[Epsilon], -\[Infinity], \[Infinity]}]]),
    {\[Epsilon], -\[Infinity], \[Infinity]}]];
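Equivalently (a cross-check I added; the name condpdf is mine, used only for illustration): the inner integral is just the exponential marginal density \[Lambda] E^(-s \[Lambda]), so the conditional density of \[Epsilon] given s can be written down directly, and integrating against it should reproduce the same closed form given just below:

condpdf = PDF[NormalDistribution[\[Mu], \[Sigma]], \[Epsilon]] (1 + 3 \[Rho] (1 - 2 Fe) (1 - 2 G));  (* conditional density of \[Epsilon] given s *)
Assuming[{\[Lambda] > 0, Abs[\[Rho]] <= 1/3, \[Sigma] > 0, s > 0},
 FullSimplify[Integrate[\[Epsilon] condpdf, {\[Epsilon], -\[Infinity], \[Infinity]}]]]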
My answer is:
condmean\[Epsilon] = \[Mu] + (3 E^(-s \[Lambda]) (-2 + E^(s \[Lambda])) \[Rho] \[Sigma])/Sqrt[\[Pi]];
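As a further consistency check (again my addition), averaging this conditional mean over s ~ ExponentialDistribution[\[Lambda]] should recover the marginal mean \[Mu], since the FGM construction leaves the marginals untouched:

Assuming[{\[Lambda] > 0},
 Expectation[\[Mu] + (3 E^(-s \[Lambda]) (-2 + E^(s \[Lambda])) \[Rho] \[Sigma])/Sqrt[\[Pi]],
  s \[Distributed] ExponentialDistribution[\[Lambda]]]]
(* returns \[Mu] *)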
Now I define the following functions to generate random variates:
FGM[\[Mu]_, \[Sigma]_, \[Rho]_, \[Lambda]_] :=
 Assuming[{\[Lambda] > 0, Abs[\[Rho]] <= 1/3, \[Sigma] > 0},
  CopulaDistribution[{"FGM", \[Rho]}, {NormalDistribution[\[Mu], \[Sigma]],
    ExponentialDistribution[\[Lambda]]}]];
Rv2[\[Mu]_, \[Sigma]_, \[Rho]_, \[Lambda]_, Nf_] := Module[{x},
  x = RandomVariate[FGM[\[Mu], \[Sigma], \[Rho], \[Lambda]], Nf];
  Table[{x[[i, 1]], x[[i, 2]]}, {i, 1, Nf}]]
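A quick smoke test (parameter values here are arbitrary, just to check the output shape):

Rv2[0, 1, 0.2, 1, 3]
(* a list of three {\[Epsilon], s} pairs *)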
I fixed the parameters as follows:
\[Mu] = 0;
\[Sigma] = .0236;
\[Rho] = .2;
\[Lambda] = .006;
Nf = 10000;
I simulate data 1000 times and take the mean of the sample means. The simulated values and my theoretical mean are not close to each other; they are three- to four-fold apart.
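Roughly, the repeated simulation is along these lines (a sketch only; my actual summary code may differ in the details):

reps = Table[Mean[Rv2[\[Mu], \[Sigma], \[Rho], \[Lambda], Nf]], {1000}];  (* 1000 pairs of sample means *)
Mean[reps]  (* componentwise mean of {\[Epsilon], s} over the replications *)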
Finally, I write my own function to generate random numbers, based on Devroye's (1986) free book "Non-Uniform Random Variate Generation", pages 580-581, as follows:
fgm[\[Mu]_, \[Sigma]_, \[Rho]_, \[Lambda]_, nf_] :=
 Module[{d, U, V, lt, gt, Fx, Gy, vFx, pos, min, max, e, s, fgmout},
d = 3*\[Rho];
  Fx = RandomReal[{0, 1}, nf] /. .5 -> .49999;  (* nudge an exact 0.5 away to avoid a zero denominator in vFx *)
U = RandomReal[{0, 1}, nf];
V = RandomReal[{0, 1}, nf];
vFx = -(V)/(Abs[d]*(2*Fx - 1));
  lt = # < .5 & /@ Fx /. {True -> 1, False -> 0};  (* indicator of Fx < 0.5 *)
  pos[x_] := (# > 0 & /@ x /. {True -> 1, False -> 0})*x;  (* elementwise positive part *)
  min[a_, b_] := b - pos[b - a];  (* elementwise Min *)
  max[a_, b_] := a + pos[b - a];  (* elementwise Max *)
Gy = lt*min[U, vFx] + (1 - lt)*max[U, 1 + vFx];
Gy = If[\[Rho] < 0, (1 - Gy), Gy];
(* Fx and Gy are correlated uniform random variates *)
(* desired distribution is done through inverse lookup *)
e = InverseCDF[NormalDistribution[\[Mu], \[Sigma]], Fx]; (*
normal variate *)
gt = # >= 1 & /@ Gy /. {True -> 1, False -> 0};
s = -Log[1 - gt*.999999 - (1 - gt)*Gy]/\[Lambda];
(* Exponential Variate *)
fgmout = Transpose[{e, s}];
fgmout
]
Again, I use this function to generate data and find the mean in the same way as above. This time the simulated results and my theoretical mean are quite close.
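For concreteness, the kind of side-by-side comparison I mean is roughly (a sketch, using the same parameters as above; the correlation line is just an extra diagnostic I might add):

d1 = Rv2[\[Mu], \[Sigma], \[Rho], \[Lambda], Nf];  (* Mathematica's FGM copula *)
d2 = fgm[\[Mu], \[Sigma], \[Rho], \[Lambda], Nf];  (* my Devroye-based generator *)
{Mean[d1], Mean[d2]}  (* componentwise means of {\[Epsilon], s} *)
{Correlation[d1[[All, 1]], d1[[All, 2]]], Correlation[d2[[All, 1]], d2[[All, 2]]]}  (* sample correlations *)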
Finally, my question: is random number generation based on Mathematica's Farlie-Gumbel-Morgenstern copula correct, or am I doing something stupid?
Any help is greatly appreciated.