
### Basic Probability

Posted: **April 6th, 2011, 3:55 pm**

by **DoubleTrouble**

Hi! I'm dealing with a problem in which I have to calculate Cov(X,Y), where X and Y are N(0,1), which is equivalent to calculating E[XY]. X and Y are not necessarily independent. Is there any trick for how to do this?

Best regards,
DT

### Basic Probability

Posted: **April 6th, 2011, 4:07 pm**

by **eh**

> Originally posted by **DoubleTrouble**: I'm dealing with a problem in which I have to calculate Cov(X,Y), where X and Y are N(0,1), which is equivalent to calculating E[XY]. X and Y are not necessarily independent. Is there any trick for how to do this?

If you know the joint density f(x,y), then Cov(X,Y) = E[XY] = ∫∫ xy f(x,y) dx dy.

### Basic Probability

Posted: **April 6th, 2011, 4:10 pm**

by **DoubleTrouble**

Yes, of course, but I was hoping there was an easier way or some kind of clever trick, since the joint density is so messy!

### Basic Probability

Posted: **April 6th, 2011, 5:21 pm**

by **ACD**

Are they from a multivariate normal distribution, or are they just marginally normally distributed?

### Basic Probability

Posted: **April 6th, 2011, 5:33 pm**

by **DoubleTrouble**

> Originally posted by **ACD**: Are they from a multivariate normal distribution, or are they just marginally normally distributed?

They are not from a multivariate normal distribution. I have random variables X_1, ..., X_n which are all N(0,1).

### Basic Probability

Posted: **April 6th, 2011, 6:19 pm**

by **ACD**

> Originally posted by **DoubleTrouble**: They are not from a multivariate normal distribution. I have random variables X_1, ..., X_n which are all N(0,1).

OK, I can't really help then, sorry. Beyond eh's suggestion, the only thing I can think of is to do it by Monte Carlo: if you can simulate from the joint distribution (which I would guess you cannot, as that would be a fairly obvious thing to do), then you could simulate and empirically compute the sample covariance as an estimate...
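A minimal sketch of ACD's Monte Carlo idea. Since the thread never gives a concrete joint law, the one below is purely an illustrative assumption: the classic sign-flip construction, which keeps both marginals N(0,1) even though the pair is not bivariate normal; the threshold value is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative (assumed) joint law: X ~ N(0,1), and Y = X when |X| <= c,
# Y = -X otherwise. Each marginal stays N(0,1) by symmetry of the normal
# density, but (X, Y) is not bivariate normal.
c = 1.0  # hypothetical threshold, chosen arbitrarily

x = rng.standard_normal(n)
y = np.where(np.abs(x) <= c, x, -x)

# Both means are 0, so Cov(X, Y) = E[XY]; estimate it by the sample average.
cov_estimate = np.mean(x * y)
print(cov_estimate)
```

Any joint law you can sample from works the same way; the sample covariance converges to Cov(X,Y) at the usual 1/sqrt(n) Monte Carlo rate.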

### Basic Probability

Posted: **April 7th, 2011, 8:54 am**

by **emac**

What is the joint density?

### Basic Probability

Posted: **April 7th, 2011, 3:50 pm**

by **quantmeh**

> Originally posted by **DoubleTrouble**: I'm dealing with a problem in which I have to calculate Cov(X,Y), where X and Y are N(0,1), which is equivalent to calculating E[XY]. X and Y are not necessarily independent. Is there any trick for how to do this?

Do you know what a marginal density is? To get the joint distribution from the marginals, you need additional assumptions.

### Basic Probability

Posted: **April 7th, 2011, 5:36 pm**

by **DoubleTrouble**

I figured out that there must be a misprint in my instructions. I have proved that the variables must be independent, or at least uncorrelated, for the result to be possible!

But thank you very much for taking the time to answer. I will soon have another question in basic probability!

/DT

### Basic Probability

Posted: **April 7th, 2011, 6:12 pm**

by **DoubleTrouble**

Another problem that I can't solve. I need to prove that if (X,Y) is a normally distributed random vector and both X and Y have mean 0 but unknown variance (different variances for X and Y, of course), then

E[X^2 Y^2] >= E[X^2] E[Y^2],

i.e. X^2 and Y^2 are positively correlated.

This has bothered me to insane levels. I have tried everything I can think of (even going back to the definition of expected value, i.e. integrals). I think that using characteristic functions is the best way to go, but I can't get anywhere!

### Basic Probability

Posted: **April 7th, 2011, 7:48 pm**

by **bearish**

Let W and Z be two independent standard normals and write X = aW, Y = b(cW + sqrt(1-c^2) Z), where a and b are the standard deviations of X and Y, and c is the correlation between X and Y. Direct computation then gives E[X^2 Y^2] = a^2 b^2 (1 + 2c^2).
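For completeness, a sketch of the direct computation, using independence of W and Z, E[W^2] = E[Z^2] = 1, E[W^4] = 3, and E[W^3 Z] = E[W^3]E[Z] = 0:

```latex
\begin{aligned}
E[X^2 Y^2] &= a^2 b^2 \, E\!\left[ W^2 \left( cW + \sqrt{1-c^2}\, Z \right)^2 \right] \\
&= a^2 b^2 \left( c^2\, E[W^4] + 2c\sqrt{1-c^2}\; E[W^3 Z] + (1-c^2)\, E[W^2 Z^2] \right) \\
&= a^2 b^2 \left( 3c^2 + (1-c^2) \right) \\
&= a^2 b^2 \left( 1 + 2c^2 \right).
\end{aligned}
```

Since E[X^2] E[Y^2] = a^2 b^2, this gives E[X^2 Y^2] >= E[X^2] E[Y^2], with equality exactly when c = 0, which is the inequality asked about above.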

### Basic Probability

Posted: **April 7th, 2011, 8:54 pm**

by **quantmeh**

> Originally posted by **DoubleTrouble**: E[X^2 Y^2] >= E[X^2] E[Y^2]

Look up Jensen's inequality.

### Basic Probability

Posted: **April 8th, 2011, 4:32 am**

by **DoubleTrouble**

> Originally posted by **bearish**: Let W and Z be two independent standard normals and write X = aW, Y = b(cW + sqrt(1-c^2) Z), where a and b are the standard deviations of X and Y, and c is the correlation between X and Y. Direct computation then gives E[X^2 Y^2] = a^2 b^2 (1 + 2c^2).

Thank you for your reply. But if you define Y like that, will you really get variance b? And won't X and Y be independent if you do it like that?

### Basic Probability

Posted: **April 8th, 2011, 4:33 am**

by **DoubleTrouble**

> Originally posted by **quantmeh**: Look up Jensen's inequality.

I've never used it in two dimensions before. Is that okay? And besides, x^2 y^2 is not convex on R^2!

### Basic Probability

Posted: **April 8th, 2011, 12:39 pm**

by **bearish**

If you cannot work through the algebra, at least do a little experiment in Excel. X and Y will not be independent; they will have correlation c, and b is the standard deviation of Y (the square root of its variance).
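The little experiment bearish suggests is just as easy in Python as in Excel. The parameter values below are arbitrary choices for illustration; the simulation checks all three claims at once: Var(Y) = b^2, Corr(X,Y) = c, and E[X^2 Y^2] = a^2 b^2 (1 + 2c^2).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000
a, b, c = 1.5, 0.7, 0.6  # arbitrary standard deviations and correlation

# bearish's construction: X = aW, Y = b(cW + sqrt(1-c^2) Z)
w = rng.standard_normal(n)
z = rng.standard_normal(n)
x = a * w
y = b * (c * w + np.sqrt(1 - c**2) * z)

print(np.var(y))                # ≈ b**2, since Var(Y) = b^2 (c^2 + 1 - c^2)
print(np.corrcoef(x, y)[0, 1])  # ≈ c: X and Y are correlated, not independent
print(np.mean(x**2 * y**2))     # ≈ a**2 * b**2 * (1 + 2*c**2)
```

Note in particular that Y does come out with variance b^2 (so standard deviation b), and that X and Y are dependent whenever c ≠ 0, which answers the two objections raised earlier in the thread.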