
2 RVs with same distribution should be same a.s ?

Posted: December 28th, 2008, 12:26 pm
by macrovue
Two random variables X and Y have the same distribution, and X <= Y almost surely. Under these conditions I have to show that X = Y almost surely. Any clue on how to start?

2 RVs with same distribution should be same a.s ?

Posted: December 28th, 2008, 12:39 pm
by gjlipman
It has been so long since I did any of this sort of thing (and I wasn't even any good at it back then!), so apologies if I'm way off the mark. I'd go for a proof by contradiction. We know that X > Y with probability zero. Assume that X < Y with probability greater than zero. Then the expectation of X would have to be less than the expectation of Y (assuming the expectations exist), so the distributions would be different - hence a contradiction. So therefore X = Y with probability 1 (almost surely).
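A quick numerical illustration of that contradiction (my own sketch, not from the thread - the variables and the 0.1-probability "bump" event are made up for the example): if X <= Y everywhere but X < Y on a set of positive probability, the means separate, so X and Y cannot share a distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)        # samples of X
bump = rng.random(n) < 0.1    # hypothetical event of probability 0.1
y = x + bump                  # Y = X + 1 on that event, Y = X elsewhere

assert np.all(x <= y)         # X <= Y holds sample-by-sample
# E[Y] - E[X] = P(bump) > 0, so the distributions must differ:
print(y.mean() - x.mean())    # positive, close to 0.1
```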

2 RVs with same distribution should be same a.s ?

Posted: December 28th, 2008, 3:35 pm
by ChaitanyaBandi
No, it's not true at all... for two RVs to be equal a.s. they MUST be defined on the same sample space... and equality in distribution doesn't guarantee that. Even if they are defined on the same sample space: take a random variable X that is symmetric about 0. X and -X have the same distribution, but there is no way they are a.s. equal...
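ChaitanyaBandi's counterexample in code (a sketch, assuming X ~ N(0,1) for concreteness): X and -X have the same symmetric distribution, yet X = -X only on the null event {X = 0}.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)  # X ~ N(0,1), symmetric about 0

# Same distribution: empirical quantiles of X and -X nearly coincide.
q = np.linspace(0.05, 0.95, 19)
print(np.max(np.abs(np.quantile(x, q) - np.quantile(-x, q))))  # small

# But X = -X only when X = 0 exactly, which never happens in the sample:
print(np.mean(x == -x))
```

Note that this example does not satisfy X <= Y, which is exactly the extra hypothesis the original question adds.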

2 RVs with same distribution should be same a.s ?

Posted: December 28th, 2008, 5:31 pm
by macrovue
Would you be more specific?

2 RVs with same distribution should be same a.s ?

Posted: December 28th, 2008, 7:07 pm
by Alan
If I were you, I would pay more attention to glipman's post than ChaitanyaBandi's.

2 RVs with same distribution should be same a.s ?

Posted: December 28th, 2008, 9:30 pm
by gjlipman
While I agree with ChaitanyaBandi that having the same distribution doesn't imply being equal almost surely, with the added statement that X<=Y almost surely, it does become so. Also, we are told that X and Y have the same distribution - so how can they not be defined on the same sample space?
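One way to make this precise without assuming finite expectations (my own sketch, not from the thread) is to compare the CDFs at rational points:

```latex
% X <= Y a.s. and X, Y equal in distribution imply X = Y a.s.
% For each rational q, X <= Y a.s. gives \{Y \le q\} \subseteq \{X \le q\}
% up to a null set, hence
\begin{align*}
P(X \le q < Y) &= P(X \le q) - P(X \le q,\, Y \le q) \\
               &= P(X \le q) - P(Y \le q)
                = F_X(q) - F_Y(q) = 0.
\end{align*}
% Since \{X < Y\} = \bigcup_{q \in \mathbb{Q}} \{X \le q < Y\} is a
% countable union of null sets, P(X < Y) = 0, so X = Y a.s.
```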

2 RVs with same distribution should be same a.s ?

Posted: December 28th, 2008, 11:08 pm
by list
Denote by K the class of random variables for which your statement is true. Then K contains the random variables taking a finite number of different values, and their linear combinations. The statement then extends to pointwise (i.e. omega-wise) limits of uniformly bounded random variables, and from there to random variables bounded with probability 1. This is a standard scheme, and I think you can find it worked out in detail in some fundamental probability monographs.