Two random variables X and Y have the same distribution, and X <= Y almost surely. Under these conditions I have to show that X = Y almost surely. Any clue to start?

Last edited by macrovue on December 27th, 2008, 11:00 pm, edited 1 time in total.

It has been so long since I did any of this sort of thing (and I wasn't even any good at it back then!), so apologies if I'm way off the mark. I'd go for a proof by contradiction. We know that X > Y with probability zero. Assume that X < Y with probability greater than zero. Then the expectation of X would have to be strictly less than the expectation of Y (assuming the expectations are finite), so the distributions would be different - hence a contradiction. Therefore X = Y with probability 1 (almost surely).
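One way to close the gap in this argument when E[X] might not exist is to pass everything through a bounded, strictly increasing function first; a sketch (the choice of arctan is mine, any bounded strictly increasing function works):

```latex
\[
\mathbb{E}\bigl[\arctan Y - \arctan X\bigr]
  = \mathbb{E}[\arctan Y] - \mathbb{E}[\arctan X] = 0,
\]
since $\arctan$ is bounded (so both expectations are finite) and $X$, $Y$
have the same distribution, hence so do $\arctan X$ and $\arctan Y$.
But $X \le Y$ a.s.\ and $\arctan$ is increasing, so
$\arctan Y - \arctan X \ge 0$ a.s.; a nonnegative random variable with
zero expectation vanishes a.s. Hence $\arctan X = \arctan Y$ a.s., and
strict monotonicity of $\arctan$ gives $X = Y$ a.s.
```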

- ChaitanyaBandi

No, it's not true at all. For two RVs to be equal a.s. they MUST be defined on the same sample space, and equality in distribution doesn't guarantee that. Even if they are defined on the same sample space: take a random variable X that is symmetric about 0. Then X and -X have the same distribution, but there is no way they are a.s. the same.
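This counterexample can be made completely concrete on a tiny sample space (a sketch of my own; the coin space and variable names are illustrative, not from the thread):

```python
from collections import Counter

# A two-point sample space: a fair coin driving X = +/-1.
# X is symmetric about 0, so X and -X have the same distribution,
# yet X(omega) != -X(omega) at every single outcome.
omega_space = ["H", "T"]            # two equally likely outcomes
X = {"H": 1, "T": -1}               # X as a map omega -> value
neg_X = {w: -x for w, x in X.items()}

# The two pmfs over values are identical:
pmf_X = Counter(X.values())
pmf_neg_X = Counter(neg_X.values())
print(pmf_X == pmf_neg_X)           # True: same distribution

# Probability that X == -X: there is no outcome where they agree.
p_equal = sum(1 for w in omega_space if X[w] == neg_X[w]) / len(omega_space)
print(p_equal)                      # 0.0: nowhere equal, let alone a.s. equal
```

So equality in distribution says nothing about pointwise equality; the X <= Y hypothesis in the original question is what makes the difference.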

Would you be more specific?

If I were you, I would pay more attention to glipman's post than ChaitanyaBandi's.

While I agree with ChaitanyaBandi that having the same distribution doesn't by itself imply being equal almost surely, with the added assumption that X <= Y almost surely it does. Also, the condition X <= Y almost surely only makes sense if X and Y are defined on the same sample space, so that point is already settled by the problem statement.

Denote by K the class of random variables for which your statement is true. Then K contains random variables taking a finite number of different values, and their linear combinations. The statement then extends to pointwise (i.e. omega-wise) limits of uniformly bounded random variables, and from there to random variables that are bounded with probability 1. This is a standard scheme, and I think you can find it worked out in detail in some fundamental probability monographs.
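For what it's worth, there is also a direct elementary argument that avoids the approximation scheme entirely (my own addition, sketched under the stated hypotheses):

```latex
\[
\{X < Y\} = \bigcup_{q \in \mathbb{Q}} \{X \le q < Y\},
\qquad
P(X \le q < Y) = P(X \le q) - P(X \le q,\, Y \le q)
              = P(X \le q) - P(Y \le q) = 0.
\]
The middle step uses $\{Y \le q\} \subseteq \{X \le q\}$ up to a null set
(from $X \le Y$ a.s.), and the last step uses equality of the distribution
functions. A countable union of null sets is null, so $P(X < Y) = 0$;
combined with $P(X > Y) = 0$ this gives $X = Y$ a.s.
```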
