
Thread: Conditional expectations

  #1 Conditional expectations
    New Member
    Join Date: Feb 2008 · Posts: 4
    If X and Y are two independent random variables that take real values, is it true that
    E(X) <= E(X|X>=Y) ?
    Independence must be crucial, because if X and Y are not independent it is easy to build examples in which the statement is false.
    Can anyone help me?
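    Not a proof, but here is a quick Monte Carlo sanity check of the claim; the exponential distributions are just an arbitrary illustrative choice:

    ```python
    import random

    # Sanity check (not a proof): draw independent X and Y -- the exponential
    # distributions here are an arbitrary choice -- and compare the overall
    # mean of X with its mean restricted to the event {X >= Y}.
    random.seed(0)
    n = 200_000
    xs = [random.expovariate(1.0) for _ in range(n)]   # X ~ Exp(1), so E(X) = 1
    ys = [random.expovariate(0.5) for _ in range(n)]   # Y ~ Exp(1/2), independent of X

    mean_x = sum(xs) / n
    restricted = [x for x, y in zip(xs, ys) if x >= y]
    mean_x_given = sum(restricted) / len(restricted)   # estimates E(X | X >= Y)

    print(mean_x, mean_x_given)   # empirically, mean_x <= mean_x_given
    ```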



  #2
    Demen Tolden (Forum Bachelors Degree)
    Join Date: Sep 2007 · Location: St. Paul, Minnesota, USA · Posts: 475
    Is your notation correct? You didn't make a typo, did you?


    The most important thing I have learned about the internet is that it needs a lot more kindness and patience.

  #3
    New Member
    I think I didn't. The question is:
    If X and Y are two independent random variables that take real values, is it true that:
    E(X) is less than or equal to E(X given that X is greater than or equal to Y)?
    With several distributions (e.g. Fréchet, Weibull, Pareto, uniform) it is very easy to check that it is true (just by doing the computation). I am wondering whether it is also true in general, given independence of X and Y. (If they are not independent, in fact, I am sure that the statement is false.)
    [E is the expectations operator]

  #4
    Demen Tolden
    bit4bit or serpico may have to take this question. I don't know what an expectations operator is, and my web searches have revealed nothing.

  #5
    bit4bit (Forum Masters Degree)
    Join Date: Jul 2007 · Posts: 621
    It's beyond me too. Something to do with stats/probability, I think. serpicojr probably knows....

  #6
    serpicojr (Forum Professor)
    Join Date: Jul 2007 · Location: JRZ · Posts: 1,069
    I think I found a way. My probability is shaky (partly because I'm a pure mathematician and so use real analysis terms when doing probability, although I'm going to refrain from such now), so let me know if I cross the line in any way.

    So I'm starting with the assumption that:

    (1) E(X|X≥Y) = E(X*χ<sub>{X≥Y}</sub>)/P(X≥Y)

    where χ<sub>{X≥Y}</sub> is the indicator function of the set {X≥Y}. We're trying to show:

    (2) E(X|X≥Y) ≥ E(X)

    Which is the same as showing, by my assumption in (1):

    (3) E(X*χ<sub>{X≥Y}</sub>) ≥ E(X) P(X≥Y)

    We can show this if, for each real number y, we can show:

    (4) E(X*χ<sub>{X≥y}</sub>) ≥ E(X) P(X≥y)

    If this is true, then we obtain (3) by integrating (4) over y against the probability density function of Y. This fact seems pretty intuitive. There are three cases:

    i. P(X≥y) = 0. Then both sides are 0.

    ii. P(X≥y) = 1. Then both sides are equal to E(X).

    iii. P(X≥y) and P(X<y) are both nonzero. Then we certainly have:

    (5) E(X|X≥y) ≥ y ≥ E(X|X<y)

    Using an analog to (1) with Y replaced by y, we have:

    (6) E(X*χ<sub>{X≥y}</sub>)/P(X≥y) ≥ E(X*χ<sub>{X<y}</sub>)/P(X<y)

    Doing some algebra:

    (7) E(X*χ<sub>{X≥y}</sub>) P(X<y) ≥ E(X*χ<sub>{X<y}</sub>) P(X≥y)

    And adding E(X*χ<sub>{X≥y}</sub>) P(X≥y) to both sides:

    (8) E(X*χ<sub>{X≥y}</sub>)(P(X<y) + P(X≥y)) ≥ (E(X*χ<sub>{X≥y}</sub>) + E(X*χ<sub>{X<y}</sub>))P(X≥y)

    And since P(X<y) + P(X≥y) = 1 and E(X*χ<sub>{X≥y}</sub>) + E(X*χ<sub>{X<y}</sub>) = E(X), we have (4).

    Does this look good?
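    For what it's worth, inequality (4) is easy to sanity-check numerically on a made-up discrete X (the values and probabilities below are arbitrary):

    ```python
    # Check (4): E(X * 1{X >= y}) >= E(X) * P(X >= y), for a small discrete X
    # (arbitrary made-up distribution) over a grid of thresholds y.
    dist = [(-1, 0.2), (0, 0.3), (2, 0.3), (5, 0.2)]   # (value, probability)

    EX = sum(x * p for x, p in dist)
    for y in (-2.0, -0.5, 0.0, 1.0, 3.0, 6.0):
        lhs = sum(x * p for x, p in dist if x >= y)    # E(X * 1{X >= y})
        rhs = EX * sum(p for x, p in dist if x >= y)   # E(X) * P(X >= y)
        assert lhs >= rhs - 1e-12
    print("(4) holds for every tested y")
    ```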

  #7
    New Member
    Looks wrong. The result is in fact not true if X and Y are not independent, and it seems to me you never used independence there. Let me give a counterexample.
    Suppose that X takes values 3 and 5 with equal probability and Y takes values 2 and 6 with equal probability. Then E(X)=4. Now suppose that if X=3 then Y=2, and if X=5 then Y=6 (note: X and Y are not independent). There is only one case in which X≥Y, namely (3,2); therefore E(X|X≥Y)=3 < 4 = E(X), contrary to your result.
    If you tell me how to write integrals or how to upload files, I will show where I am stuck with the proof. So far, I have been able to prove the result only in two special cases: if Y is a constant (i.e. a degenerate random variable) and if E(X)>E(Y); but I think the result should always hold (provided that X and Y are independent).
    Serpico, is there anyone teaching probability theory in your department?
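    Here is the counterexample checked mechanically, if it helps:

    ```python
    # Exact check of the dependent counterexample: the coupled outcomes
    # (X, Y) are (3, 2) and (5, 6), each with probability 1/2.
    pairs = [((3, 2), 0.5), ((5, 6), 0.5)]

    EX = sum(x * p for (x, _y), p in pairs)
    num = sum(x * p for (x, y), p in pairs if x >= y)   # E(X * 1{X >= Y})
    den = sum(p for (x, y), p in pairs if x >= y)       # P(X >= Y)
    EX_given = num / den                                # E(X | X >= Y)

    print(EX, EX_given)   # 4.0 3.0 -- here E(X | X >= Y) < E(X)
    ```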

  #8
    serpicojr
    I did use independence, although not explicitly, and I apologize for this. I used it in justifying that we can derive (3) from (4). For example, consider the left hand sides. (Since I don't want to format things, I'm going to let a set denote its own indicator function.)

    E(X{X≥y})

    is an expectation against X's probability distribution. Integrating over y against Y's probability distribution, we get:

    E(E(X{X≥y}))

    The outer E is an integral over Y's distribution, the inner over X's. Independence allows me to conclude that this is equal to:

    E(X{X≥Y})

    where this is an expectation against the joint probability distribution of X and Y. Independence is equivalent to the fact that the joint distribution is the product of the individual distributions. Independence also allows me to calculate that E(P(X≥y)) = P(X≥Y).

    If this is your only complaint, I've addressed it, and I think the rest stands.

    ------------------------------

    How did you prove it in the specific cases? It seems like it shouldn't be too hard to construct a general proof from a few specific arguments--there must be a common thread amongst them. In any case, you can do some rudimentary formatting by using < sup > and < sub >, and you can find the integral sign floating around in some of our other topics (e.g., the one called "Chapter 8...").
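    And the iterated-versus-joint identity itself can be checked on a toy independent pair (the two small discrete distributions below are arbitrary examples of mine):

    ```python
    from itertools import product

    # Check E_Y( E_X(X * 1{X >= y}) ) == E(X * 1{X >= Y}) when X and Y are
    # independent; the two discrete distributions below are arbitrary.
    X = [(1, 0.5), (2, 0.3), (4, 0.2)]   # (value, probability)
    Y = [(0, 0.4), (2, 0.6)]

    # Inner expectation over X at a fixed threshold y, then outer over Y.
    lhs = sum(qy * sum(x * px for x, px in X if x >= y) for y, qy in Y)

    # Single expectation against the product (joint) distribution.
    rhs = sum(x * px * qy for (x, px), (y, qy) in product(X, Y) if x >= y)

    assert abs(lhs - rhs) < 1e-12
    print(lhs, rhs)
    ```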

  #9
    New Member
    Serpico, thanks a lot for your help, and sorry for the late reply. Your argument looks correct; indeed, it can even be simplified somewhat. I am only missing one key point:
    how do you prove that

    E(E(X{X≥y}))=E(X{X≥Y}) for X and Y independent?

    I get lost when I write out and try to solve the two integrals. Could you go through all the steps? Sorry for bothering you again, and thanks in advance.

  #10
    serpicojr
    So independence means that the joint probability distribution of X and Y is just the product of the probability distributions of X and of Y. So the expression:

    E(E(X{X≥y}))

    really means:

    ∫ ( ∫ x·χ<sub>{x≥y}</sub> dμ<sub>X</sub>(x) ) dμ<sub>Y</sub>(y)

    where μ<sub>X</sub> and μ<sub>Y</sub> are the distributions of X and Y. Now let A be the set in the plane {(x,y): x ≥ y}. Then χ<sub>{x≥y}</sub> = χ<sub>A</sub>(x,y), and the iterated integral is:

    ∫∫ x·χ<sub>A</sub>(x,y) dμ<sub>X</sub>(x) dμ<sub>Y</sub>(y)

    By independence, μ<sub>X</sub>×μ<sub>Y</sub> is the joint distribution of (X,Y), so this equals:

    ∫ x·χ<sub>A</sub>(x,y) d(μ<sub>X</sub>×μ<sub>Y</sub>)(x,y) = E(X{X≥Y})
