ECE342 Course Notes

September 15, 2017 | Uploaded by Jordan Chipchura

ECE342- Probability for Electrical & Computer Engineers C. Tellambura and M. Ardakani

Winter 2013 Copyright ©2013 C. Tellambura and M. Ardakani. All rights reserved.

Contents

1 Basics of Probability Theory
  1.1 Set theory
    1.1.1 Basic Set Operations
    1.1.2 Algebra of Sets
  1.2 Applying Set Theory to Probability
  1.3 Probability Axioms
  1.4 Some Consequences of Probability Axioms
  1.5 Conditional probability
  1.6 Independence
  1.7 Sequential experiments and tree diagrams
  1.8 Counting Methods
  1.9 Reliability Problems
  1.10 Illustrated Problems
  1.11 Solutions for the Illustrated Problems
  1.12 Drill Problems

2 Discrete Random Variables
  2.1 Definitions
  2.2 Probability Mass Function
  2.3 Cumulative Distribution Function (CDF)
  2.4 Families of Discrete RVs
  2.5 Averages
  2.6 Function of a Random Variable
  2.7 Expected Value of a Function of a Random Variable
  2.8 Variance and Standard Deviation
  2.9 Conditional Probability Mass Function
  2.10 Basics of Information Theory
  2.11 Illustrated Problems
  2.12 Solutions for the Illustrated Problems
  2.13 Drill Problems

3 Continuous Random Variables
  3.1 Cumulative Distribution Function
  3.2 Probability Density Function
  3.3 Expected Values
  3.4 Families of Continuous Random Variables
  3.5 Gaussian Random Variables
  3.6 Functions of Random Variables
  3.7 Conditioning a Continuous RV
  3.8 Illustrated Problems
  3.9 Solutions for the Illustrated Problems
  3.10 Drill Problems

4 Pairs of Random Variables
  4.1 Joint Probability Mass Function
  4.2 Marginal PMFs
  4.3 Joint Probability Density Function
  4.4 Marginal PDFs
  4.5 Functions of Two Random Variables
  4.6 Expected Values
  4.7 Conditioning by an Event
  4.8 Conditioning by an RV
  4.9 Independent Random Variables
  4.10 Bivariate Gaussian Random Variables
  4.11 Illustrated Problems
  4.12 Solutions for the Illustrated Problems
  4.13 Drill Problems

5 Sums of Random Variables
  5.1 Summary
    5.1.1 PDF of sum of two RV's
    5.1.2 Expected values of sums
    5.1.3 Moment Generating Function (MGF)
  5.2 Illustrated Problems
  5.3 Solutions for the Illustrated Problems
  5.4 Drill Problems

A 2009 Quizzes
  A.1 Quiz Number 1
  A.2 Quiz Number 2
  A.3 Quiz Number 3
  A.4 Quiz Number 4
  A.5 Quiz Number 5
  A.6 Quiz Number 6
  A.7 Quiz Number 7
  A.8 Quiz Number 8

B 2009 Quizzes: Solutions
  B.1 Quiz Number 1
  B.2 Quiz Number 2
  B.3 Quiz Number 3
  B.4 Quiz Number 4
  B.5 Quiz Number 5
  B.6 Quiz Number 6
  B.7 Quiz Number 7
  B.8 Quiz Number 8

C 2010 Quizzes
  C.1 Quiz Number 1
  C.2 Quiz Number 2
  C.3 Quiz Number 3
  C.4 Quiz Number 4
  C.5 Quiz Number 5
  C.6 Quiz Number 6
  C.7 Quiz Number 7

D 2010 Quizzes: Solutions
  D.1 Quiz Number 1
  D.2 Quiz Number 2
  D.3 Quiz Number 3
  D.4 Quiz Number 4
  D.5 Quiz Number 5
  D.6 Quiz Number 6
  D.7 Quiz Number 7

E 2011 Quizzes
  E.1 Quiz Number 1
  E.2 Quiz Number 2
  E.3 Quiz Number 3
  E.4 Quiz Number 4
  E.5 Quiz Number 5
  E.6 Quiz Number 6

F 2011 Quizzes: Solutions
  F.1 Quiz Number 1
  F.2 Quiz Number 2
  F.3 Quiz Number 3
  F.4 Quiz Number 5
  F.5 Quiz Number 6

Chapter 1
Basics of Probability Theory

Goals of ECE342:
• Introduce the basics of probability theory.
• Apply probability theory to solve engineering problems.
• Develop intuition into how the theory applies to practical situations.

1.1 Set theory

A set can be described by the tabular method (listing its elements, e.g. A = {1, 2, 3}) or the description method (stating a defining property, e.g. A = {x | x is an integer and 0 < x < 4}). Two special sets: (1) the universal set S and (2) the null set ϕ.

1.1.1 Basic Set Operations

|A|: cardinality of A.
A ∪ B = {x | x ∈ A or x ∈ B}: union - either A or B occurs, or both occur.
A ∩ B = {x | x ∈ A and x ∈ B}: intersection - both A and B occur.
A − B = {x | x ∈ A and x ∉ B}: set difference.
A^c = {x | x ∈ S and x ∉ A}: complement of A.
∪_{k=1}^n A_k = A_1 ∪ A_2 ∪ ... ∪ A_n: union of n ≥ 2 events - one or more of the A_k's occur.
∩_{k=1}^n A_k = A_1 ∩ A_2 ∩ ... ∩ A_n: intersection of n ≥ 2 events - all A_k's occur simultaneously.

Definition 1.1: A and B are disjoint if A ∩ B = ϕ.
Definition 1.2: A collection of events A_1, A_2, ..., A_n (n ≥ 2) is mutually exclusive if all pairs A_i and A_j (i ≠ j) are disjoint.


1.1.2 Algebra of Sets

1. Union and intersection are commutative.
2. Union and intersection are distributive.
3. De Morgan's law: (A ∪ B)^c = A^c ∩ B^c (equivalently, (A ∩ B)^c = A^c ∪ B^c).
4. Duality principle.
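These identities can be checked directly with Python's built-in set type; the sketch below uses an arbitrary small universal set chosen for illustration:

```python
# Basic set operations and both forms of De Morgan's law on a small example.
S = set(range(1, 11))   # universal set (arbitrary example)
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

union = A | B           # A ∪ B
inter = A & B           # A ∩ B
diff = A - B            # A − B
Ac = S - A              # complement of A with respect to S

# De Morgan's law: (A ∪ B)^c = A^c ∩ B^c
assert S - (A | B) == (S - A) & (S - B)
# Dual form: (A ∩ B)^c = A^c ∪ B^c
assert S - (A & B) == (S - A) | (S - B)
```

The same four operators (`|`, `&`, `-`, and complement via difference from `S`) cover every operation listed in Section 1.1.1.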

1.2 Applying Set Theory to Probability

Definition 1.3: An experiment consists of a procedure and observations.
Definition 1.4: An outcome is any possible observation of an experiment.
Definition 1.5: The sample space S of an experiment is the finest-grain, mutually exclusive, collectively exhaustive set of all possible outcomes.
Definition 1.6: An event is a set of outcomes of an experiment.
Definition 1.7: A set of mutually exclusive sets (events) whose union equals the sample space is an event space of S. Mathematically, B_i ∩ B_j = ϕ for all i ≠ j and B_1 ∪ B_2 ∪ ... ∪ B_n = S.

Theorem 1.1: For an event space B = {B_1, B_2, ..., B_n} and any event A ⊂ S, let C_i = A ∩ B_i, i = 1, 2, ..., n. For i ≠ j, the events C_i and C_j are mutually exclusive, i.e., C_i ∩ C_j = ϕ, and A = ∪_{i=1}^n C_i.

1.3 Probability Axioms

Definition 1.8 (Axioms of Probability): A probability measure P[·] is a function that maps events in S to real numbers such that:
Axiom 1. For any event A, P[A] ≥ 0.
Axiom 2. P[S] = 1.
Axiom 3. For any countable collection A_1, A_2, ... of mutually exclusive events,
P[A_1 ∪ A_2 ∪ ...] = P[A_1] + P[A_2] + ...

Theorem 1.2: If A = A_1 ∪ A_2 ∪ ... ∪ A_m and A_i ∩ A_j = ϕ for i ≠ j, then P[A] = Σ_{i=1}^m P[A_i].

Theorem 1.3: The probability of an event B = {s_1, s_2, ..., s_m} is the sum of the probabilities of the outcomes in the event, i.e., P[B] = Σ_{i=1}^m P[{s_i}].

1.4 Some Consequences of Probability Axioms

Theorem 1.4: The probability measure P[·] satisfies
1. P[ϕ] = 0.
2. P[A^c] = 1 − P[A].
3. For any A and B (not necessarily disjoint), P[A ∪ B] = P[A] + P[B] − P[A ∩ B].
4. If A ⊂ B, then P[A] ≤ P[B].

Theorem 1.5: For any event A and event space B = {B_1, B_2, ..., B_m},
P[A] = Σ_{i=1}^m P[A ∩ B_i].

1.5 Conditional probability

The probability in Section 1.3 is also called a priori probability. If an event has happened, this information can be used to update the a priori probability.

Definition 1.9: The conditional probability of event A given B is

P[A|B] = P[A ∩ B] / P[B].

To calculate P[A|B], find P[A ∩ B] and P[B] first.

Theorem 1.6 (Law of total probability): For an event space {B_1, B_2, ..., B_m} with P[B_i] > 0 for all i,

P[A] = Σ_{i=1}^m P[A|B_i] P[B_i].

Theorem 1.7 (Bayes' Theorem):

P[B|A] = P[A|B] P[B] / P[A].

Theorem 1.8 (Bayes' Theorem - Expanded Version): For an event space {B_1, B_2, ..., B_m},

P[B_i|A] = P[A|B_i] P[B_i] / Σ_{j=1}^m P[A|B_j] P[B_j].
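Theorems 1.6 and 1.8 combine naturally in computation: the denominator of the expanded Bayes formula is exactly the total probability P[A]. A short numerical sketch (the two-event space and all numbers below are illustrative, not from the notes):

```python
# Law of total probability and expanded Bayes' theorem on a made-up example.
P_B = {1: 0.7, 2: 0.3}            # event space {B1, B2}; probabilities sum to 1
P_A_given_B = {1: 0.1, 2: 0.4}    # conditional probabilities P[A|Bi]

# Theorem 1.6: P[A] = Σ_i P[A|Bi] P[Bi]
P_A = sum(P_A_given_B[i] * P_B[i] for i in P_B)     # 0.07 + 0.12 = 0.19

# Theorem 1.8: P[B2|A] = P[A|B2] P[B2] / Σ_j P[A|Bj] P[Bj]
P_B2_given_A = P_A_given_B[2] * P_B[2] / P_A        # 0.12 / 0.19
```

Note how observing A raises the probability of B2 from 0.3 to about 0.63, because A is much more likely under B2 than under B1.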

1.6 Independence

Definition 1.10: Events A and B are independent if and only if P[A ∩ B] = P[A]P[B].

Relationship with conditional probability: P[A|B] = P[A] and P[B|A] = P[B] when A and B are independent.

Definition 1.11: Events A, B and C are independent if and only if
P[A ∩ B] = P[A]P[B],
P[B ∩ C] = P[B]P[C],
P[A ∩ C] = P[A]P[C],
P[A ∩ B ∩ C] = P[A]P[B]P[C].

1.7 Sequential experiments and tree diagrams

Many experiments consist of a sequence of trials (subexperiments) and can be visualized as multiple-stage experiments, which are conveniently represented by tree diagrams. The law of total probability is used with tree diagrams to compute event probabilities for such experiments.

1.8 Counting Methods

Definition 1.12: If task A can be done in n ways and task B in k ways, then A and B together can be done in nk ways.
Definition 1.13: If task A can be done in n ways and task B in k ways, then either A or B can be done in n + k ways.

Here are some important cases:
• The number of ways to choose k objects out of n distinguishable objects (with replacement and with ordering) is n^k.

• The number of ways to choose k objects out of n distinguishable objects (without replacement and with ordering) is n(n − 1) ... (n − k + 1).
• The number of ways to choose k objects out of n distinguishable objects (without replacement and without ordering) is (n choose k) = n! / (k!(n − k)!).
• The number of permutations of n objects out of which n_1 are alike, n_2 are alike, ..., n_R are alike is n! / (n_1! n_2! ... n_R!).
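All four counting formulas are available in (or easily built from) Python's standard library; as a quick sketch, the last formula is applied to the word "toronto" from Illustrated Problem 18:

```python
from math import comb, perm, factorial, prod
from collections import Counter

n, k = 10, 3
n_with_repl = n ** k        # with replacement, ordered: 10^3 = 1000
n_ordered = perm(n, k)      # without replacement, ordered: 10*9*8 = 720
n_unordered = comb(n, k)    # without replacement, unordered: 120

# Permutations of a word with repeated letters ("toronto": t×2, o×3, r, n):
word = "toronto"
arrangements = factorial(len(word)) // prod(
    factorial(c) for c in Counter(word).values())   # 7!/(2! 3! 1! 1!) = 420
```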

1.9 Reliability Problems

For n independent systems in series:

P[W] = Π_{i=1}^n P[W_i].

For n independent systems in parallel:

P[W] = 1 − Π_{i=1}^n (1 − P[W_i]).
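As a sketch (not part of the original notes), the two formulas can be wrapped in small helper functions and applied to the repeater network of Illustrated Problem 22, where two parallel paths each contain two repeaters in series:

```python
from math import prod

def series(ps):
    """Probability a series system works: all independent parts must work."""
    return prod(ps)

def parallel(ps):
    """Probability a parallel system works: at least one part must work."""
    return 1 - prod(1 - p for p in ps)

# Problem 22: repeaters on Path 1 work with prob 0.95, on Path 2 with 0.92.
path1 = series([0.95, 0.95])          # 0.9025
path2 = series([0.92, 0.92])          # 0.8464
p_works = parallel([path1, path2])
p_fails = 1 - p_works                 # probability the signal does not arrive
```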

1.10 Illustrated Problems

1. True or False. Explain your answer in one line.
   a) If A = {x² | 0 < x < 2, x ∈ R} and B = {2x | 0 < x < 2, x ∈ R}, then A = B.
   b) If A ⊂ B then A ∪ B = A.
   c) If A ⊂ B and B ⊂ C then A ⊂ C.
   d) For any A, B and C, A ∩ B ⊂ A ∪ C.
   e) There exists a set A for which (A ∩ ∅^c)^c ∩ S = A (S is the universal set).
   f) For a sample space S and two events A and C, define B_1 = A ∩ C, B_2 = A^c ∩ C, B_3 = A ∩ C^c and B_4 = A^c ∩ C^c. Then {B_1, B_2, B_3, B_4} is an event space.

2. Using the algebra of sets, prove
   a) A ∩ (B − C) = (A ∩ B) − (A ∩ C),
   b) A − (A ∩ B) = A − B.

3. Sketch A − B for a) A ⊂ B, b) B ⊂ A, c) A and B disjoint.

4. Consider the following subsets of S = {1, 2, 3, 4, 5, 6}: R_1 = {1, 2, 5}, R_2 = {3, 4, 5, 6}, R_3 = {2, 4, 6}, R_4 = {1, 3, 6}, R_5 = {1, 3, 5}. Find:
   a) R_1 ∪ R_2, b) R_4 ∩ R_5, c) R_5^c, d) (R_1 ∪ R_2) ∩ R_3, e) R_1^c ∪ (R_4 ∩ R_5), f) (R_1 ∩ (R_2 ∪ R_3))^c, g) ((R_1 ∪ R_2^c) ∩ (R_4 ∪ R_5^c))^c.
   h) Write down a suitable event space.

5. Express the following sets in R as a single interval:
   a) ((−∞, 1) ∪ (4, ∞))^c, b) [0, 1] ∩ [0.5, 2], c) [−1, 0] ∪ [0, 1].

6. By drawing a suitable Venn diagram, convince yourself of the following:
   a) A ∩ (A ∪ B) = A, b) A ∪ (A ∩ B) = A.

7. Three telephone lines are monitored. At a given time, each telephone line can be in one of three modes: (1) Voice Mode, i.e., the line is busy and someone is speaking; (2) Data Mode, i.e., the line is busy with a modem or fax signal; and (3) Inactive Mode, i.e., the line is not busy. We denote these three modes by V, D and I respectively. For example, if the first and second lines are in Data Mode and the third line is in Inactive Mode, the observation is DDI.
   a) Write the elements of the event A = {at least two Voice Modes}.
   b) Write the elements of B = {number of Data Modes > 1 + number of Voice Modes}.

8. The data packets that arrive at an Internet switch are buffered to be processed. When the buffer is full, the arriving packet is dropped and the transmission must be repeated. To study this system, at the arrival time of any new packet we observe the number of packets already stored in the buffer. Since the switch can buffer a maximum of 5 packets, the used buffer at any given time is 0, 1, 2, 3, 4 or 5 packets. Thus the sample space for this experiment is S = {0, 1, 2, 3, 4, 5}. This experiment is repeated 500 times and the following data is recorded.

   Used buffer | Number of times observed
        0      |   112
        1      |   119
        2      |   131
        3      |    85
        4      |    43
        5      |    10

   The relative frequency of an event A is defined as n_A/n, where n_A is the number of times A occurs and n is the total number of observations.
   a) Consider the following three mutually exclusive events: A = {0, 1, 2}, B = {3, 4}, C = {5}. Find the relative frequency of these events.
   b) Show that the relative frequency of A ∪ B ∪ C is equal to the sum of the relative frequencies of A, B and C.

9. Consider an elevator in a building with four floors, 1-4, with 1 being the ground floor. Three people enter the elevator on floor 1 and push buttons for their destination floors. Let the outcomes be the possible stopping patterns for all passengers to leave the elevator on the way up. For example, 2-2-4 means the elevator stops on floors 2 and 4. Therefore, 2-2-4 is an outcome in S.
   a) List the sample space, S, with its elements (outcomes).
   b) Consider all outcomes equally likely. What is the probability of each outcome?
   c) Let E = {stops only on even floors} and T = {stops only twice}. Find P[E] and P[T].
   d) Find P[E ∩ T].
   e) Find P[E ∪ T].
   f) Is P[E ∪ T] = P[E] + P[T]? Does this contradict the third axiom of probability?

10. This problem requires the use of event spaces. Consider a random experiment and four events A, B, C, and D such that A and B form an event space, and C and D also form an event space. Furthermore, P[A ∩ C] = 0.3 and P[B ∩ D] = 0.25.
   a) Find P[A ∪ C].
   b) If P[D] = 0.58, find P[A].

11. Prove the following inequalities:

   a) P[A ∪ B] ≤ P[A] + P[B].
   b) P[A ∩ B] ≥ P[A] + P[B] − 1.

12. This problem requires the law of total probability and conditional probability. A study of the relation between family size and the number of cars reveals the following probabilities.

   Family size              |  0 cars | 1 car | 2 cars | More than 2
   S: Small (2 or fewer)    |  0.04   | 0.14  |  0.02  |    0.00
   M: Medium (3, 4 or 5)    |  0.02   | 0.33  |  0.23  |    0.02
   L: Large (more than 5)   |  0.01   | 0.03  |  0.13  |    0.03

   Answer the following questions:
   a) What is the probability of a random family having fewer than 2 cars?
   b) Given that a family has more than 2 cars, what is the probability that this family is large?
   c) Given that a family has fewer than 2 cars, what is the probability that this family is large?
   d) Given that the family size is not medium, what is the probability of having one car?

13. A communication channel model is shown in Fig. 1.1. The input is either 0 or 1, and the output is 0, 1 or X, where X represents a bit that is lost and does not arrive at the channel output. Also, due to noise and other imperfections, the channel may transmit a bit in error. When Input = 0, the correct output (Output = 0) occurs with probability 0.8, the incorrect output (Output = 1) with probability 0.1, and the bit is lost (Output = X) with probability 0.1. When Input = 1, the correct output (Output = 1) occurs with probability 0.7, the wrong output (Output = 0) with probability 0.2, and the bit is lost (Output = X) with probability 0.1. Assume that the inputs 0 and 1 are equally likely (i.e., P[0] = P[1]).
   a) If Output = 1, what is the probability that Input = 1?
   b) If the output is X, what is the probability that Input = 1, and what is the probability that Input = 0?
   c) Repeat part a), but this time assume that the inputs are not equally likely and P[0] = 3P[1].

   [Figure 1.1: Communication Channel]

14. This problem requires Bayes' theorem. Considering all the other evidence, Sherlock was 60% certain that Jack is the criminal. This morning, he found another piece of evidence proving that the criminal is left-handed. Dr. Watson just called and informed Sherlock that on average 20% of people are left-handed and that Jack is indeed left-handed. How certain of Jack's guilt should Sherlock be after receiving this call?

15. This problem requires Bayes' theorem. Two urns A and B each have 10 balls. Urn A has 3 green, 2 red and 5 white balls, and Urn B has 1 green, 6 red and 3 white balls. One urn is chosen at random (equally likely) and one ball is drawn from it (balls are also chosen equally likely).
   a) What is the probability that this ball is red?
   b) Given that the drawn ball is red, what is the probability that Urn A was selected?
   c) Suppose the drawn ball is green. Now we return this green ball to the other urn and draw a ball from it (from the urn that received the green ball). What is the probability that this ball is red?

16. Two urns are present: A with 1 blue and 6 red balls, and B with 6 blue and 1 red ball. Flip a coin. If the outcome is H, put one random ball from A into B; if the outcome is T, put one random ball from B into A. Now draw a ball from A. If it is blue, you win. If not, draw a ball from B; if it is blue you win, and if it is red you lose. What is the probability of winning this game?

17. Two coins are in an urn. One is fair with P[H] = P[T] = 0.5, and one is biased with P[H] = 0.25 and P[T] = 0.75. One coin is chosen at random (equally likely) and is tossed three times.
   a) Given that the biased coin is selected, what is the probability of TTT?
   b) Given that the biased coin is selected and that the outcome of the first three tosses is TTT, what is the probability that the next toss is T?

   c) This time, assume that we do not know which coin is selected. We observe that the first three outcomes are TTT. What is the probability that the next outcome is T?
   d) Define two events: E1, the outcomes of the first three tosses are TTT; and E2, the fourth toss is T. Are E1 and E2 independent?
   e) Given that the biased coin is selected, are E1 and E2 independent?

18. Answer the following questions about rearranging the letters of the word "toronto".
   a) How many different orders are there?
   b) In how many of them does 'r' appear before 'n'?
   c) In how many of them is the middle letter a consonant?
   d) How many do not have any pair of consecutive 'o's?

19. Consider a class of 14 girls and 16 boys, where two of the girls are sisters. A team of 8 players is selected from this class at random.
   a) What is the probability that the team consists of 4 girls and 4 boys?
   b) What is the probability that the team is uni-gender (all boys or all girls)?
   c) What is the probability that the number of girls is greater than the number of boys?
   d) What is the probability that both sisters are in the team?

20. In the network of Fig. 1.2, a data packet is sent from A to B. In each step, the packet can be sent one block either to the right or up. Thus, a total of 9 steps are required to reach B.
   a) How many paths are there from A to B?
   b) If one of these paths is chosen at random (equally likely), what is the probability that it passes through C?

   [Figure 1.2: Grid network for question 20, with corner nodes A and B and intermediate node C]

21. A binary communication system transmits a signal X that is either a +2 voltage signal or a −2 voltage signal. These voltage signals are equally likely. A malicious channel reduces the magnitude of the received signal by the number of heads it counts in two tosses of a coin. Let Y be the resulting signal.
   a) Describe the sample space in terms of input-output pairs.
   b) Find the set of outcomes corresponding to the event "transmitted signal was definitely +2".
   c) Describe in words the event corresponding to the outcome Y = 0.
   d) Use a tree diagram to find the set of possible input-output pairs.
   e) Find the probabilities of the input-output pairs.
   f) Find the probabilities of the output values.
   g) Find the probability that the input was X = +2 given that Y = k, for all possible values of k.

22. In a communication system, the signal sent from point a to point b arrives along two paths in parallel (Fig. 1.3). Over each path the signal passes through two repeaters in series. Each repeater in Path 1 has a 0.05 probability of failing (because of an open circuit). This probability is 0.08 for each repeater on Path 2. All repeaters fail independently of each other.
   a) Find the probability that the signal will not arrive at point b.

   [Figure 1.3: Question 22 - two parallel paths from a to b, each with two repeaters R in series]

1.11 Solutions for the Illustrated Problems

1. a) True. They both contain all real numbers between 0 and 4.
   b) False. A ∪ B = B.
   c) True. ∀x ∈ A ⇒ x ∈ B ⇒ x ∈ C; therefore A ⊂ C.
   d) True. Because (A ∩ B) ⊂ A and A ⊂ A ∪ C.
   e) False. (A ∩ ϕ^c)^c = (A ∩ S)^c = A^c. There is no set A such that A = A^c.
   f) True. The B_i's are mutually exclusive and collectively exhaustive.

2. a) Starting with the left-hand side:
      A ∩ (B − C) = A ∩ (B ∩ C^c) = (A ∩ B) ∩ C^c = (A ∩ B) − C.
      For the right-hand side:
      (A ∩ B) − (A ∩ C) = (A ∩ B) ∩ (A ∩ C)^c = (A ∩ B) ∩ (A^c ∪ C^c) = (A ∩ B ∩ A^c) ∪ (A ∩ B ∩ C^c).
      Since (A ∩ B ∩ A^c) = (A ∩ A^c) ∩ B = ϕ and (A ∩ B ∩ C^c) = (A ∩ B) − C, we get
      (A ∩ B) − (A ∩ C) = ϕ ∪ ((A ∩ B) − C) = (A ∩ B) − C.
      Both sides equal (A ∩ B) − C, so the equality holds.
   b) A − (A ∩ B) = A ∩ (A ∩ B)^c = A ∩ (A^c ∪ B^c) = (A ∩ A^c) ∪ (A ∩ B^c) = ϕ ∪ (A ∩ B^c) = A ∩ B^c = A − B.

3. a) A − B is the null set (every element of A is also in B).
   b) A − B is the ring-shaped region of A lying outside B (sketch B inside A and shade the rest of A).
   c) A − B = A (sketch two disjoint sets and shade all of A).

4. a) R_1 ∪ R_2 = {1, 2, 3, 4, 5, 6}
   b) R_4 ∩ R_5 = {1, 3}
   c) R_5^c = {2, 4, 6}
   d) (R_1 ∪ R_2) ∩ R_3 = {2, 4, 6}
   e) R_1^c ∪ (R_4 ∩ R_5) = {1, 3, 4, 6}
   f) (R_1 ∩ (R_2 ∪ R_3))^c = {1, 3, 4, 6}
   g) ((R_1 ∪ R_2^c) ∩ (R_4 ∪ R_5^c))^c = {3, 4, 5, 6}
   h) One solution is {1, 2, 3} and {4, 5, 6}, which partition S into two disjoint sets.

5. a) ((−∞, 1) ∪ (4, ∞))^c = [1, 4]
   b) [0, 1] ∩ [0.5, 2] = [0.5, 1]
   c) [−1, 0] ∪ [0, 1] = [−1, 1]

6. Try drawing Venn diagrams.

7. A = {VVI, VVD, VVV, VIV, VDV, IVV, DVV}
   B = {DDD, DDI, DID, IDD}

8. a) n_A/n = (112 + 119 + 131)/500 = 0.724
      n_B/n = (85 + 43)/500 = 0.256
      n_C/n = 10/500 = 0.02
   b) n_{A∪B∪C}/n = 500/500 = 1, and n_A/n + n_B/n + n_C/n = 0.724 + 0.256 + 0.02 = 1, so the relative frequency of A ∪ B ∪ C equals the sum of the relative frequencies.

9. a) S = {2-2-2, 2-2-3, 2-2-4, 2-3-3, 2-3-4, 2-4-4, 3-3-3, 3-3-4, 3-4-4, 4-4-4}
   b) There are 10 elements in S, so the probability of each outcome is 1/10. To be mathematically rigorous, one can define 10 mutually exclusive events E_1 = {2-2-2}, E_2 = {2-2-3}, ..., E_10 = {4-4-4}. These events are also collectively exhaustive. Thus, using the second and third axioms of probability, P[E_1] + P[E_2] + ... + P[E_10] = P[S] = 1. Since the outcomes are equally likely, each has P[E_i] = 1/10.
   c) E = {2-2-2, 2-2-4, 2-4-4, 4-4-4} and T = {2-2-3, 2-2-4, 2-3-3, 2-4-4, 3-3-4, 3-4-4}. Thus P[E] = 4/10 and P[T] = 6/10.
   d, e) E ∩ T = {2-2-4, 2-4-4} and E ∪ T = {2-2-2, 2-2-3, 2-2-4, 2-3-3, 2-4-4, 3-3-4, 3-4-4, 4-4-4}. Thus P[E ∩ T] = 2/10 and P[E ∪ T] = 8/10.
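The relative-frequency calculations in the solution to Problem 8 can be reproduced with a short script:

```python
# Relative frequencies for the buffer-occupancy data of Problem 8.
counts = {0: 112, 1: 119, 2: 131, 3: 85, 4: 43, 5: 10}
n = sum(counts.values())                 # 500 observations in total

def rel_freq(event):
    """Relative frequency n_A / n of an event given as a set of outcomes."""
    return sum(counts[s] for s in event) / n

rf_A = rel_freq({0, 1, 2})               # 362/500 = 0.724
rf_B = rel_freq({3, 4})                  # 128/500 = 0.256
rf_C = rel_freq({5})                     # 10/500  = 0.02
rf_union = rel_freq({0, 1, 2, 3, 4, 5})  # 500/500 = 1.0
```

Additivity over the mutually exclusive events A, B, C mirrors Axiom 3 of Section 1.3.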

   f) It can be seen that P[E ∪ T] ≠ P[E] + P[T]. This does not contradict the third axiom, because the third axiom applies only to mutually exclusive (in this case, disjoint) events, and E and T are not disjoint.

10. Since A and B form an event space, A = B^c; similarly C = D^c.
   a) P[B ∩ D] = P[A^c ∩ C^c] = P[(A ∪ C)^c] = 1 − P[A ∪ C] ⇒ P[A ∪ C] = 1 − 0.25 = 0.75
   b) P[A ∪ C] = P[A] + P[C] − P[A ∩ C] ⇒ P[A] = P[A ∪ C] − P[C] + P[A ∩ C]. With P[C] = 1 − P[D] = 0.42, we get P[A] = 0.75 − 0.42 + 0.3 = 0.63.

11. a) Since P[A ∪ B] = P[A] + P[B] − P[A ∩ B] and P[A ∩ B] ≥ 0, it follows that P[A ∪ B] ≤ P[A] + P[B].
      Notice that from a) it easily follows that P[A ∪ B ∪ C ∪ ...] ≤ P[A] + P[B] + P[C] + ...
   b) Since P[A ∪ B] = P[A] + P[B] − P[A ∩ B] and P[A ∪ B] ≤ 1, we get P[A] + P[B] − P[A ∩ B] ≤ 1, i.e., P[A ∩ B] ≥ P[A] + P[B] − 1.

12. a) We define A to be the event that a random family has fewer than two cars, and N to be the number of cars.
      P[A ∩ S] = P[N = 0 ∩ S] + P[N = 1 ∩ S] = 0.04 + 0.14 = 0.18
      P[A ∩ M] = P[N = 0 ∩ M] + P[N = 1 ∩ M] = 0.02 + 0.33 = 0.35
      P[A ∩ L] = P[N = 0 ∩ L] + P[N = 1 ∩ L] = 0.01 + 0.03 = 0.04
      P[A] = P[A ∩ S] + P[A ∩ M] + P[A ∩ L] = 0.18 + 0.35 + 0.04 = 0.57
   b) P[L | N > 2] = P[L ∩ (N > 2)] / P[N > 2] = P[L ∩ (N > 2)] / (P[L ∩ (N > 2)] + P[M ∩ (N > 2)] + P[S ∩ (N > 2)]) = 0.03 / (0.03 + 0.02 + 0.00) = 0.6

3.9 Solutions for the Illustrated Problems

10. a) P[X > 175 | X > 165] = P[X > 175] / P[X > 165] = 0.5 / Φ(1) = 0.5 / 0.8413 = 0.5943
    b) P[165 < X < 185 | X > 165] = P[165 < X < 185] / P[X > 165] = (2Φ(1) − 1) / Φ(1) = 0.6826 / 0.8413 = 0.8114

11. a) F_X(x | |X| > 2) = P[|X| > 2, X ≤ x] / P[|X| > 2]
       = F(x) / P[|X| > 2] for x < −2,
       = F(−2) / P[|X| > 2] for −2 ≤ x ≤ 2,
       = (F(x) − P[|X| ≤ 2]) / P[|X| > 2] for x > 2.

    Here P[|X| > 2] = e^{−2} (for the PDF f_X(x) = e^{−|x|}/2 of this problem), so the conditional PDF equals
       f_X(x | |X| > 2) = e^{2−|x|}/2 for |x| > 2, and 0 for −2 ≤ x ≤ 2.
    b) E(X | |X| > 2) = ∫_{−∞}^{∞} x f_X(x | |X| > 2) dx = 0, due to the symmetry of the conditional PDF.

12. a) f_X(x) = 0.1 e^{−x/10} for x > 0. For Y = 3X + 1,
       f_Y(y) = (1/3) f_X((y − 1)/3) = e^{−(y−1)/30}/30 for y > 1, and 0 otherwise.
    b) For Z = X²,
       f_Z(z) = f_X(√z)/(2√z) = e^{−√z/10}/(20√z) for z > 0, and 0 otherwise.
    c) E[Z] = E[X²] = (1/10) ∫_0^∞ x² e^{−x/10} dx = 100 × Γ(3) = 200
       E[Z²] = E[X⁴] = (1/10) ∫_0^∞ x⁴ e^{−x/10} dx = 10000 × Γ(5) = 240000
       VAR[Z] = 240000 − 200² = 240000 − 40000 = 200000

13. Given X ∼ Uniform(0, 2):

    E[Y] = E[g(X)] = ∫ g(x) f_X(x) dx = ∫_0^{1/2} 2x (1/2) dx + ∫_{1/2}^1 (2 − 2x)(1/2) dx = 1/4
    E[Y²] = E[g²(X)] = ∫ (g(x))² f_X(x) dx = ∫_0^{1/2} (2x)² (1/2) dx + ∫_{1/2}^1 (2 − 2x)² (1/2) dx = 1/6
    VAR[Y] = E[Y²] − (E[Y])² = 1/6 − (1/4)² = 0.1042

    For the exponential case: for X ∼ Exponential(λ) we know E[X] = 1/λ. Therefore λ = 2 and f_X(x) = 2e^{−2x} for x ≥ 0.
    E[Y] = E[g(X)] = ∫_0^{1/2} 2x · 2e^{−2x} dx + ∫_{1/2}^1 (2 − 2x) · 2e^{−2x} dx = 0.3996
    E[Y²] = E[g²(X)] = ∫_0^{1/2} (2x)² · 2e^{−2x} dx + ∫_{1/2}^1 (2 − 2x)² · 2e^{−2x} dx = 0.2578
    VAR[Y] = E[Y²] − (E[Y])² = 0.2578 − 0.3996² = 0.09815
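The uniform-case moments above can be cross-checked numerically with a simple midpoint rule. This is a sketch, not part of the notes, and it assumes (consistently with the integration limits in the solution) that g vanishes on (1, 2]:

```python
# Midpoint-rule cross-check of E[Y] and VAR[Y] for the Uniform(0, 2) case.
# Assumed shape of g: 2x on [0, 1/2], 2 - 2x on (1/2, 1], and 0 on (1, 2].
def g(x):
    if x <= 0.5:
        return 2 * x
    if x <= 1:
        return 2 - 2 * x
    return 0.0

N = 200_000                     # number of midpoint cells on (0, 2)
h = 2 / N
f = 0.5                         # fX(x) = 1/2 on (0, 2)
EY = EY2 = 0.0
for i in range(N):
    x = (i + 0.5) * h
    EY += g(x) * f * h          # approximates E[Y]  = 1/4
    EY2 += g(x) ** 2 * f * h    # approximates E[Y^2] = 1/6
var_Y = EY2 - EY ** 2           # approximately 5/48 ≈ 0.1042
```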

14. P[A] = P[W > 1] = ∫_1^∞ f_W(w) dw = ∫_1^2 (w − 1) dw = 1/2
    P[B] = P[0.5 < W < 1.5] = ∫_{0.5}^{1.5} f_W(w) dw = ∫_{0.5}^1 (1 − w) dw + ∫_1^{1.5} (w − 1) dw = 1/4
    a) Conditioning on A we get
       f_{W|A}(w) = 2(w − 1) for 1 ≤ w ≤ 2, and 0 otherwise.

3.10 Drill Problems

Section 3.3 - Expected Values

4. Continuous random variable X has PDF
   f_X(x) = 1/4 for −1 ≤ x ≤ 3, and 0 otherwise.
   Define the random variable Y by Y = h(X) = X².
   a) Find E[X] and VAR[X].
   b) Find h(E[X]) and E[h(X)].
   c) Find E[Y] and VAR[Y].

   Ans a) E[X] = 1, VAR[X] = 4/3
       b) h(E[X]) = 1, E[h(X)] = 7/3
       c) E[Y] = 7/3, VAR[Y] = 304/45

5. The cumulative distribution function of random variable U is
   F_U(u) = 0 for u < −5,
          = (u + 5)/8 for −5 ≤ u < −3,
          = 1/4 for −3 ≤ u < 3,
          = 1/4 + 3(u − 3)/8 for 3 ≤ u < 5,
          = 1 for u ≥ 5.

   a) What is E[U]?
   b) What is VAR[U]?
   c) What is E[2^U]?
   Ans a) 2   b) 37/3   c) 13.001

Section 3.4 - Families of RVs 6. You take bus to the university from is uniformly distributed between 40 VAR [X], (b) the probability that it probability that it takes less than 45

your home. The time X for this trip and 55 minutes. (a) Find E[X] and takes more than 50 minutes. (c) The minutes.

Ans a) E[X] = 47.5 min. Var[X] = 18.75 min2 . b)1/3 c) 1/3 7. Let X be uniform RV on [0, 2]. Compute the mean and variance of Y = g(X) where   x 6|X > 3] = P [X > 3] = 0.3012 9. The life time in months, X, of light bulbs (identical specifications) produced by two manufacturing plants A and B are exponential with λ = 1/5 and λ = 1/2, respectively. Plant B produces four times as many bulbs as plant A. The bulbs are mixed together and sold. What is the probability that a light bulb purchased at random will last at least (a) two months; (b) five months; (c) seven months. Ans a) 0.4284

b) 0.1392

c) 0.0735
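The three answers follow from the mixture survival probability P[X > t] = 0.2·e^{−t/5} + 0.8·e^{−t/2}; a small Python check (illustrative only, the function name is ours):

```python
import math

# A random bulb comes from plant A (rate 1/5) w.p. 0.2
# and from plant B (rate 1/2) w.p. 0.8.
def p_lasts_at_least(t):
    return 0.2 * math.exp(-t / 5) + 0.8 * math.exp(-t / 2)
```

For example, `p_lasts_at_least(2)` evaluates to approximately 0.4284, matching part (a).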

10. X is an Erlang(3, 0.2) RV. Calculate the value of P[1.3 < X < 4.6]. Ans 0.0638

Section 3.5 - Gaussian RVs

11. A Gaussian random variable, X, has a mean of 10 and a variance of 12. a) Find the probability that X is less than 13. b) Find P[−1 < X < 1]. c) If Y = 2X + 3, find the mean and variance of Y. d) Find P[0 < Y ≤ 80].
Ans a) 0.8068

b) 0.00394

c) 23, 48

d) 0.9995

12. A Gaussian random variable, X, has an unknown mean but a standard deviation of 4.
a) The random variable is positive on 32% of the trials. What is the mean value?
b) This random variable is changed to another Gaussian random variable through the linear transformation Y = X/2 + 1. Find the expected value of Y.
c) Find the variance of Y.
d) Find the mean of the square of Y.
Ans a) −1.871

b) 0.0646

c) 4

d) 4.0042
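These answers can be re-derived from the standard normal quantile (an illustrative Python sketch using the standard library's `statistics.NormalDist`; variable names are ours):

```python
from statistics import NormalDist

# P[X > 0] = 0.32 with sigma = 4 pins down the mean:
# 0 = mu + 4*z where P[Z <= z] = 0.68
z = NormalDist().inv_cdf(0.68)
mu = -4 * z              # part a: ~ -1.871

e_y = mu / 2 + 1         # part b: E[X/2 + 1]
var_y = 16 / 4           # part c: VAR[X]/2^2 = 4
e_y2 = var_y + e_y ** 2  # part d: E[Y^2] = VAR[Y] + E[Y]^2
```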

Section 3.6 - Function of a RV

13. X is a Uniform(0, 4) RV. The RV Y is obtained by Y = (X − 2)². a) Derive the PDF and CDF of Y. b) Find E[Y]. c) Find VAR[Y].
Ans a)
fY(y) = 1/(4√y) for 0 < y ≤ 4, and 0 otherwise.
FY(y) = 0 for y < 0; √y / 2 for 0 ≤ y ≤ 4; 1 for y > 4.
b) E[Y] = 4/3 c) VAR[Y] = 64/45

14. The RV X is N(0, 1). Let
Y = 1 if X ≤ 0, and Y = 2 if X > 0.
a) Find the PMF and CDF of Y. b) Find E[Y]. c) Find VAR[Y].

Ans a)
PY(y) = 0.5 for y ∈ {1, 2}, and 0 otherwise.
FY(y) = 0 for y < 1; 0.5 for 1 ≤ y < 2; 1 for y ≥ 2.
b) E[Y] = 1.5 c) VAR[Y] = 0.25

15. X is a uniform RV and B is the event B = {X > 3.7}. What are fX|B(x), µX|B, and σ²X|B?
Ans

 

fX|B(x) = 10/13 for 3.7 ≤ x ≤ 5, and 0 otherwise.
µX|B = 4.35, σ²X|B = 0.1408

16. Let X be an exponential random variable with CDF
FX(x) = 1 − e^{−x/3} for x ≥ 0, and 0 for x < 0,
and let B be the event B = {X > 2}. What are fX|B(x), µX|B, and σ²X|B?
Ans
fX|B(x) = (1/3)e^{−(x−2)/3} for x > 2, and 0 otherwise.
µX|B = 4.9997, σ²X|B = 9.0007

17. X is Gaussian with a mean of 997 and a standard deviation of 31. What is the probability of B, where B = {X > 1000}? And what is the PDF of X conditioned on B?
Ans P[X > 1000] = 0.4615
fX|B(x) = fX(x)/P[X > 1000] for x > 1000, and 0 otherwise.

Chapter 4

Pairs of Random Variables

4.1 Joint Probability Mass Function

Definition 4.1: The joint PMF of two discrete RVs X and Y is PX,Y(a, b) = P[X = a, Y = b].

Theorem 4.1: The joint PMF PX,Y(x, y) has the following properties:
1. 0 ≤ PX,Y(x, y) ≤ 1 for all (x, y) ∈ SX,Y.
2. ∑_{(x,y)∈SX,Y} PX,Y(x, y) = 1.
3. For event B, P[B] = ∑_{(x,y)∈B} PX,Y(x, y).

4.2 Marginal PMFs

Theorem 4.2: For discrete RVs X and Y with joint PMF PX,Y(x, y), the marginals are
PX(x) = ∑_{y∈SY} PX,Y(x, y) and
PY(y) = ∑_{x∈SX} PX,Y(x, y).
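Theorem 4.2 amounts to summing the joint PMF along one coordinate. A small sketch (illustrative Python, with a made-up joint PMF that is not from the notes):

```python
from collections import defaultdict

# Hypothetical joint PMF, keyed by (x, y)
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

px, py = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p  # sum over y in S_Y gives P_X(x)
    py[y] += p  # sum over x in S_X gives P_Y(y)
```

Here px comes out as {0: 0.4, 1: 0.6} and py as {0: 0.3, 1: 0.7} (up to floating point).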


4.3 Joint Probability Density Function

Definition 4.2: The joint PDF of the continuous RVs (X, Y) is defined (indirectly) by
∫_{−∞}^{x} ∫_{−∞}^{y} fX,Y(u, v) du dv = FX,Y(x, y).

Theorem 4.3: fX,Y(x, y) = ∂²FX,Y(x, y)/∂x∂y

Theorem 4.4: A joint PDF fX,Y(x, y) satisfies the following two properties:
1. fX,Y(x, y) ≥ 0 for all (x, y).
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(x, y) dx dy = 1.

Theorem 4.5: The probability that the continuous random variables (X, Y) are in B is
P[B] = ∫∫_{(x,y)∈B} fX,Y(x, y) dx dy.

4.4 Marginal PDFs

Theorem 4.6: For an (X, Y) pair with joint PDF fX,Y(x, y), the marginal PDFs are
fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy and
fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx.

4.5 Functions of Two Random Variables

Theorem 4.7: For discrete RVs X and Y, the derived random variable W = g(X, Y) has PMF
PW(w) = ∑_{(x,y): g(x,y)=w} PX,Y(x, y),
i.e., a sum of all PX,Y(x, y) over the pairs (x, y) such that g(x, y) = w. We use this theorem to calculate probabilities of the event {W = w}.


Theorem 4.8: For continuous RVs X and Y, the CDF of W = g(X, Y) is
FW(w) = P[W ≤ w] = ∫∫_{g(x,y)≤w} fX,Y(x, y) dx dy.
It is useful to draw a picture in the plane to calculate the double integral.

Theorem 4.9: For continuous RVs X and Y, the CDF of W = max(X, Y) is
FW(w) = FX,Y(w, w) = ∫_{−∞}^{w} ∫_{−∞}^{w} fX,Y(x, y) dx dy.

4.6 Expected Values

Theorem 4.10: For RVs X and Y, the expected value of W = g(X, Y) is
E[W] = ∑_{x∈SX} ∑_{y∈SY} g(x, y) PX,Y(x, y) (discrete)
E[W] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fX,Y(x, y) dx dy (continuous)
Note: the expected value of W can be computed without its PMF or PDF.

Theorem 4.11: E[g₁(X, Y) + ··· + gₙ(X, Y)] = E[g₁(X, Y)] + ··· + E[gₙ(X, Y)]

Theorem 4.12: The variance of the sum of two RVs is
VAR[X + Y] = VAR[X] + VAR[Y] + 2E[(X − µX)(Y − µY)].

Definition 4.3: The covariance of two RVs X and Y is defined as
Cov[X, Y] = E[(X − µX)(Y − µY)] = E[XY] − µX µY.

Theorem 4.13:
1. VAR[X + Y] = VAR[X] + VAR[Y] + 2Cov[X, Y].
2. If X = Y, Cov[X, Y] = VAR[X] = VAR[Y].

Definition 4.4: If Cov[X, Y] = 0, RVs X and Y are said to be uncorrelated.

Definition 4.5: The correlation coefficient of RVs X and Y is ρX,Y = Cov[X, Y]/(σX σY).

Theorem 4.14: −1 ≤ ρX,Y ≤ 1.
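Definitions 4.3–4.5 translate directly into sums over a joint PMF. A worked sketch on a small hypothetical joint PMF (illustrative Python, not from the notes):

```python
import math

# Hypothetical joint PMF concentrating mass on the diagonal
pmf = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

mx = sum(x * p for (x, y), p in pmf.items())            # mu_X
my = sum(y * p for (x, y), p in pmf.items())            # mu_Y
cov = sum(x * y * p for (x, y), p in pmf.items()) - mx * my
sx = math.sqrt(sum((x - mx) ** 2 * p for (x, y), p in pmf.items()))
sy = math.sqrt(sum((y - my) ** 2 * p for (x, y), p in pmf.items()))
rho = cov / (sx * sy)
```

For this PMF, Cov[X, Y] = 0.4 − 0.25 = 0.15 and ρ = 0.15/(0.5 × 0.5) = 0.6, consistent with −1 ≤ ρ ≤ 1 (Theorem 4.14).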

4.7 Conditioning by an Event

Theorem 4.15: For event B, a region in the (X, Y) plane with P[B] > 0,
PX,Y|B(x, y) = PX,Y(x, y)/P[B] for (x, y) ∈ B, and 0 otherwise.

We use this to calculate the conditional joint PMF, conditioned on an event.

Theorem 4.16: For continuous RVs X and Y and event B with P[B] > 0, the conditional joint PDF of X and Y given B is
fX,Y|B(x, y) = fX,Y(x, y)/P[B] for (x, y) ∈ B, and 0 otherwise.

Theorem 4.17: For RVs X and Y and an event B with P[B] > 0, the conditional expected value of W = g(X, Y) given B is
E[W|B] = ∑_{x∈SX} ∑_{y∈SY} g(x, y) PX,Y|B(x, y) (discrete)
E[W|B] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fX,Y|B(x, y) dx dy (continuous)

4.8 Conditioning by an RV

Definition 4.6: For event {Y = y} with non-zero probability, the conditional PMF of X is
PX|Y(x|y) = P[X = x | Y = y] = P[X = x, Y = y]/P[Y = y] = PX,Y(x, y)/PY(y).

Theorem 4.18: For discrete RVs X and Y with joint PMF PX,Y(x, y), and x and y such that PX(x) > 0 and PY(y) > 0,
PX,Y(x, y) = PX|Y(x|y) PY(y) = PY|X(y|x) PX(x).
This allows us to derive the joint PMF from a conditional PMF and a marginal PMF.


Definition 4.7: The conditional PDF of X given {Y = y} is
fX|Y(x|y) = fX,Y(x, y)/fY(y),
where fY(y) > 0. Similarly,
fY|X(y|x) = fX,Y(x, y)/fX(x).

Theorem 4.19: X and Y are discrete RVs. For any y ∈ SY, the conditional expected value of g(X, Y) given Y = y is
E[g(X, Y)|Y = y] = ∑_{x∈SX} g(x, y) PX|Y(x|y).

Definition 4.8: For continuous RVs X and Y, and any y such that fY(y) > 0, the conditional expected value of g(X, Y) given Y = y is
E[g(X, Y)|Y = y] = ∫_{−∞}^{∞} g(x, y) fX|Y(x|y) dx.

To calculate the conditional moments, we need the conditional PMF or PDF first.

Theorem 4.20 (Iterated Expectation): E[E[X|Y]] = E[X].
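Theorem 4.20 can be verified numerically: compute E[X|Y = y] for each y through the conditional PMF of Definition 4.6, then average over PY(y). A sketch with a small hypothetical joint PMF (illustrative Python, not from the notes):

```python
pmf = {(1, 1): 0.2, (2, 1): 0.3, (1, 2): 0.4, (2, 2): 0.1}

e_x = sum(x * p for (x, y), p in pmf.items())  # direct E[X]

p_y = {}
for (x, y), p in pmf.items():
    p_y[y] = p_y.get(y, 0.0) + p               # marginal P_Y(y)

# E[X|Y=y] via P_{X|Y}(x|y) = P_{X,Y}(x,y)/P_Y(y)
e_x_given_y = {
    y: sum(x * p for (x, yy), p in pmf.items() if yy == y) / p_y[y]
    for y in p_y
}
iterated = sum(e_x_given_y[y] * p_y[y] for y in p_y)  # E[E[X|Y]]
```

Here E[X] = 1.4 and the iterated expectation agrees, as the theorem asserts.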

4.9 Independent Random Variables

Definition 4.9: RVs X and Y are independent if and only if
Discrete: PX,Y(x, y) = PX(x)PY(y)
Continuous: fX,Y(x, y) = fX(x)fY(y)
for all values of x and y.

Theorem 4.21: For independent RVs X and Y,
1. E[g(X)h(Y)] = E[g(X)]E[h(Y)]; in particular E[XY] = E[X]E[Y], and thus Cov[X, Y] = 0.
2. VAR[X + Y] = VAR[X] + VAR[Y].
3. E[X|Y = y] = E[X] and E[Y|X = x] = E[Y].


4.10 Bivariate Gaussian Random Variables

Definition 4.10: X and Y are bivariate Gaussian with parameters µ1, σ1, µ2, σ2, ρ if the joint PDF is
fX,Y(x, y) = 1/(2πσ1σ2√(1 − ρ²)) · exp{ −[ ((x − µ1)/σ1)² − 2ρ(x − µ1)(y − µ2)/(σ1σ2) + ((y − µ2)/σ2)² ] / (2(1 − ρ²)) },
where µ1, µ2 can be any real numbers, σ1 > 0, σ2 > 0 and −1 < ρ < 1.

Theorem 4.22: If X and Y are bivariate Gaussian RVs, then X is N(µ1, σ1²) and Y is N(µ2, σ2²). That is,
fX(x) = (1/(√(2π)σ1)) e^{−(x − µ1)²/(2σ1²)} and fY(y) = (1/(√(2π)σ2)) e^{−(y − µ2)²/(2σ2²)}.

Theorem 4.23: Bivariate Gaussian RVs X and Y have the correlation coefficient ρX,Y = ρ.

Theorem 4.24: Bivariate Gaussian RVs X and Y are uncorrelated if and only if they are independent; i.e., ρ = 0 implies that X and Y are independent.
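A standard way to generate a bivariate Gaussian pair is to set X = µ1 + σ1·Z1 and Y = µ2 + σ2·(ρZ1 + √(1 − ρ²)·Z2) from independent standard normals Z1, Z2; the sample correlation should then be close to ρ, consistent with Theorem 4.23. A Monte Carlo sketch (illustrative Python with arbitrarily chosen parameters, not from the notes):

```python
import math
import random

random.seed(7)
mu1, s1, mu2, s2, rho = 1.0, 2.0, -3.0, 0.5, 0.6

xs, ys = [], []
for _ in range(100000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(mu1 + s1 * z1)
    ys.append(mu2 + s2 * (rho * z1 + math.sqrt(1 - rho * rho) * z2))

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
sample_rho = cov / (sx * sy)  # should be close to rho
```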

4.11 Illustrated Problems

1. The joint PMF of two random variables is given in Table 4.1. a) Find P[X < Y]. b) Find E[Y]. c) Find E[X|Y = 2].

        X = 0   X = 1   X = 2   X = 3
Y = 1   0.03    0.10    0.02    0.02
Y = 2   0.05    0.07    0.02    0.05
Y = 3   0.02    0.20    0.02    0.05
Y = 4   0.05    0.23    0.04    0.03

Table 4.1: for Question 1

2. The joint PDF of X and Y is given as
fX,Y(x, y) = c(3x² + 2y) for 0 < x < 1, 0 < y < 1, and 0 otherwise.


a) Find the value of c to ensure a valid PDF. b) Find the PDF of X.

3. The joint PDF of X and Y is
fX,Y(x, y) = cx²y for 0 ≤ x < 1, 0 ≤ y ≤ 2, and 0 otherwise.
a) Find the value of constant c. b) Find P[Y < 1]. c) Find P[Y < X]. d) Find P[Y > X²]. e) Find the PDF of V = min{X, Y}. f) Find the PDF of U = X/Y.

4. The joint PDF of X and Y is
fX,Y(x, y) = 2.5x² for −1 ≤ x ≤ 1, 0 ≤ y ≤ x², and 0 otherwise.

a) Find the marginal PDF of X. b) Find the marginal PDF of Y. c) Find Var[X] and Var[Y]. d) Find Cov[X, Y] and ρX,Y (the correlation coefficient of X and Y).

5. Let X and Y be jointly distributed with
fX,Y(x, y) = ke^{−3x−2y} for x, y ≥ 0, and 0 otherwise.
a) Find k and the PDFs of X and Y. b) Find the means and variances of X and Y. c) Are X and Y independent? Find ρX,Y.

6. X and Y are jointly distributed with
fX,Y(x, y) = ke^{−3x−2y} for 0 ≤ y ≤ x, and 0 otherwise.
a) Find k and the marginal PDFs and CDFs of X and Y.

b) Find the means and variances of X and Y. c) Are X and Y independent? Find ρX,Y.

7. Let Z = X + Y, where X and Y are jointly distributed with

fX,Y(x, y) = c for x ≥ 0, y ≥ 0, x + y ≤ 1, and 0 otherwise.
a) Find the PDF and CDF of Z. b) Find the expected value and variance of Z.

8. Two random variables X and Y are jointly distributed with
fX,Y(x, y) = (x + y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and 0 otherwise.

Let event A be defined as A = {Y ≤ 0.5}.
a) Find P[A]. b) Find the conditional PDF fX,Y|A(x, y). c) Find the conditional PDF fX|A(x). d) Find the conditional PDF fY|A(y).

9. In Question 4, define B = {X > 0}.
a) Find fX,Y|B(x, y). b) Find VAR[X|B]. c) Find E[XY|B]. d) Find fY|X(y|x). e) Find E[Y|X = x]. f) Find E[E[Y|X]].

10. Let X and Y be jointly distributed with
fX,Y(x, y) = 2 for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
a) Find the PDF fY(y). b) Find the conditional PDF fX|Y(x|y). c) Find the conditional expected value E[X|Y = y].


11. Let X and Y be jointly distributed with
fX,Y(x, y) = (4x + 2y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
a) Find the PDFs fY(y) and fX(x). b) Find the conditional PDF fX|Y(x|y). c) Find the conditional PDF fY|X(y|x).

12. The joint PDF of two RVs is given by
fX,Y(x, y) = |xy|/8 for x² + y² ≤ 1, and 0 otherwise.
Determine if X and Y are independent.

13. X and Y are independent, with X ∼ Gaussian(5, 15) and Y ∼ Uniform(4, 6).
a) Find E[XY]. b) Find E[X²Y²].

4.12 Solutions for the Illustrated Problems

1. a) P[X < Y] = 0.03 + 0.05 + 0.07 + 0.02 + 0.20 + 0.02 + 0.05 + 0.23 + 0.04 + 0.03 = 0.74 (the entries of Table 4.1 with x < y)
b) E[Y] = 1(0.17) + 2(0.19) + 3(0.29) + 4(0.35) = 2.82
c) E[X|Y = 2] = [0(0.05) + 1(0.07) + 2(0.02) + 3(0.05)]/0.19 = 0.26/0.19 = 1.3684 (the Y = 2 row of Table 4.1)

        X = 0   X = 1   X = 2   X = 3   P[Y]
Y = 1   0.03    0.10    0.02    0.02    0.17
Y = 2   0.05    0.07    0.02    0.05    0.19
Y = 3   0.02    0.20    0.02    0.05    0.29
Y = 4   0.05    0.23    0.04    0.03    0.35

2. a) 1 = ∫∫ fX,Y(x, y) dx dy = c ∫₀¹∫₀¹ (3x² + 2y) dx dy = 2c ⇒ c = 1/2
b) fX(x) = ∫ fX,Y(x, y) dy = c ∫₀¹ (3x² + 2y) dy = (1 + 3x²)/2 for 0 < x < 1, and 0 otherwise.

3.

a) ∫∫ fX,Y(x, y) dx dy = c ∫₀¹∫₀² x²y dy dx = 2c/3 = 1 ⇒ c = 3/2
b) P[Y < 1] = ∫₀¹∫₀¹ 1.5x²y dy dx = ∫₀¹ 1.5x² dx · ∫₀¹ y dy = 1/4
c) P[Y < X] = ∫₀¹∫₀ˣ 1.5x²y dy dx = ∫₀¹ 0.75x⁴ dx = 3/20
d) P[Y > X²] = ∫₀¹∫_{x²}^{2} 1.5x²y dy dx = ∫₀¹ 0.75x²(4 − x⁴) dx = 25/28
e) Case v < 0: both X and Y are always greater than v ⇒ FV(v) = 0. Case v > 1: X at least is always smaller than v ⇒ FV(v) = 1. Otherwise (0 ≤ v ≤ 1):
FV(v) = P[V ≤ v] = P[min(X, Y) ≤ v] = 1 − P[X > v, Y > v] = 1 − ∫_v^1 ∫_v^2 (3/2)x²y dy dx = 0.25v² + v³ − 0.25v⁵
As a result: fV(v) = 0.5v + 3v² − 1.25v⁴ for 0 ≤ v ≤ 1, and 0 otherwise.
f) The range of U = X/Y is u ≥ 0, and the cases u ≤ 0.5 and u > 0.5 must be handled separately. For 0 ≤ u ≤ 0.5, FU(u) = P[X ≤ uY] = ∫₀² ∫₀^{uy} 1.5x²y dx dy = 16u³/5. For u > 0.5 the line x = uy leaves the square at y = 1/u, and FU(u) = 1 − 3/(20u²). Hence
fU(u) = (48/5)u² for 0 ≤ u ≤ 0.5; (3/10)u⁻³ for u > 0.5; and 0 otherwise.

4. a) fX(x) = ∫ fX,Y(x, y) dy = ∫₀^{x²} 2.5x² dy = 2.5x⁴ for −1 ≤ x ≤ 1, and 0 otherwise.

b) fY(y) = ∫ fX,Y(x, y) dx = ∫_{−1}^{−√y} 2.5x² dx + ∫_{√y}^{1} 2.5x² dx = (5/3)(1 − y^{3/2}) for 0 ≤ y ≤ 1, and 0 otherwise.

c) E[X] = ∫ x fX(x) dx = 2.5 ∫_{−1}^{1} x⁵ dx = 0
E[X²] = ∫ x² fX(x) dx = 2.5 ∫_{−1}^{1} x⁶ dx = 5/7
E[Y] = ∫ y fY(y) dy = (5/3) ∫₀¹ y(1 − y^{3/2}) dy = 5/14
E[Y²] = ∫ y² fY(y) dy = (5/3) ∫₀¹ y²(1 − y^{3/2}) dy = 5/27
VAR[X] = E[X²] − E²[X] = 5/7
VAR[Y] = E[Y²] − E²[Y] = 5/27 − (5/14)² = 0.05763
d) E[XY] = ∫∫ xy fX,Y(x, y) dx dy = ∫_{x=−1}^{1} ∫₀^{x²} xy(2.5x²) dy dx = 0
⇒ Cov[X, Y] = E[XY] − E[X]E[Y] = 0 and ρX,Y = Cov[X, Y]/√(VAR[X]VAR[Y]) = 0

5.

a) ∫∫ fX,Y(x, y) dx dy = k ∫₀^∞∫₀^∞ e^{−3x−2y} dx dy = k/6 = 1 ⇒ k = 6
fX(x) = ∫ fX,Y(x, y) dy = 6 ∫₀^∞ e^{−3x−2y} dy = 3e^{−3x} for x ≥ 0, and 0 otherwise.
fY(y) = ∫ fX,Y(x, y) dx = 6 ∫₀^∞ e^{−3x−2y} dx = 2e^{−2y} for y ≥ 0, and 0 otherwise.

b) E[X] = ∫ x fX(x) dx = 3 ∫₀^∞ xe^{−3x} dx = 1/3
E[X²] = ∫ x² fX(x) dx = 3 ∫₀^∞ x²e^{−3x} dx = 2/9 ⇒ VAR[X] = 2/9 − (1/3)² = 1/9
E[Y] = ∫ y fY(y) dy = 2 ∫₀^∞ ye^{−2y} dy = 1/2
E[Y²] = ∫ y² fY(y) dy = 2 ∫₀^∞ y²e^{−2y} dy = 1/2 ⇒ VAR[Y] = 1/2 − (1/2)² = 1/4

c) Yes; because fX,Y(x, y) = fX(x)fY(y) for every x and y. ∴ ρX,Y = 0 (because X and Y are uncorrelated).

6. a)

∫∫ fX,Y(x, y) dx dy = k ∫₀^∞ ∫_{y=0}^{x} e^{−3x−2y} dy dx = k/15 = 1 ⇒ k = 15
fX(x) = ∫ fX,Y(x, y) dy = 15 ∫₀ˣ e^{−3x−2y} dy = 7.5(e^{−3x} − e^{−5x}) for x ≥ 0, and 0 otherwise.
fY(y) = ∫ fX,Y(x, y) dx = 15 ∫_y^∞ e^{−3x−2y} dx = 5e^{−5y} for y ≥ 0, and 0 otherwise.
b) E[X] = ∫ x fX(x) dx = 7.5 ∫₀^∞ x(e^{−3x} − e^{−5x}) dx = 8/15
E[X²] = ∫ x² fX(x) dx = 7.5 ∫₀^∞ x²(e^{−3x} − e^{−5x}) dx = 98/225 ⇒ VAR[X] = 98/225 − (8/15)² = 34/225
E[Y] = ∫ y fY(y) dy = 5 ∫₀^∞ ye^{−5y} dy = 1/5
E[Y²] = ∫ y² fY(y) dy = 5 ∫₀^∞ y²e^{−5y} dy = 2/25 ⇒ VAR[Y] = 2/25 − (1/5)² = 1/25
c) No; because fX,Y(x, y) ≠ fX(x)fY(y) for some x and y.
E[XY] = ∫∫ xy fX,Y(x, y) dx dy = 15 ∫_{x=0}^{∞} ∫_{y=0}^{x} xye^{−3x−2y} dy dx

= 15 ∫₀^∞ xe^{−3x} ( ∫₀ˣ ye^{−2y} dy ) dx = 15 ∫₀^∞ xe^{−3x} · [1 − (1 + 2x)e^{−2x}]/4 dx = 11/75
∴ By definition:
ρX,Y = (E[XY] − E[X]E[Y]) / √(VAR[X]VAR[Y]) = (11/75 − (8/15)(1/5)) / √((34/225)(1/25)) = (1/25) / (√34/75) = 3/√34 = 0.5145

7. a) ∫∫ fX,Y(x, y) dx dy = c ∫_{y=0}^{1} ∫_{x=0}^{1−y} dx dy = c/2 = 1 ⇒ c = 2
Consider the CDF of Z:
FZ(z) = P[Z ≤ z] = 0 for z < 0; z² for 0 ≤ z ≤ 1; 1 for z > 1.
fZ(z) = dFZ(z)/dz = 2z for 0 ≤ z ≤ 1, and 0 otherwise.
b) E[Z] = ∫₀¹ 2z² dz = 2/3
E[Z²] = ∫₀¹ 2z³ dz = 1/2 ⇒ VAR[Z] = 1/2 − (2/3)² = 1/18

8. a) P[A] = ∫_{y=0}^{0.5} ∫_{x=0}^{1} [(x + y)/3] dx dy = 1/8
b) fX,Y|A(x, y) = fX,Y(x, y)/P[A] if A is true, and 0 otherwise
= 8(x + y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 0.5, and 0 otherwise.
c) fX|A(x) = ∫₀^{0.5} fX,Y|A(x, y) dy = (4x + 1)/3 for 0 ≤ x ≤ 1, and 0 otherwise.
d) fY|A(y) = ∫₀¹ fX,Y|A(x, y) dx = 4(1 + 2y)/3 for 0 ≤ y ≤ 0.5, and 0 otherwise.

9.

a) Because of the symmetry around the y axis, it is easy to see that P[B] = 0.5. Thus:
fX,Y|B(x, y) = 2fX,Y(x, y) = 5x² for 0 < x ≤ 1, 0 ≤ y ≤ x², and 0 otherwise.
b) fX|B(x) = ∫ fX,Y|B(x, y) dy = ∫₀^{x²} 5x² dy = 5x⁴ for 0 < x ≤ 1, and 0 otherwise.

(P[X > 10])² = 0.04767, using the CLT.
c) Using the exact distributions: since X̂ and Ŷ are independent, their joint PMF is given by
PX̂,Ŷ(j, k) = C(200, j) C(200, k) (0.5)⁴⁰⁰.
We are required to find P[√(X² + Y²) > 10√2], or equivalently P[(2X̂ − 200)² + (2Ŷ − 200)² > 200]. This is equivalent to finding the sum of the probability masses lying outside the circle. Therefore, the desired probability is
∑_{(j,k)∈ℵ} PX̂,Ŷ(j, k), where ℵ = {(j, k) | j, k ∈ {0, …, 200}, (2j − 200)² + (2k − 200)² > 200}.
Hint: you may use the following MATLAB code to compute this value to be 0.59978.

p = 0;
for j = 0:200
    for k = 0:200
        if (2*j-200)^2 + (2*k-200)^2 > 200
            p = p + nchoosek(200,k) * nchoosek(200,j);
        end
    end
end
p = p * (0.5^400)

Using the CLT approximation: let R = √(X² + Y²). Since X, Y ∼ Gaussian(0, 200), and X and Y are independent, R is Rayleigh with parameter σ = 10√2. Therefore, P[R > 10√2] = e^{−(10√2/σ)²/2} = e^{−0.5} = 0.60653.

5.4 Drill Problems

Section 5.1.1 - PDF of the sum of 2 RVs

1. X ∼ Uniform(0, 1) and Y ∼ Uniform(0, 2) are independent RVs. Compute the PDF fW(w) of W = X + Y.
Ans
fW(w) = 0.5w for 0 ≤ w < 1; 0.5 for 1 ≤ w < 2; 0.5(3 − w) for 2 ≤ w ≤ 3; and 0 otherwise.

B.8 Quiz Number 8

1. Random variables X and Y have joint PDF
fX,Y(x, y) = 2 if 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
Let W = 2X/Y.

(a) What is the range of W?
Solution: Consider the relationship w = 2x/y between a particular instantiation w of the RV W and the corresponding instantiations x, y of X and Y. Since y ≤ x, w is minimal when x = y; thus min(w) = 2. The maximum value of w is found when y → 0 while x ≠ 0, so max(w) → ∞. Being a smooth function of x and y, w takes all intermediate values. Thus, the range of W is [2, ∞).

(b) Find FW(a). Be sure to consider all cases for −∞ < a < ∞.
Solution:
[Figure B.1: region of integration for the case a ≥ 2 — the triangle 0 ≤ y ≤ x ≤ 1 between the lines y = x and y = 2x/a.]

Since x and y are non-negative, FW(a) = 0 whenever a < 0; and since the range of W is [2, ∞), FW(a) = 0 for 0 ≤ a < 2 as well. For a ≥ 2,
FW(a) = P[2X/Y ≤ a] = P[Y ≥ 2X/a] = ∫₀¹ ∫_{2x/a}^{x} 2 dy dx = ∫₀¹ 2x(1 − 2/a) dx = 1 − 2/a.

C.6 Quiz Number 6

1. X is a Gaussian random variable with µ = 8 and σ² = 16. Answer the following (leave your answers in Φ(·) form with positive argument).

(a) [1 mark] Find P[12 < X < 16].

(b) [1 mark] Find P[X > 0].

(c) [2 marks] Define Y = X/4 + 6. Find, the average and variance of Y.

(d) [1 mark] Define W = AX + B. Find A and B such that W is a Gaussian with mean zero and variance 4.

2. Consider X ∼ Exponential(2) and Y = X 3 . (a) [1 mark] Find the range of Y.

(b) [3 marks] Find the PDF of Y .

C.7 Quiz Number 7

1. The joint PDF of two random variables is given as
fX,Y(x, y) = cx for 0 ≤ y/2 ≤ x ≤ 1, and 0 elsewhere.

[2 marks] Find c.

[2 marks] Find the marginal PDF of X, fX(x). Be sure to consider the whole range −∞ < x < ∞.

[2 marks] Find the marginal PDF of Y, fY(y). Be sure to consider the whole range −∞ < y < ∞.

[4 marks] Find the PDF of Z = Y/X, fZ(a). Be sure to consider the whole range −∞ < a < ∞.

Appendix D

2010 Quizzes: Solutions

D.1 Quiz Number 1

1. [2 marks] Use algebra of sets to prove (A − B) − C = A − (B ∪ C).
Solution: (A − B) − C = (A ∩ Bᶜ) ∩ Cᶜ = A ∩ (Bᶜ ∩ Cᶜ) = A ∩ (B ∪ C)ᶜ = A − (B ∪ C)

2. A fair die is rolled twice and the sum is recorded.
a) [1 mark] Give the sample space (S) of this experiment.
Solution: S = {2, 3, …, 12}
b) [1 mark] Are the outcomes of this sample space equally likely? Explain.
Solution: No. Some outcomes are more likely than others. For example, 2 can only be the outcome when both die rolls result in 1 (i.e., (x₁, x₂) = (1, 1)), while 7 is the outcome when any of the pairs (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), or (6, 1) occurs.
c) [1 mark] Let B = {sum is less than or equal to 2}. Find P[B].
Solution: A sum less than or equal to 2 means both die rolls resulted in 1. Hence P[B] = P[(x₁, x₂) = (1, 1)] = P[x₁ = 1] × P[x₂ = 1] = (1/6) × (1/6) = 1/36. Note that the die rolls are independent.
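Because the 36 ordered outcomes of the two rolls are equally likely, both parts (b) and (c) can be confirmed by brute-force enumeration (an illustrative Python sketch, not part of the original quiz):

```python
from itertools import product

# All 36 equally likely ordered outcomes of two fair die rolls
rolls = list(product(range(1, 7), repeat=2))

p_sum_le_2 = sum(1 for a, b in rolls if a + b <= 2) / len(rolls)  # 1/36
p_sum_7 = sum(1 for a, b in rolls if a + b == 7) / len(rolls)     # 6/36
```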


3. In a company 40% of employees are female. Also, 15% of the male (M) employees and 10% of the female (F) employees hold managerial positions.
a) [2 marks] Let A be the event that a randomly selected employee of this company holds a managerial position. Find P[A].
Solution: P[A] = P[A|F]P[F] + P[A|Fᶜ]P[Fᶜ] = 0.1 × 0.4 + 0.15 × 0.6 = 0.13
b) [1 mark] In part (a), what is the probability that the employee does not have a managerial position?
Solution: P[Aᶜ] = 1 − P[A] = 1 − 0.13 = 0.87
c) [2 marks] A randomly selected employee is found to have a managerial position. What is the probability that this person is female?
Solution: P[F|A] = P[A|F]P[F]/P[A] = (0.1 × 0.4)/0.13 = 4/13 ≃ 0.3077
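The total-probability and Bayes steps above are just two lines of arithmetic (an illustrative Python sketch; variable names are ours):

```python
# P[F] = 0.4; P[A|F] = 0.10; P[A|M] = 0.15
p_f = 0.4
p_a_given_f, p_a_given_m = 0.10, 0.15

p_a = p_a_given_f * p_f + p_a_given_m * (1 - p_f)  # total probability: 0.13
p_not_a = 1 - p_a                                  # 0.87
p_f_given_a = p_a_given_f * p_f / p_a              # Bayes: 4/13
```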

D.2 Quiz Number 2

1. [5 marks] Events A and B are independent and events A and C are disjoint (mutually exclusive). Let P[A] = 0.2, P[B] = 0.4, and P[C] = 0.1. Please answer the following parts:
a) [1 mark] Find P[A ∪ B].
Solution: P[A ∪ B] = P[A] + P[B] − P[A ∩ B] = P[A] + P[B] − P[A]P[B] = 0.2 + 0.4 − 0.08 = 0.52
b) [1 mark] Find P[A|B].
Solution: P[A|B] = P[A ∩ B]/P[B] = P[A]P[B]/P[B] = P[A] = 0.2
c) [1 mark] Find P[Aᶜ ∪ B].
Solution: P[Aᶜ ∪ B] = P[Aᶜ] + P[B] − P[Aᶜ ∩ B] = (1 − P[A]) + P[B] − (1 − P[A])P[B] = (1 − 0.2) + 0.4 − (1 − 0.2) × 0.4 = 0.88
d) [1 mark] Find P[A|C].
Solution: P[A|C] = P[A ∩ C]/P[C] = P[∅]/P[C] = 0
e) [1 mark] Find P[A ∩ B ∩ C].
Solution: P[A ∩ B ∩ C] = P[(A ∩ C) ∩ B] = P[∅ ∩ B] = P[∅] = 0

2. For this question, you may leave your answers as ratios of C(n, k) terms. From a class of 20 boys and 10 girls a team of 5 is selected.
a) [1 mark] Find the probability that the team consists of 2 boys and 3 girls.
Solution: C(20, 2) C(10, 3) / C(30, 5)
b) [2 marks] Find the probability that the majority of the team members are girls.
Solution: [C(20, 2)C(10, 3) + C(20, 1)C(10, 4) + C(20, 0)C(10, 5)] / C(30, 5)
c) [2 marks] There are 6 students in this class that do not like to be on the team. Find the probability that the randomly chosen team has none of these 6 students.
Solution: C(30 − 6, 5) / C(30, 5)
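The three counting ratios evaluate easily with the standard library's `math.comb` (an illustrative Python sketch, not part of the original quiz):

```python
from math import comb

total = comb(30, 5)  # 142506 ways to pick the team

p_2b_3g = comb(20, 2) * comb(10, 3) / total                              # part a
p_majority_girls = sum(comb(20, 5 - g) * comb(10, g) for g in (3, 4, 5)) / total  # part b
p_avoid_6 = comb(24, 5) / total                                          # part c
```

Numerically these come out to about 0.160, 0.191, and 0.298 respectively.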

D.3 Quiz Number 3

1. A discrete random variable X has the following probability mass function (PMF):
PX(x) = A for x = −4; A for x = −1; 0.3 for x = 0; 0.3 for x = 4; and 0 otherwise.
(a) (1 mark) Find A.
Solution: 0.3 + 0.3 + A + A = 1 ⇒ A = 0.2
(b) (3 marks) Sketch the PMF. Find FX(0.5), where FX(·) represents the CDF of X.
Solution: FX(0.5) = 0.2 + 0.2 + 0.3 = 0.7
(c) (1 mark) Find P[0.5 < X ≤ 3].
Solution: No mass in (0.5, 3] ⇒ P[0.5 < X ≤ 3] = 0
(d) (1 mark) Find P[X > 2].
Solution: P[X > 2] = 0.3

2. (4 marks) A biased coin with P[T] = 0.2 and P[H] = 0.8 is tossed repeatedly. Identify the type of the random variable (for example, X ∼ Binomial(10, 0.1)) in each of the following cases.
a) X is the number of tosses before the first H (inclusive).


Solution: Geometric(0.8) b) X is the number of tosses before the third T.

Solution: Pascal(3,0.2) c) X is the number of heads (H) in 5 tosses.

Solution: Binomial(5,0.8) d) After the occurrence of the first H, X is the number of extra tosses before the second H (inclusive).

Solution: Geometric(0.8)

D.4 Quiz Number 4

1. The CDF of a random variable is given as
FX(x) = 0 if x ≤ 0; x²/9 if 0 < x < 3; 1 if 3 ≤ x.
a) [2 marks] Find P[1 < X < 2].
Solution: P[1 < X < 2] = FX(2) − FX(1) = 4/9 − 1/9 = 3/9 = 1/3
b) [2 marks] Find P[X > 1].
Solution: P[X > 1] = 1 − P[X ≤ 1] = 1 − FX(1) = 1 − 1/9 = 8/9
c) [3 marks] Find the PDF of X.
Solution: fX(x) = 2x/9 for 0 < x < 3, and 0 otherwise.

D.5 Quiz Number 5

1. The lifetime T (in months) of a transistor is Exponential with λ = 0.2.
(c) Solution: By the memoryless property, P[T > 8 | T > 5] = P[T > 3] = e^{−3λ} = e^{−0.6}
(d) [1 mark] An EE has designed a circuit using two of the above-mentioned transistors such that one is active and the second one is the spare (i.e., the second one becomes active when the first one dies). The circuit can last until both transistors are dead. What random variable can model the lifetime of this circuit? Give the parameters of its PDF.
Solution: Erlang(2, 0.2)

2. For a Uniform(1, 3) random variable:
(a) [2 marks] Find E[1/X²].

(d) [1 mark] An EE has designed a circuit using two of above-mentioned transistors such that one is active and the second one is the spare (i.e., the second one becomes active when the first one dies). The circuit can last until both transistors are dead. What random variable can model the lifetime of this circuit? Give the parameters of its PDF. Solution: Erlang(2, 0.2) 2. For a Uniform(1,3)[ random variable ] 1 (a) [2 marks] Find E X 2 . Solution:

∫3

1 1 x2

× 12 dx =

−x−1 3 |1 2

=

1 2



1 6

=

1 3

(b) [2 marks] Find E[4X − 5]. Solution: E[4X − 5] = 4 × E[X] − 5 = 4 × 2 − 5 = 3

D.6 Quiz Number 6

1. X is a Gaussian random variable with µ = 8, σ² = 16. Answer the following (leave your answers in Φ(·) form with positive argument).
(a) [1 mark] Find P[12 < X < 16].
Solution: P[12 < X < 16] = P[(12 − 8)/4 < Z < (16 − 8)/4] = Φ(2) − Φ(1)
(b) [1 mark] Find P[X > 0].
Solution: P[X > 0] = 1 − P[X ≤ 0] = 1 − P[Z ≤ (0 − 8)/4] = 1 − Φ(−2) = Φ(2)

(c) [2 marks] Define Y = X/4 + 6. Find the average and variance of Y.
Solution: E[Y] = (1/4)E[X] + 6 = 8; VAR[Y] = (1/16)VAR[X] = 1
(d) [1 mark] Define W = AX + B. Find A and B such that W is a Gaussian with mean zero and variance 4.
Solution: VAR[W] = A² VAR[X] = 4 and E[W] = A·E[X] + B = 0 ⇒ A = 1/2, B = −4

2. Consider X ∼ Exponential(2) and Y = X³.
(a) [1 mark] Find the range of Y.
Solution: Since the range of X is X ≥ 0, the range of Y is Y ≥ 0.
(b) [3 marks] Find the PDF of Y.
Solution: fY(y) = fX(x)/|g′(x)| evaluated at x = g⁻¹(y), where g(x) = x³, g′(x) = 3x², and g⁻¹(y) = y^{1/3}. Hence
fY(y) = (2/3) y^{−2/3} e^{−2y^{1/3}} for y ≥ 0.
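A derived PDF like this is easy to sanity-check by simulation: if the derivation is right, then P[Y ≤ 1] = P[X ≤ 1] = 1 − e⁻² ≈ 0.8647. A Monte Carlo sketch (illustrative Python, not part of the original quiz):

```python
import math
import random

random.seed(3)
n = 200000

# Sample X ~ Exponential(2), transform to Y = X^3, estimate P[Y <= 1]
hits = sum(random.expovariate(2) ** 3 <= 1.0 for _ in range(n))
p_emp = hits / n
p_exact = 1 - math.exp(-2)  # ~ 0.8647
```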

D.7 Quiz Number 7

1. The joint PDF of two random variables is given as
fX,Y(x, y) = cx for 0 ≤ y/2 ≤ x ≤ 1, and 0 elsewhere.

[2 marks] Find c.
Solution: ∫₀¹ ∫₀^{2x} cx dy dx = ∫₀¹ 2cx² dx = 2c/3 = 1 ⇒ c = 3/2

[2 marks] Find the marginal PDF of X, fX(x). Be sure to consider the whole range −∞ < x < ∞.
Solution: fX(x) = ∫₀^{2x} (3/2)x dy = 3x² for 0 < x < 1, and 0 otherwise.

[2 marks] Find the marginal PDF of Y, fY(y). Be sure to consider the whole range −∞ < y < ∞.
Solution: fY(y) = ∫_{y/2}^{1} (3/2)x dx = 3/4 − 3y²/16 for 0 < y < 2, and 0 otherwise.

[4 marks] Find the PDF of Z = Y/X, fZ(a). Be sure to consider the whole range −∞ < a < ∞.
Solution: Since 0 ≤ y ≤ 2x on the support, Z takes values in [0, 2], so fZ(a) = 0 outside [0, 2]. For 0 ≤ a ≤ 2,
FZ(a) = P[Y ≤ aX] = ∫₀¹ ∫₀^{ax} (3/2)x dy dx = ∫₀¹ (3/2)ax² dx = a/2,
so fZ(a) = 1/2 for 0 ≤ a ≤ 2, and 0 otherwise.

Appendix F

2011 Quizzes: Solutions

F.5 Quiz Number 6

1. X is a Gaussian random variable with µ = 10 and σ = 2. Find P[X > 0].
Solution:

P[X > 0] = P[(X − 10)/2 > (0 − 10)/2] = 1 − P[(X − 10)/2 < −5] = 1 − Φ(−5) = 1 − (1 − Φ(5)) = Φ(5).

2. The time T in minutes between two successive bus arrivals at a bus stop is Exponential(0.2).
(a) [1 mark] When you just arrive at the bus stop, what is the probability that you have to wait for more than 5 minutes?
Solution: P[T > 5] = e^{−λ·5} = e^{−0.2·5} = e^{−1}.

(b) [1 mark] What is the average value of your waiting time?
Solution: E[T] = 1/λ = 5.


(c) [1 mark] You are waiting for a bus, and no bus has arrived in the past 2 minutes. You decide to go to the adjacent coffee shop to grab a coffee. It takes you 5 minutes to grab your coffee and be back at the bus stop. Determine the probability that you will not miss the bus.
Solution: P[T > 7 | T > 2] = P[T > 5] = e^{−1}.

3. [4 marks] You borrow your friend's car to drive to Hinton to see your significant other. The driving distance is 100 km. The gas gauge is broken, so you don't know how much gas is in the car. The tank holds 40 liters and the car gets 15 km per liter, so you decide to take a chance.
(a) [2 marks] Suppose X is the distance (km) that you can drive until the car runs out of gas. Out of the Uniform, Exponential, and Gaussian PDFs, which one is most suitable for modeling X? Briefly justify your choice. Use your choice with the appropriate parameters to answer the following questions.
Solution: First note that the random variable is bounded and should have zero probability for values larger than 600 km (40 liters × 15 km per liter). In addition, there is no information about the amount of gas available, so every value between 0 and 600 should be equally likely. Hence X ∼ Uniform(0, 600).

(b) [1 mark] What is the probability that you make it to Hinton without running out of gas?
Solution: P[X > 100] = 1 − P[X < 100] = 1 − (100 − 0)/600 = 5/6.

(c) [1 mark] If you don’t run out of gas on the way, what is the probability that you will not run out of gas on the way back if you decide to a take chance again?


Solution: P[X > 200 | X > 100] = P[(X > 200) ∩ (X > 100)] / P[X > 100] = P[X > 200] / P[X > 100] = 4/5.
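Both uniform-model probabilities follow from the survival function of Uniform(0, 600) (an illustrative Python sketch; the function name is ours):

```python
# X ~ Uniform(0, 600): tank holds 40 L, car does 15 km/L, so max range 600 km
def p_exceeds(d, hi=600.0):
    """P[X > d] for X ~ Uniform(0, hi)."""
    if d < 0:
        return 1.0
    if d > hi:
        return 0.0
    return (hi - d) / hi

p_one_way = p_exceeds(100)                  # part b: 5/6
p_round_trip = p_exceeds(200) / p_exceeds(100)  # part c: 4/5
```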
