PRESTIGE ED
N4: Transformations of Random Variables
Node N4 — Section 1

Why This Concept Exists

Transformations of random variables appear in 5 of 6 papers analysed, typically as Question 2 on the final exam, worth 10–14 marks. This tests whether you can take a known probability distribution and systematically derive the distribution of a function of those variables.

There are two core methods:

  • The CDF method: For a single transformation \(Y = g(X)\), compute \(F_Y(y) = P(g(X) \leq y)\) and differentiate to get the PDF.
  • The Jacobian (change-of-variables) method: For a bivariate transformation \((U,V) = T(X,Y)\), compute the Jacobian determinant and use the change-of-variables formula.
Leverage: 8/10. Transformation questions require mastery of N1-N3. If your bivariate distribution foundation is solid, this node is mechanical. If not, it becomes the hardest topic on the exam.

The reason this topic earns distinction-level marks is that it tests multiple skills simultaneously: understanding of the CDF, ability to handle inequalities, differentiation, support transformation, and (for the Jacobian method) multivariable calculus (partial derivatives, determinants).


Node N4 — Section 2

Prerequisites

  • N1-N3 complete: All joint distribution machinery, especially support regions and double integration.
  • CDF definition: \(F_X(x) = P(X \leq x) = \displaystyle\int_{-\infty}^x f_X(t)\,dt\), and \(f_X(x) = F_X'(x)\).
  • Monotonic functions: Understanding when a function \(g\) is strictly increasing or decreasing, and how this affects inequalities: if \(g\) is increasing, \(g(X) \leq y \iff X \leq g^{-1}(y)\).
  • Partial derivatives: For the Jacobian method, you need \(\frac{\partial x}{\partial u}\), \(\frac{\partial x}{\partial v}\), etc.
  • Determinants of 2×2 matrices: \(\begin{vmatrix}a & b \\ c & d\end{vmatrix} = ad - bc\).
  • Inverse functions: Ability to solve for x in terms of y when needed.

Node N4 — Section 3

Core Exposition

3.1 The CDF Method (Single Variable)

Given \(X\) with known PDF \(f_X(x)\) and a transformation \(Y = g(X)\), to find the PDF of Y:

Step 1: Write \(F_Y(y) = P(Y \leq y) = P(g(X) \leq y)\).
Step 2: Solve the inequality \(g(X) \leq y\) to express it in terms of X.
Step 3: Express the probability in terms of \(F_X\).
Step 4: Differentiate: \(f_Y(y) = F_Y'(y)\).
Key detail: The support of Y must be determined from the support of X. If \(X \sim \text{Uniform}(0,1)\) and \(Y = X^2\), then \(Y \in (0,1)\). Always state the support of the transformed variable.

For the special case \(Y = aX + b\) (linear transformation): \(f_Y(y) = \dfrac{1}{|a|}\,f_X\!\left(\dfrac{y-b}{a}\right)\).
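The linear-transform formula lends itself to a quick simulation check. The sketch below (Python, standard library only; the choice \(a=2\), \(b=3\), \(X \sim \text{Exp}(1)\) is purely illustrative) compares the empirical CDF of \(Y = 2X + 3\) with the value implied by integrating \(f_Y\):

```python
import math
import random

random.seed(3)
n = 200_000

# Y = 2X + 3 with X ~ Exp(1).  Integrating the linear-transform formula
# f_Y(y) = (1/|a|) f_X((y-b)/a) gives F_Y(y) = 1 - exp(-(y-3)/2) for y > 3.
emp = sum(1 for _ in range(n) if 2 * random.expovariate(1.0) + 3 <= 5.0) / n
theory = 1 - math.exp(-1.0)  # F_Y(5) = 1 - e^{-1} ≈ 0.632
```

Agreement to within Monte Carlo error (roughly \(\pm 0.003\) at this sample size) is good evidence the density, including the \(1/|a|\) factor, is right.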

3.2 The Jacobian Method (Bivariate)

Given \((X,Y)\) with joint PDF \(f_{X,Y}(x,y)\) and a transformation \(U = u(X,Y)\), \(V = v(X,Y)\) that is one-to-one with a non-zero Jacobian:

The transformation formula:
\(f_{U,V}(u,v) = f_{X,Y}(x(u,v),\,y(u,v)) \cdot |J|\)
where \(J = \det\left(\begin{array}{cc}\dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v} \\[12pt] \dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v}\end{array}\right)\)

Procedure:

  • Step 1: Invert the transformation: express \(x = x(u,v)\) and \(y = y(u,v)\).
  • Step 2: Compute the partial derivatives and the Jacobian determinant \(J\).
  • Step 3: Substitute \(x(u,v)\) and \(y(u,v)\) into \(f_{X,Y}\).
  • Step 4: Multiply by \(|J|\) to get \(f_{U,V}(u,v)\).
  • Step 5: Determine the support of \((U,V)\) from the support of \((X,Y)\) and the transformation. This is essential and often tested separately.
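Steps 1-4 can be run symbolically as a cross-check. A minimal sketch, assuming SymPy is available, applied to the sum-and-difference transformation of two independent Exp(1) variables:

```python
import sympy as sp

u, v = sp.symbols('u v', positive=True)

# Step 1: invert U = X + Y, V = X - Y
x = (u + v) / 2
y = (u - v) / 2

# Step 2: Jacobian determinant of (x, y) with respect to (u, v)
J = sp.Matrix([x, y]).jacobian([u, v]).det()

# Steps 3-4: substitute into f_{X,Y}(x,y) = e^{-(x+y)} and multiply by |J|
f_uv = sp.simplify(sp.exp(-(x + y)) * sp.Abs(J))  # J = -1/2, f_uv = exp(-u)/2
```

Step 5 (the support) still has to be done by hand; the symbolic check only guards against algebra slips in the inversion, the Jacobian, and the substitution.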

3.3 Support Under Transformation

The support of the transformed variables is derived by applying the transformation to the original support. For example, if \((X,Y)\) has support \(0 \leq x \leq y \leq 1\) and \(U = X+Y\), \(V = Y-X\) (so \(x = \dfrac{u-v}{2}\), \(y = \dfrac{u+v}{2}\)):

  • From \(x \geq 0\): \(\dfrac{u-v}{2} \geq 0\), so \(v \leq u\).
  • From \(x \leq y\): \(v \geq 0\).
  • From \(y \leq 1\): \(\dfrac{u+v}{2} \leq 1\), so \(u + v \leq 2\).
  • From \(x, y \geq 0\): \(u = x + y \geq 0\).
Most common error: Students compute the transformed PDF correctly but give the wrong support (or no support at all). This typically costs 3-4 marks. Always sketch the transformed support region.
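One cheap way to catch a wrong support region is to push random points from the original support through the transformation and test them against the derived inequalities. A Python sketch (standard library only) for the example above:

```python
import random

random.seed(0)
ok = True
for _ in range(10_000):
    # sample a point from the original support 0 <= x <= y <= 1
    y = random.random()
    x = random.uniform(0.0, y)
    # apply the transformation U = X + Y, V = Y - X
    u, v = x + y, y - x
    # test the derived support: 0 <= v <= u and u + v <= 2
    if not (0.0 <= v <= u and u + v <= 2.0):
        ok = False
```

This only shows the derived region contains the image of the original support, not that it is tight; the sketch of the region covers the other direction.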

3.4 Special Cases

Sum: \(U = X + Y\)
The PDF of the sum is the convolution: \(f_U(u) = \displaystyle\int f_{X,Y}(x, u-x)\,dx\).
Under independence: \(f_U(u) = \displaystyle\int f_X(x) f_Y(u-x)\,dx\).

Ratio: \(V = X/Y\)
\(f_V(v) = \displaystyle\int |y|\cdot f_{X,Y}(vy, y)\,dy\).

Product: \(W = XY\)
\(f_W(w) = \displaystyle\int \frac{1}{|x|}\cdot f_{X,Y}\!\left(x, \frac{w}{x}\right)dx\).
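The convolution formula can be checked numerically. A sketch in plain Python, using the standard example of two independent Uniform(0,1) variables, whose sum has the triangular density \(f_U(u) = u\) on \([0,1]\) and \(2-u\) on \([1,2]\):

```python
def f(x):
    """Uniform(0,1) density."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_U(u, n=20_000):
    """Riemann-sum approximation of the convolution integral
    f_U(u) = integral of f_X(x) * f_Y(u - x) dx over x in [0, 1]."""
    dx = 1.0 / n
    return sum(f(i * dx) * f(u - i * dx) for i in range(n)) * dx

# triangular density: f_U(0.5) ≈ 0.5 and f_U(1.5) ≈ 0.5
```

The same recipe checks the ratio and product formulas; only the integrand and the grid change.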
[INTERACTIVE: Jacobian calculator with support mapping — will be added later]

Node N4 — Section 4

Worked Examples

Example 1: CDF Method — \(Y = X^2\)

Let \(X \sim \text{Uniform}(-1, 1)\). Find the PDF of \(Y = X^2\).

Step 1: CDF of Y. For \(0 \leq y \leq 1\):
\(F_Y(y) = P(Y \leq y) = P(X^2 \leq y) = P(-\sqrt{y} \leq X \leq \sqrt{y})\)
\(= F_X(\sqrt{y}) - F_X(-\sqrt{y}) = \dfrac{\sqrt{y} - (-1)}{2} - \dfrac{-\sqrt{y} - (-1)}{2} = \dfrac{2\sqrt{y}}{2} = \sqrt{y}\)
Step 2: Differentiate. \(f_Y(y) = \dfrac{d}{dy}F_Y(y) = \dfrac{d}{dy}(y^{1/2}) = \dfrac{1}{2\sqrt{y}},\) for \(0 < y < 1\).
Step 3: Verify. \(\displaystyle\int_0^1 \frac{1}{2\sqrt{y}}\,dy = \left[\sqrt{y}\right]_0^1 = 1\) \(\checkmark\)
Note: \(Y \sim \text{Beta}(1/2, 1)\) on (0,1).
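A short simulation (Python, standard library only) corroborates \(F_Y(y) = \sqrt{y}\) at a single point, e.g. \(F_Y(0.25) = 0.5\):

```python
import random

random.seed(7)
n = 200_000
# X ~ Uniform(-1, 1), Y = X^2; F_Y(0.25) should be sqrt(0.25) = 0.5
emp = sum(1 for _ in range(n) if random.uniform(-1.0, 1.0) ** 2 <= 0.25) / n
```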

Example 2: Jacobian Method — Sum and Difference

Let \(X, Y \overset{\text{iid}}{\sim} \text{Exp}(1)\) (independent exponentials). Find the joint PDF of \(U = X+Y\) and \(V = X-Y\).

Step 1: Original joint PDF. \(f_{X,Y}(x,y) = e^{-x} \cdot e^{-y} = e^{-(x+y)},\) for \(x > 0,\; y > 0\).
Step 2: Invert the transformation. \(U = X + Y,\; V = X - Y\).
Solving: \(X = \dfrac{U+V}{2},\; Y = \dfrac{U-V}{2}\).
Step 3: Jacobian. \(J = \det\left(\begin{array}{cc}\frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & -\frac{1}{2}\end{array}\right) = -\frac{1}{4} - \frac{1}{4} = -\frac{1}{2}\).
\(|J| = \dfrac{1}{2}\).
Step 4: Transformed joint PDF. \(f_{U,V}(u,v) = f_{X,Y}\!\left(\frac{u+v}{2},\frac{u-v}{2}\right)\cdot\frac{1}{2} = e^{-(u+v)/2 - (u-v)/2}\cdot\frac{1}{2} = e^{-u}\cdot\frac{1}{2}\)
for the support determined below.
Step 5: Support of (U,V). From \(x > 0\): \(\dfrac{u+v}{2} > 0 \implies u+v > 0 \implies v > -u\).
From \(y > 0\): \(\dfrac{u-v}{2} > 0 \implies u-v > 0 \implies v < u\).
Also \(u = x+y > 0\).
So the support is: \(u > 0,\; -u < v < u\).
Step 6: Marginal of U (optional check). \(f_U(u) = \displaystyle\int_{-u}^{u} \frac{e^{-u}}{2}\,dv = \frac{e^{-u}}{2}\cdot 2u = ue^{-u},\) for \(u > 0\).
This is \(\text{Gamma}(2,1)\) — the sum of two i.i.d. Exp(1). \(\checkmark\) (Known result confirmed.)
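The Gamma(2,1) conclusion is easy to corroborate by simulation (Python, standard library only). The Gamma(2,1) CDF is \(1 - e^{-u}(1+u)\), obtained by integrating \(ue^{-u}\) by parts:

```python
import math
import random

random.seed(42)
n = 200_000
# U = X + Y for X, Y iid Exp(1); compare P(U <= 2) with the Gamma(2,1) CDF
emp = sum(1 for _ in range(n)
          if random.expovariate(1.0) + random.expovariate(1.0) <= 2.0) / n
theory = 1 - math.exp(-2.0) * (1 + 2.0)  # ≈ 0.594
```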

Example 3: CDF Method — \(Y = -\ln(X)\)

Let \(X \sim \text{Uniform}(0, 1)\). Find the distribution of \(Y = -\ln(X)\).

Step 1: Determine support of Y. Since \(X \in (0, 1)\): \(Y = -\ln(X) \in (0, \infty)\).
Also, the transformation is strictly decreasing (\(\ln(x)\) increases, so \(-\ln(x)\) decreases).
Step 2: CDF. \(F_Y(y) = P(Y \leq y) = P(-\ln X \leq y) = P(\ln X \geq -y) = P(X \geq e^{-y})\)
\(= 1 - F_X(e^{-y}) = 1 - e^{-y},\) for \(y > 0\).
Step 3: PDF. \(f_Y(y) = F_Y'(y) = \dfrac{d}{dy}(1 - e^{-y}) = e^{-y},\) for \(y > 0\).
Therefore \(Y \sim \text{Exp}(1)\). This is the standard uniform-to-exponential transformation used in random number generation.
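This is exactly how exponential variates are generated in practice (inverse-transform sampling). A Python sketch, standard library only; \(1 - U\) is used in place of \(U\) to avoid \(\ln 0\):

```python
import math
import random

random.seed(1)
n = 200_000
# Y = -ln(X) with X ~ Uniform(0,1) should be Exp(1):
# mean 1 and P(Y <= 1) = 1 - e^{-1}
ys = [-math.log(1.0 - random.random()) for _ in range(n)]
mean_y = sum(ys) / n
frac_le_1 = sum(1 for t in ys if t <= 1.0) / n
```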

Node N4 — Section 5

Pattern Recognition & Examiner Traps

Trap 1: Forgetting the absolute value of the Jacobian. The formula uses \(|J|\), not \(J\). If the Jacobian is negative (common), using it without the absolute value gives a negative "density," which is impossible.
WRONG: \(f_{U,V}(u,v) = e^{-u} \cdot \left(-\frac{1}{2}\right)\) — negative density!
RIGHT: \(f_{U,V}(u,v) = e^{-u} \cdot \left|-\frac{1}{2}\right| = \frac{e^{-u}}{2}\)
Trap 2: Wrong transformed support. When the transformation maps \((X,Y)\) to \((U,V)\), the new support must be derived by applying the transformation to the boundaries of the old support. Students who copy the old support or guess lose 3-4 marks.
Trap 3: Many-to-one mappings without splitting. If the transformation is not one-to-one (e.g., \(Y = X^2\), where X can be positive or negative), the CDF method must account for both branches, and the Jacobian method requires splitting the domain into pieces on which the map is one-to-one.
Trap 4: Forgetting to verify the result integrates to 1. After finding the transformed PDF, always do a quick check: \(\int f_U(u)\,du = 1\) (for the CDF method) or \(\iint f_{U,V}(u,v)\,du\,dv = 1\) (for the Jacobian method).
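Trap 4's check takes seconds even without a computer algebra system. A midpoint-rule sketch in plain Python for the marginal \(f_U(u) = ue^{-u}\) from Example 2 (the cutoff at \(u = 50\) is arbitrary; the neglected tail is far below the tolerance):

```python
import math

# midpoint-rule approximation of the integral of u * e^{-u} over (0, 50),
# which should come out very close to 1
n, upper = 200_000, 50.0
du = upper / n
total = sum((i + 0.5) * du * math.exp(-(i + 0.5) * du) for i in range(n)) * du
```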
Examiner patterns to recognise:
  • "Find the PDF of \(Y = X^2\)" — CDF method. Watch for the two branches if X can be negative.
  • "Let \(U = X+Y\) and \(V = X/Y\). Find the joint PDF." — Jacobian method. Always invert first, then Jacobian, then substitute.
  • "State the support of (U,V)" — derive it from the original support. Usually worth 2-3 dedicated marks.
  • "Hence find the marginal PDF of U" — integrate out V from the joint \(f_{U,V}\). This requires correct support limits.

Node N4 — Section 6

Connections

Where N4 sits in the PTS2 architecture:
  • ← From N1-N3: All joint distribution theory is prerequisite. The Jacobian method is fundamentally a change-of-variables on a joint PDF.
  • → To N5 (Order Statistics): The joint distribution of order statistics is derived using a transformation of the original sample.
  • → To N6 (Sampling Distributions): The chi-squared, t, and F distributions are defined as transformations of normal variables.
  • → To N10-N12 (Hypothesis Testing): Test statistics (t-statistic, F-statistic, chi-squared) are transformations of sample data. Understanding this transformation is key to knowing which distribution each statistic follows.
  • → To MGF methods: Moment generating functions offer an alternative approach to finding the distribution of sums, useful for cross-checking transformation results.

Node N4 — Section 7

Summary Table

Method | When to Use | Key Formula | Watch Out
CDF method | Single variable \(Y = g(X)\) | \(F_Y(y) = P(g(X) \leq y)\), then differentiate | Two branches if not monotonic
Jacobian method | Bivariate \((U,V) = T(X,Y)\), one-to-one | \(f_{U,V} = f_{X,Y} \cdot |J|\) | Must invert first; find support
Convolution | Sum \(U = X+Y\), independent | \(f_U(u) = \int f_X(x)f_Y(u-x)\,dx\) | Limits depend on supports
Linear transform | \(Y = aX+b\) | \(f_Y(y) = \frac{1}{|a|}f_X\!\left(\frac{y-b}{a}\right)\) | Remember the \(|a|\) factor
Support map | Always | Apply \(T\) to boundary curves | Not just copying old support
Verification | After every result | Integrate to 1 | Quick sanity check saves marks

Node N4 — Section 8

Self-Assessment

Checklist — Can you do all of these?
  • Use the CDF method to find the PDF of \(Y = X^2\) when \(X \sim N(0,1)\).
  • Use the CDF method to find the PDF of \(Y = \ln(X)\) when \(X \sim \text{Exp}(\lambda)\).
  • Use the Jacobian method to find the joint PDF of \(U = X+Y\), \(V = X-Y\) from a given \(f_{X,Y}\).
  • Use the Jacobian method to find the joint PDF of \(U = XY\), \(V = X/Y\).
  • Derive the support of the transformed variables from the original support.
  • Compute the Jacobian determinant including partial derivatives.
  • Find the marginal of U from the joint \(f_{U,V}\) by integrating out V.
  • Verify that your transformed PDF integrates to 1.
Practice problems to attempt independently
  • If \(X \sim \text{Uniform}(0,1)\) and \(Y = -\theta\ln(X)\), show \(Y \sim \text{Exp}(\theta)\). [Standard transformation.]
  • If \(X, Y \overset{\text{iid}}{\sim} N(0,1)\), find the distribution of \(U = X^2 + Y^2\). [Answer: exponential with rate \(1/2\), i.e. \(\chi^2(2)\).]
  • Given \(f_{X,Y}(x,y) = 1\) on \([0,1]\times[0,1]\), find the joint PDF of \(U = X+Y\), \(V = X/Y\). [Requires careful support mapping.]
  • If \(X \sim \text{Exp}(1)\) and \(Y = e^X\), find the PDF of Y. [Answer: Pareto-type.]

High-Leverage Questions

HLQ: Exam-Style Question with Worked Solution

12 marks | Final 2023 Q2 | Distinction level

Let \(X\) and \(Y\) be independent random variables, each with PDF:

\[f(x) = 2x \quad \text{for } 0 < x < 1\]

Consider the transformation \(U = X + Y\) and \(V = X/Y\).

(a) Write down the joint PDF of \((X,Y)\) and its support. (2 marks)

(b) Find the joint PDF of \((U,V)\) by the Jacobian method. (5 marks)

(c) State the support of \((U,V)\). (2 marks)

(d) Hence find the marginal PDF of \(U = X + Y\). (3 marks)


Part (a): Joint PDF and Support. Since X and Y are i.i.d. with \(f(x) = 2x\) on \((0,1)\):
\(f_{X,Y}(x,y) = 4xy\) for \(0 < x < 1\), \(0 < y < 1\).
Part (b): Jacobian Method. Inversion: \(U = X+Y\), \(V = X/Y\).
From \(V = X/Y\): \(X = VY\).
Substitute into \(U = X+Y\): \(U = VY + Y = Y(V+1)\), so \(Y = \dfrac{U}{V+1}\).
Then \(X = \dfrac{UV}{V+1}\).

Jacobian:
\(\dfrac{\partial x}{\partial u} = \dfrac{v}{v+1}\),   \(\dfrac{\partial x}{\partial v} = \dfrac{u(v+1) - uv}{(v+1)^2} = \dfrac{u}{(v+1)^2}\)
\(\dfrac{\partial y}{\partial u} = \dfrac{1}{v+1}\),   \(\dfrac{\partial y}{\partial v} = -\dfrac{u}{(v+1)^2}\)

\(J = \dfrac{v}{v+1}\cdot\left(-\dfrac{u}{(v+1)^2}\right) - \dfrac{u}{(v+1)^2}\cdot\dfrac{1}{v+1} = -\dfrac{uv}{(v+1)^3} - \dfrac{u}{(v+1)^3} = -\dfrac{u(v+1)}{(v+1)^3} = -\dfrac{u}{(v+1)^2}\)
\(|J| = \dfrac{u}{(v+1)^2}\)

Substitute into joint PDF:
\(f_{U,V}(u,v) = 4\cdot\dfrac{uv}{v+1}\cdot\dfrac{u}{v+1}\cdot\dfrac{u}{(v+1)^2} = \dfrac{4u^3 v}{(v+1)^4}\)
for the support determined below.
Part (c): Support of (U,V). Original support: \(0 < x < 1\), \(0 < y < 1\).
From \(x > 0\) and \(y > 0\): \(u > 0\) and \(v > 0\).
From \(x < 1\): \(\dfrac{uv}{v+1} < 1 \implies uv < v+1\), i.e. \(u < 1 + \dfrac{1}{v}\).
From \(y < 1\): \(\dfrac{u}{v+1} < 1 \implies u < v+1\).

So the support is: \(v > 0\), \(0 < u < \min\!\left(v+1,\; 1+\dfrac{1}{v}\right)\).
Note that \(\min(v+1,\, 1+1/v) \leq 2\) for every \(v > 0\) (with equality only at \(v = 1\)), which is consistent with \(U = X+Y \in (0,2)\).
Part (d): Marginal of U. \(f_U(u) = \displaystyle\int \frac{4u^3 v}{(v+1)^4}\,dv\), with the v-limits determined by the support.
For \(0 < u \leq 1\): both constraints \(u < v+1\) and \(u < 1 + 1/v\) hold for every \(v > 0\), so \(v\) ranges over \((0, \infty)\).
For \(1 < u < 2\): \(u < v+1\) gives \(v > u-1\), and \(uv < v+1\) gives \(v < \dfrac{1}{u-1}\), so \(v \in \left(u-1,\; \dfrac{1}{u-1}\right)\).

Using the antiderivative \(\displaystyle\int \frac{v}{(v+1)^4}\,dv = -\frac{1}{2(v+1)^2} + \frac{1}{3(v+1)^3} + C\):

Case \(0 < u \leq 1\): \(f_U(u) = 4u^3 \displaystyle\int_0^\infty \frac{v}{(v+1)^4}\,dv = 4u^3 \cdot \frac{1}{6} = \frac{2}{3}u^3\).
Case \(1 < u < 2\): evaluating between the limits and simplifying gives \(f_U(u) = \dfrac{2}{3}(2-u)(u^2 + 2u - 2)\).

Check: the two pieces agree at \(u = 1\) (both give \(\frac{2}{3}\)), and \(\displaystyle\int_0^1 \frac{2}{3}u^3\,du + \int_1^2 \frac{2}{3}(2-u)(u^2+2u-2)\,du = \frac{1}{6} + \frac{5}{6} = 1\) \(\checkmark\). The first piece also matches the direct convolution \(\int_0^u 2x \cdot 2(u-x)\,dx = \frac{2}{3}u^3\).
Key results: (a) \(f_{X,Y}(x,y) = 4xy\) on \((0,1)^2\). (b) \(f_{U,V}(u,v) = \dfrac{4u^3 v}{(v+1)^4}\). (c) Support: \(v > 0\), \(0 < u < \min(v+1,\, 1+1/v)\). (d) \(f_U(u) = \frac{2}{3}u^3\) on \((0,1]\) and \(\frac{2}{3}(2-u)(u^2+2u-2)\) on \((1,2)\).
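As a final sanity check on the whole question, one can simulate from \(f(x) = 2x\) directly: the CDF is \(F(x) = x^2\) on \((0,1)\), so \(X = \sqrt{W}\) with \(W \sim \text{Uniform}(0,1)\) by inverse transform. Direct integration gives \(P(U \leq 1) = \int_0^1\!\int_0^{1-x} 4xy\,dy\,dx = \int_0^1 2x(1-x)^2\,dx = \frac{1}{6}\), which a simulation should reproduce. A sketch in Python (standard library only):

```python
import math
import random

random.seed(9)
n = 200_000
count = 0
for _ in range(n):
    # inverse-transform sampling from f(x) = 2x: F(x) = x^2, so X = sqrt(W)
    x = math.sqrt(random.random())
    y = math.sqrt(random.random())
    if x + y <= 1.0:
        count += 1
emp = count / n  # should approximate P(U <= 1) = 1/6 ≈ 0.1667
```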