PRESTIGE ED
N1: Foundations of Bivariate Distributions
Node N1 — Section 1

Why This Concept Exists

Bivariate (joint) distributions are the single most tested topic in PTS2, appearing in every paper from 2013 to 2025 with an average of 18–21 marks per exam. They sit at #2 in the leverage ranking (10/10) and are the backbone of everything that follows.

The reason examiners return to this topic relentlessly is structural: bivariate distributions test whether you genuinely understand probability spaces in two dimensions. They force you to simultaneously reason about two random variables, their interaction, and the geometry of the region where they live. This is a far deeper test than univariate calculus — it requires you to understand what probability means in a multi-dimensional setting.

Leverage: N1-N3 (bivariate cluster) collectively account for 35-40% of total marks. These three nodes always appear as Q1 on the final exam, worth approximately 40-50 total marks. Getting bivariate distributions right is the single highest-ROI investment you can make in PTS2.

This node establishes the foundation: what a joint distribution is, how to read and construct joint PMFs and PDFs, how to extract marginal distributions, and how to understand the concept of support — the geometric region where the probability lives. Every subsequent topic in PTS2 builds directly on these fundamentals.


Node N1 — Section 2

Prerequisites

Before engaging with this node, you should be comfortable with the following concepts from PTS1 and first-year mathematics:

  • Univariate random variables: You should know the difference between discrete and continuous random variables, understand PMFs vs PDFs, and be able to compute \(E[X]\), \(\text{Var}(X)\) and the CDF for standard distributions (uniform, exponential, normal, Poisson, binomial).
  • Single-variable integration: Definite integrals, substitution, integration by parts, and partial fractions. You must be able to integrate polynomials, exponentials, and simple trigonometric functions without hesitation.
  • Basic set theory and probability axioms: Sample spaces, complements, unions, intersections, and the addition rule. Understanding that \(P(\Omega)=1\) and that probabilities are non-negative.
  • Cartesian products: The idea that a rectangle in \(\mathbb{R}^2\) can be written as \([a,b] \times [c,d]\), and that independence requires the support to be a Cartesian product of the marginal supports.
  • Partial sums: The ability to sum over one index while holding another fixed (summing rows and columns of a table).
Preparation checklist: If any of the above feels shaky, review before proceeding. Node N1 moves quickly through fundamentals and assumes you can compute \(E[X] = \sum x\cdot P(X=x)\) without needing to re-derive it.

Node N1 — Section 3

Core Exposition

3.1 What Is a Bivariate Distribution?

A bivariate distribution describes the joint behaviour of two random variables \(X\) and \(Y\) defined on the same probability space. Instead of asking "what is the probability that \(X\) takes a certain value?", we ask: "what is the probability that \(X\) takes value \(x\) and \(Y\) takes value \(y\)?"

The joint distribution is specified by its joint PMF (discrete case) or joint PDF (continuous case):

Discrete (PMF): \(f_{X,Y}(x,y) = P(X = x,\, Y = y)\) for all possible pairs \((x,y)\).
Continuous (PDF): \(f_{X,Y}(x,y) \geq 0\) and \(P\big((X,Y) \in A\big) = \displaystyle\iint_A f_{X,Y}(x,y)\,dy\,dx\) for any region \(A \subseteq \mathbb{R}^2\).

3.2 The Two Fundamental Properties

Every valid joint distribution must satisfy two properties:

Non-negativity: \(f_{X,Y}(x,y) \geq 0\) for all \((x,y)\).
Normalisation: \(\displaystyle\sum_x \sum_y f_{X,Y}(x,y) = 1\) (discrete)  or  \(\displaystyle\iint_{\text{all space}} f_{X,Y}(x,y)\,dy\,dx = 1\) (continuous).

The normalisation condition is the most frequently tested property. Examiners will often give you a function with an unknown constant \(c\) and ask you to "show that \(c = \ldots\)" by enforcing the normalisation integral.

3.3 Marginal Distributions

The marginal distribution of \(X\) is obtained by "summing/integrating out" \(Y\):

Discrete: \(f_X(x) = \displaystyle\sum_{y} f_{X,Y}(x,y)\)    (sum over all possible \(y\))
Continuous: \(f_X(x) = \displaystyle\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy\)    (integrate over all possible \(y\))

Similarly for \(Y\): \(f_Y(y) = \sum_x f_{X,Y}(x,y)\) or \(f_Y(y) = \int f_{X,Y}(x,y)\,dx\).
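The summing-out operation is mechanical enough to sketch in a few lines of code. The following is an illustrative Python sketch (not part of the syllabus); the toy PMF and the `marginals` helper are invented for illustration.

```python
# Illustrative sketch: marginals from a joint PMF stored as a dict
# mapping (x, y) pairs to probabilities. The toy PMF is invented.
from collections import defaultdict

joint = {(0, 1): 0.25, (1, 1): 0.25, (1, 2): 0.5}

def marginals(pmf):
    fx, fy = defaultdict(float), defaultdict(float)
    for (x, y), p in pmf.items():
        fx[x] += p  # sum over y with x held fixed
        fy[y] += p  # sum over x with y held fixed
    return dict(fx), dict(fy)

fx, fy = marginals(joint)
print(fx)  # {0: 0.25, 1: 0.75}
print(fy)  # {1: 0.5, 2: 0.5}
```

The same double loop, run once over the whole table, produces both marginals — exactly the row-sum/column-sum picture used in the worked examples below.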

3.4 The Support

The support is the set of all points \((x,y)\) where \(f_{X,Y}(x,y) > 0\). This is a geometric region in \(\mathbb{R}^2\) and it completely determines the integration limits in every calculation.

Critical insight: The support is not decorative. It is the most operationally important part of any joint PDF. If the support is non-rectangular (e.g., triangular or bounded by curves), the integration limits will be functions of the outer variable, not constants.

Two types of supports:

  • Rectangular (Cartesian product): The support is of the form \(a \leq x \leq b,\; c \leq y \leq d\). Integration limits are constants. Necessary (but not sufficient) for independence.
  • Non-rectangular: The support involves a relationship between \(x\) and \(y\), e.g., \(0 < y < x < 1\) (triangular) or \(0 < y < \sqrt{x} < 1\) (curved). Integration limits are functions. X and Y cannot be independent.
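A quick numeric sanity check (a Python sketch, not an exam technique; the density \(8xy\) is chosen for illustration) makes the role of the support concrete: on the triangle \(0 < y < x < 1\), the integral of \(8xy\) reaches 1 only if the computation respects the indicator \(y < x\). A midpoint Riemann sum with that indicator approximates the normalisation integral.

```python
# Midpoint Riemann sum for the integral of 8xy over the triangle
# 0 < y < x < 1. The indicator y < x carves the triangular support
# out of the unit square.
n = 1000
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y < x:                      # support indicator
            total += 8 * x * y * h * h
print(total)  # close to 1.0
```

Deleting the `if y < x` line integrates over the whole square instead, and the total comes out wrong — the numeric analogue of using constant limits on a non-rectangular support.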

3.5 Expected Values and Moments

Expectations of functions of \((X,Y)\) are computed as:

\(E[g(X,Y)] = \displaystyle\sum_x \sum_y g(x,y)\,f_{X,Y}(x,y)\) (discrete)
\(E[g(X,Y)] = \displaystyle\iint g(x,y)\,f_{X,Y}(x,y)\,dy\,dx\) (continuous)

Key special cases: \(E[X]\), \(E[Y]\), \(E[X^2]\), \(E[Y^2]\), \(E[XY]\). The mixed moment \(E[XY]\) is crucial for covariance (covered in N2).

[INTERACTIVE: 2D support visualiser — will be added later]

Node N1 — Section 4

Worked Examples

Example 1: Discrete Bivariate PMF — Full Workthrough

Consider the joint PMF of \(X\) and \(Y\):

The joint PMF table
\[\begin{array}{c|cccc} & x=0 & x=1 & x=2 & x=4 \\ \hline y=1 & 0.15 & 0.12 & 0.10 & 0.08 \\ y=2 & 0.05 & 0.10 & 0.15 & 0.10 \\ y=3 & 0.05 & 0.03 & 0.02 & 0.05 \end{array}\]
Verify this is a valid PMF: sum all entries = 1.00. \(\checkmark\)
Marginal of X: Sum each column.
\(P(X=0) = 0.15+0.05+0.05 = 0.25\), \(P(X=1) = 0.12+0.10+0.03 = 0.25\),
\(P(X=2) = 0.10+0.15+0.02 = 0.27\), \(P(X=4) = 0.08+0.10+0.05 = 0.23\).
Marginal of Y: Sum each row.
\(P(Y=1) = 0.45\), \(P(Y=2) = 0.40\), \(P(Y=3) = 0.15\).
Expectations:
\(E[X] = 0(0.25) + 1(0.25) + 2(0.27) + 4(0.23) = 0 + 0.25 + 0.54 + 0.92 = 1.71\)
\(E[Y] = 1(0.45) + 2(0.40) + 3(0.15) = 0.45 + 0.80 + 0.45 = 1.70\)
\(E[X^2] = 0 + 1(0.25) + 4(0.27) + 16(0.23) = 0.25 + 1.08 + 3.68 = 5.01\)
\(\text{Var}(X) = 5.01 - (1.71)^2 = 5.01 - 2.924 = 2.086\)
Probability of an event \(P(X + Y \leq 3)\) requires summing all pairs where \(x+y \leq 3\):
\((0,1): 0.15,\; (1,1): 0.12,\; (2,1): 0.10,\; (0,2): 0.05,\; (1,2): 0.10,\; (0,3): 0.05\)
Total: \(0.15+0.12+0.10+0.05+0.10+0.05 = 0.57\)
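The arithmetic above can be cross-checked mechanically. This is an optional Python sketch (not an exam technique): store the table as a dict and recompute each quantity.

```python
# Store the Example 1 table as {(x, y): probability} and recompute.
pmf = {
    (0, 1): 0.15, (1, 1): 0.12, (2, 1): 0.10, (4, 1): 0.08,
    (0, 2): 0.05, (1, 2): 0.10, (2, 2): 0.15, (4, 2): 0.10,
    (0, 3): 0.05, (1, 3): 0.03, (2, 3): 0.02, (4, 3): 0.05,
}
assert abs(sum(pmf.values()) - 1.0) < 1e-9  # normalisation check

EX   = sum(x * p for (x, y), p in pmf.items())       # E[X]
EY   = sum(y * p for (x, y), p in pmf.items())       # E[Y]
EX2  = sum(x * x * p for (x, y), p in pmf.items())   # E[X^2]
VarX = EX2 - EX ** 2                                 # Var(X)
p_event = sum(p for (x, y), p in pmf.items() if x + y <= 3)
print(EX, EY, VarX, p_event)  # matches the worked values up to float rounding
```

The event probability is just a filtered sum over the table — the same "identify the qualifying cells, add their masses" routine done by hand above.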

Example 2: Continuous Joint PDF — Finding the Constant

The joint PDF of \(X\) and \(Y\) is given by \(f_{X,Y}(x,y) = c \cdot x y\) for \(0 \leq x \leq 1\), \(0 \leq y \leq 1\), and 0 otherwise.

Find c by normalisation: \[\int_0^1 \int_0^1 c\, xy \, dy\, dx = c \int_0^1 x \left[\frac{y^2}{2}\right]_0^1 dx = c\int_0^1 \frac{x}{2}\,dx = \frac{c}{2}\cdot\frac{1}{2} = \frac{c}{4}\] Set equal to 1: \(c/4 = 1\), so \(c = 4\).
Marginal of X: \[f_X(x) = \int_0^1 4xy \, dy = 4x \cdot \frac{1}{2} = 2x, \quad 0 \leq x \leq 1\]
Marginal of Y (by symmetry): \[f_Y(y) = \int_0^1 4xy \, dx = 2y, \quad 0 \leq y \leq 1\]
Independence check: \(f_X(x) \cdot f_Y(y) = 2x \cdot 2y = 4xy = f_{X,Y}(x,y)\), and the support \([0,1]\times[0,1]\) is a Cartesian product. \(\checkmark\)  X and Y are independent.
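As an optional cross-check (a Python sketch using midpoint Riemann sums, not an exam method), both the normalisation with \(c = 4\) and a sample marginal value \(f_X(0.25) = 2(0.25) = 0.5\) can be verified numerically.

```python
# Midpoint sums on the unit square; mids are the midpoints of n cells.
n = 1000
h = 1.0 / n
mids = [(k + 0.5) * h for k in range(n)]

# Normalisation: the integral of 4xy over [0,1]^2 should be 1.
total = sum(4 * x * y * h * h for x in mids for y in mids)
print(total)  # close to 1.0

# Marginal of X at x0 = 0.25: integrate 4*x0*y over y in [0,1] -> 2*x0.
x0 = 0.25
fx = sum(4 * x0 * y * h for y in mids)
print(fx)  # close to 0.5
```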

Node N1 — Section 5

Pattern Recognition & Examiner Traps

Trap 1: Forgetting the support when computing marginals
The most common fatal error. When the support is non-rectangular, the integration limits are functions of the outer variable, not constants. Using constant limits gives the wrong marginal, and every subsequent calculation is wrong.
WRONG: For \(f_{X,Y}(x,y) = 8xy\) on \(0 < x < y < 1\): \(f_X(x) = \displaystyle \int_0^1 8xy \,dy = 4x\) — the limits \([0,1]\) are constants, ignoring the support.
RIGHT: \(f_X(x) = \displaystyle \int_x^1 8xy \,dy = \big[4xy^2\big]_x^1 = 4x(1-x^2) = 4x - 4x^3\) for \(0 < x < 1\). The lower limit is \(x\), not 0.
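A short numeric check exposes the wrong answer (an optional Python sketch): a marginal density must integrate to 1 over the support of \(X\), and \(4x\) on \((0,1)\) does not.

```python
# A valid marginal must integrate to 1 over the support of X.
n = 10000
h = 1.0 / n
xs = [(k + 0.5) * h for k in range(n)]

wrong = sum(4 * x * h for x in xs)                # integral of 4x on (0,1)
right = sum((4 * x - 4 * x**3) * h for x in xs)   # integral of 4x - 4x^3 on (0,1)
print(wrong, right)  # wrong is close to 2.0 (not a density), right close to 1.0
```

Integrating your candidate marginal and checking it equals 1 is a cheap self-test worth doing in every exam question of this type.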
Trap 2: Not checking that total probability = 1
If an examiner gives you a joint PMF/PDF with an unknown constant, the first thing to do is use the normalisation condition to find it. Students who skip this and proceed with an incorrect \(c\) lose all marks on the problem.
Trap 3: Confusing joint and marginal expectations
\(E[XY]\) must be computed from the joint distribution: \(\iint xy\cdot f_{X,Y}(x,y)\,dy\,dx\). Students sometimes incorrectly write \(E[XY] = E[X]\cdot E[Y]\), which is only true under independence.
Examiner patterns to recognise:
  • "Show that \(c = \ldots\)" — immediately signals normalisation. Use \(\iint f = 1\).
  • "Find the marginal PDF of X" — integrate out Y. Always check if limits depend on x.
  • "Sketch the region where the joint density is positive" — this is the support. Sketch it first before any integration.
  • "Determine whether X and Y are independent" — check both functional factorisation AND Cartesian product support.

Node N1 — Section 6

Connections

N1 connects to every subsequent node in PTS2:
  • → N2 (Conditional & Covariance): Marginals from N1 are used to construct conditional distributions. The joint/marginal framework is needed for covariance calculation.
  • → N3 (Non-Rectangular Supports): N1 introduces the concept of support; N3 deepens it to triangular and curved regions requiring careful double integration.
  • → N4 (Transformations): The support and joint PDF from N1 are the starting point for change-of-variable methods.
  • → N5 (Order Statistics): Requires understanding of joint distributions of the smallest/largest of a sample.
  • → N6-N12 (Inference): Everything from sampling distributions through two-sample tests builds on the probability machinery established here.

In the exam structure, N1 concepts always appear in Question 1 of the final — the first 8-10 marks of a 20-mark question. Getting N1 right means securing the "easy" half of the hardest question on the paper.


Node N1 — Section 7

Summary Table

Concept | Discrete | Continuous | Key Trap
Joint distribution | PMF: \(P(X=x,Y=y)\) | PDF: \(f(x,y) \geq 0\) | Must sum/integrate to 1
Marginal of X | \(\sum_y f_{X,Y}(x,y)\) | \(\int f_{X,Y}(x,y)\,dy\) | Limits depend on support
Expectation | \(\sum\sum g(x,y)f(x,y)\) | \(\iint g(x,y)f(x,y)\,dy\,dx\) | Use joint, not marginals
Support | Set of non-zero pairs | Region in \(\mathbb{R}^2\) | Non-rectangular = dependent
Normalisation | \(\sum\sum f = 1\) | \(\iint f = 1\) | How examiners test you
Independence | \(f_{XY} = f_X \cdot f_Y\) | Same + Cartesian support | MUST check both conditions

Node N1 — Section 8

Self-Assessment

Test your understanding of this node by working through these questions before moving to N2:

Checklist — Can you do all of these?
  • Given a joint PMF table, compute both marginals and verify they sum to 1.
  • Given a joint PDF with an unknown constant, find it via \(\iint f = 1\).
  • Given \(f_{X,Y}(x,y) = cxy\) on \([0,1]\times[0,1]\), find c, marginals, check independence.
  • Explain why non-rectangular support implies dependence.
  • Compute \(E[g(X,Y)]\) for a given \(g\).
  • Compute \(P(X+Y \leq k)\) by identifying the correct region in the support.
  • Sketch the support region from a verbal or algebraic description.
Practice problems to attempt independently
  • If \(f_{X,Y}(x,y) = c(x+y)\) on \(0 \leq x \leq 1\), \(0 \leq y \leq 2\), find c and the marginals.
  • If \(f_{X,Y}(x,y) = k\) on the triangle \(0 \leq x \leq y \leq 1\), find \(P(X+Y \leq 1)\).
  • True or false: If the functional form of \(f_{X,Y}(x,y)\) factorises as \(g(x)h(y)\) on the support, then X and Y are always independent. (Answer: False — the support must also be a Cartesian product. Factorisation of the functional form alone is not enough.)
  • For \(f_{X,Y}(x,y) = 2\) on \(0 \leq x \leq y \leq 1\), compute the marginal of X. [Answer: \(f_X(x) = 2(1-x)\) for \(0 < x < 1\).]
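The last answer can be cross-checked numerically (an optional Python sketch; the `marginal_x` helper is invented for illustration): for each fixed \(x_0\), sum the constant density 2 over the y-slice \(x_0 < y < 1\).

```python
# For f(x, y) = 2 on 0 <= x <= y <= 1, the marginal of X at x0 is the
# integral of the constant 2 over the y-slice {y : x0 < y < 1}.
n = 2000
h = 1.0 / n
ys = [(k + 0.5) * h for k in range(n)]

def marginal_x(x0):
    # hypothetical helper: midpoint sum of f over the slice above x0
    return sum(2 * h for y in ys if y > x0)

for x0 in (0.25, 0.5, 0.75):
    print(x0, marginal_x(x0), 2 * (1 - x0))  # numeric vs closed form 2(1 - x0)
```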

High-Leverage Questions

HLQ: Exam-Style Question with Worked Solution

12 MARKS FINAL 2023 Q1(a-b) COMPUTE

The joint PMF of discrete random variables \(X\) and \(Y\) is given by the table:

\[\begin{array}{c|ccc} f_{X,Y}(x,y) & x=1 & x=2 & x=3 \\ \hline y=0 & \frac{1}{12} & \frac{1}{6} & \frac{1}{12} \\ y=1 & \frac{1}{6} & \frac{1}{12} & \frac{1}{4} \\ y=2 & 0 & \frac{1}{12} & \frac{1}{12} \end{array}\]

(a) Find the marginal PMFs of \(X\) and \(Y\). (4 marks)

(b) Compute \(E[X]\) and \(E[Y]\). (2 marks)

(c) Compute \(E[XY]\) and hence \(\text{Cov}(X,Y)\). (4 marks)

(d) Are X and Y independent? Justify your answer. (2 marks)


Part (a): Marginal PMFs
Marginal of X (sum columns):
\(P(X=1) = \frac{1}{12} + \frac{1}{6} + 0 = \frac{3}{12} = \frac{1}{4}\)
\(P(X=2) = \frac{1}{6} + \frac{1}{12} + \frac{1}{12} = \frac{4}{12} = \frac{1}{3}\)
\(P(X=3) = \frac{1}{12} + \frac{1}{4} + \frac{1}{12} = \frac{5}{12}\)
Check: \(\frac{3}{12} + \frac{4}{12} + \frac{5}{12} = 1\) \(\checkmark\)

Marginal of Y (sum rows):
\(P(Y=0) = \frac{1}{12} + \frac{1}{6} + \frac{1}{12} = \frac{4}{12} = \frac{1}{3}\)
\(P(Y=1) = \frac{1}{6} + \frac{1}{12} + \frac{1}{4} = \frac{6}{12} = \frac{1}{2}\)
\(P(Y=2) = 0 + \frac{1}{12} + \frac{1}{12} = \frac{2}{12} = \frac{1}{6}\)
Check: \(\frac{1}{3} + \frac{1}{2} + \frac{1}{6} = 1\) \(\checkmark\)
Part (b): Expectations
\(E[X] = 1\cdot\frac{1}{4} + 2\cdot\frac{1}{3} + 3\cdot\frac{5}{12} = \frac{3}{12} + \frac{8}{12} + \frac{15}{12} = \frac{26}{12} = \frac{13}{6} \approx 2.167\)
\(E[Y] = 0\cdot\frac{1}{3} + 1\cdot\frac{1}{2} + 2\cdot\frac{1}{6} = 0 + \frac{1}{2} + \frac{2}{6} = \frac{5}{6} \approx 0.833\)
Part (c): E[XY] and Covariance
\(E[XY] = \sum_{x,y} xy\cdot P(X=x,Y=y)\)
Summing the non-zero terms only (the \(y=0\) row contributes nothing, and the entry at \((x,y)=(1,2)\) is zero):
\(= 1\cdot 1\cdot\frac{1}{6} + 2\cdot 1\cdot\frac{1}{12} + 3\cdot 1\cdot\frac{1}{4} + 2\cdot 2\cdot\frac{1}{12} + 3\cdot 2\cdot\frac{1}{12}\)
\(= \frac{1}{6} + \frac{2}{12} + \frac{3}{4} + \frac{4}{12} + \frac{6}{12}\)
\(= \frac{2}{12} + \frac{2}{12} + \frac{9}{12} + \frac{4}{12} + \frac{6}{12} = \frac{23}{12}\)

\(\text{Cov}(X,Y) = E[XY] - E[X]E[Y] = \frac{23}{12} - \frac{13}{6}\cdot\frac{5}{6} = \frac{23}{12} - \frac{65}{36} = \frac{69-65}{36} = \frac{4}{36} = \frac{1}{9}\)
Part (d): Independence
Test: Does \(f_{X,Y}(x,y) = f_X(x)\cdot f_Y(y)\) for ALL \((x,y)\)?
For \((x,y)=(1,2)\): \(f_{X,Y}(1,2) = 0\), but \(f_X(1)\cdot f_Y(2) = \frac{1}{4}\cdot\frac{1}{6} = \frac{1}{24} \neq 0\).
X and Y are not independent.
Summary of answers: (a) See marginals above. (b) \(E[X] = 13/6\), \(E[Y] = 5/6\). (c) \(E[XY] = 23/12\), \(\text{Cov}(X,Y) = 1/9\). (d) Not independent — counterexample at \((x,y)=(1,2)\).
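Because every entry in the table is a simple fraction, the whole solution can be re-run exactly with rational arithmetic. An optional Python sketch (not an exam technique) using `fractions.Fraction`:

```python
# Exact rational recomputation of the HLQ table answers.
from fractions import Fraction as F

pmf = {
    (1, 0): F(1, 12), (2, 0): F(1, 6),  (3, 0): F(1, 12),
    (1, 1): F(1, 6),  (2, 1): F(1, 12), (3, 1): F(1, 4),
    (1, 2): F(0),     (2, 2): F(1, 12), (3, 2): F(1, 12),
}
assert sum(pmf.values()) == 1  # normalisation

EX  = sum(x * p for (x, y), p in pmf.items())      # E[X]  = 13/6
EY  = sum(y * p for (x, y), p in pmf.items())      # E[Y]  = 5/6
EXY = sum(x * y * p for (x, y), p in pmf.items())  # E[XY] = 23/12
cov = EXY - EX * EY                                # Cov   = 1/9
print(EX, EY, EXY, cov)  # 13/6 5/6 23/12 1/9
```

Exact fractions avoid the rounding noise of decimals, which is exactly the form the examiner expects in the final answers.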