Continuous Random Variable

To understand probability distributions and conduct statistical analysis, we need to understand the key properties of a continuous random variable. Several important properties distinguish continuous random variables from discrete ones. They include:
 

Probability Density Function

A continuous random variable X is defined by its probability density function (PDF) f(x). The value f(x) represents the relative likelihood that X takes a value near x. The PDF must satisfy two conditions. First, f(x) ≥ 0 for all x: the density must always be non-negative, though it may be zero for some values. Second, the total probability over all possible values of x must equal 1; since X is continuous, this total is an integral rather than a sum. This is written as:
  

\(\int_{-\infty}^{\infty} f(x)\, dx = 1 \)
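As a quick numerical sketch of these two conditions (assuming SciPy is available; the exponential density with rate λ = 2 is just an illustrative choice):

```python
import math
from scipy.integrate import quad

lam = 2.0  # illustrative rate parameter

def f(x):
    # Exponential PDF: lam * exp(-lam * x) for x >= 0, zero otherwise
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Condition 1: f(x) >= 0 everywhere (spot-checked on a grid)
assert all(f(-5 + 0.1 * k) >= 0 for k in range(101))

# Condition 2: the density integrates to 1 (f is zero for x < 0,
# so integrating from 0 covers the whole support)
total, _ = quad(f, 0, math.inf)
print(total)  # ≈ 1.0
```

Any candidate density that fails either check is not a valid PDF.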

Cumulative Distribution Function (CDF)

The cumulative distribution function (CDF), represented as F(x), shows the likelihood that a continuous random variable X will take a value that is less than or equal to x: 

\(F(x) = P(X \leq x) = \int_{-\infty}^{x} f(t)\,dt \)

The CDF is non-decreasing and continuous. The CDF approaches 0 as x moves closer to negative infinity.

\(\lim_{x \to -\infty} F(x) = 0\)

Likewise, the CDF approaches 1 as x moves closer to positive infinity.  

\(\lim_{x \to \infty} F(x) = 1\)
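These properties can be sketched numerically (again assuming SciPy, with the same illustrative exponential density of rate λ = 2, whose CDF has the closed form 1 − e^(−λx)):

```python
import math
from scipy.integrate import quad

lam = 2.0  # illustrative rate parameter

def f(x):
    # Exponential PDF: zero for negative x
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def F(x):
    # CDF as the integral of the PDF up to x (the support starts at 0)
    if x <= 0:
        return 0.0
    val, _ = quad(f, 0, x)
    return val

# Compare the integral against the closed form F(x) = 1 - exp(-lam * x)
print(F(1.5), 1 - math.exp(-lam * 1.5))

# Limit behaviour: F approaches 0 far to the left and 1 far to the right
print(F(-10), F(50))  # ≈ 0.0 and ≈ 1.0
```

The non-decreasing property follows directly from f(x) ≥ 0: adding more area under a non-negative curve can never lower the total.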

Moment Generating Function (MGF)

The moment generating function (MGF) of a continuous random variable X is represented as \(M_X(t)\). It is defined as:


\(M_X(t) = E\!\left(e^{tX}\right) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx \)

Provided the MGF exists in a neighborhood of t = 0, it generates all the moments of X: the nth moment is the nth derivative of \(M_X(t)\) evaluated at t = 0, from which the mean and variance follow.

Characteristic Function

The characteristic function of a continuous random variable X, written \(\varphi_X(t)\), is the Fourier transform of the probability density function (PDF):


\(\varphi_X(t) = E\!\left(e^{itX}\right) = \int_{-\infty}^{\infty} e^{itx} f(x)\,dx\)

Unlike the MGF, the characteristic function exists for every distribution, and it determines the distribution of X uniquely.
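Because \(e^{itx}\) is complex-valued, a numerical sketch splits the integral into real and imaginary parts (SciPy's `quad` handles real integrands only). The standard normal, whose characteristic function is known to be \(e^{-t^2/2}\), serves as a check:

```python
import math
from scipy.integrate import quad

def f(x):
    # Standard normal PDF
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def phi(t):
    # Split e^{itx} = cos(tx) + i*sin(tx) and integrate each part
    re, _ = quad(lambda x: math.cos(t * x) * f(x), -math.inf, math.inf)
    im, _ = quad(lambda x: math.sin(t * x) * f(x), -math.inf, math.inf)
    return complex(re, im)

# Known closed form for the standard normal: phi(t) = exp(-t^2 / 2)
t = 1.0
print(phi(t), math.exp(-t * t / 2))
```

The imaginary part vanishes here because the standard normal is symmetric about zero; for asymmetric densities it is nonzero.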

Mean and Variance of Continuous Random Variable

A continuous random variable X with PDF f(x) has the following expectation or mean:

\(E(X) = \int_{-\infty}^{\infty} x f(x)\,dx \)

This integral is the probability-weighted average of the possible values of X.
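For instance, with the Uniform(0, 1) density (a deliberately simple illustration), the integral reduces to the familiar mean of 1/2; a numerical sketch with SciPy:

```python
from scipy.integrate import quad

def f(x):
    # Uniform(0, 1) PDF: constant 1 on [0, 1], zero elsewhere
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

# E(X) = integral of x * f(x) dx; the support is [0, 1]
mean, _ = quad(lambda x: x * f(x), 0, 1)
print(mean)  # ≈ 0.5
```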

The variance of X measures the average of the squared deviations of the random variable from the mean. It is defined as:

\(\operatorname{Var}(X) = E\!\left[(X - E(X))^{2}\right] = \int_{-\infty}^{\infty} (x - \mu)^{2} f(x)\,dx \)

Here, μ = E(X) is the mean.

Equivalently, the variance can be computed as \(\operatorname{Var}(X) = E(X^2) - \mu^2\), where the second moment E(X²) of X is:

\(E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\,dx \)
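Both routes to the variance can be checked numerically. For the Uniform(0, 1) density (used here purely as an illustration), each should give 1/12:

```python
from scipy.integrate import quad

def f(x):
    # Uniform(0, 1) PDF
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

# Mean: mu = E(X)
mu, _ = quad(lambda x: x * f(x), 0, 1)

# Variance by the definition: E[(X - mu)^2]
var_def, _ = quad(lambda x: (x - mu) ** 2 * f(x), 0, 1)

# Variance via the second moment: Var(X) = E(X^2) - mu^2
ex2, _ = quad(lambda x: x ** 2 * f(x), 0, 1)
var_shortcut = ex2 - mu ** 2

print(var_def, var_shortcut)  # both ≈ 1/12
```

The second-moment form is often more convenient because E(X²) and μ can be computed in a single pass over the density.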