JOINT DENSITY FUNCTIONS IN STATISTICS

This is a continuation of our previous post on Random Variables. If you missed it, go back and check it out.

A joint density function f(x,y) of this kind can be used to calculate probabilities about X and Y by summing f(x,y) over the appropriate values of x and y, just as in the one-dimensional case.

It is important to note that the new sample space consists of points in the (x,y) plane, and the probabilities are given by f(x,y), where f(x,y) ≥ 0 at every point and the sum of f(x,y) over all points in the sample space is 1.
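To make this concrete, here is a minimal sketch in Python, using a hypothetical joint density (the values and the event X + Y ≤ 1 are invented for illustration):

    # Hypothetical joint density f(x, y) for X in {0, 1} and Y in {0, 1, 2},
    # stored as a dict mapping each (x, y) point to its probability.
    f = {
        (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
        (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
    }

    # Every f(x, y) is non-negative and the total probability is 1.
    assert all(p >= 0 for p in f.values())
    assert abs(sum(f.values()) - 1.0) < 1e-12

    # P(X + Y <= 1): sum f(x, y) over the points satisfying the event.
    prob = sum(p for (x, y), p in f.items() if x + y <= 1)
    print(prob)  # 0.10 + 0.20 + 0.15 = 0.45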

Joint Continuous Density Functions

As with discrete random variables, a joint density function for two or more continuous random variables is a generalization of the density function for a single variable.

Thus, a joint density function for the continuous random variables X and Y is denoted by f(x,y), and probabilities are obtained by integrating f(x,y) over the appropriate region of the (x,y) plane rather than by summing.
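As a sketch of the continuous case, the snippet below uses the hypothetical density f(x,y) = x + y on the unit square (it is non-negative there and integrates to 1) and SciPy's dblquad for the double integration:

    from scipy.integrate import dblquad

    # dblquad expects func(y, x); f(x, y) = x + y is symmetric, so the
    # argument order does not change the result here.
    def f(y, x):
        return x + y

    # Total probability over the sample space [0, 1] x [0, 1] should be 1.
    total, _ = dblquad(f, 0, 1, 0, 1)
    print(total)  # 1.0

    # P(X <= 0.5 and Y <= 0.5): integrate f over the sub-region.
    p, _ = dblquad(f, 0, 0.5, 0, 0.5)
    print(p)  # 0.125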

Marginal and Conditional Distributions

Consider an experiment in which A is the event that the random variable X assumes the value x and B is the event that the random variable Y assumes the value y. By application of the multiplication rule, P(A∩B) = P(A) P(B|A), it is easy to see that f(x,y) = f(x) f(y|x).

Since f(y|x) is the conditional probability that Y assumes the value y given that X has the known fixed value x, the sum of f(y|x) over all possible values of y is equal to 1.
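A minimal sketch of this rule, with invented numbers: build the joint density from a marginal f(x) and a conditional f(y|x), and check that each conditional sums to 1.

    fx = {0: 0.4, 1: 0.6}          # hypothetical marginal of X
    fy_given_x = {                 # hypothetical conditional f(y | x)
        0: {0: 0.50, 1: 0.50},
        1: {0: 0.25, 1: 0.75},
    }

    # For each fixed x, the conditional probabilities over y sum to 1.
    for x, cond in fy_given_x.items():
        assert abs(sum(cond.values()) - 1.0) < 1e-12

    # Multiplication rule: f(x, y) = f(x) * f(y | x).
    joint = {(x, y): fx[x] * fy_given_x[x][y]
             for x in fx for y in fy_given_x[x]}
    print(joint)  # {(0, 0): 0.2, (0, 1): 0.2, (1, 0): 0.15, (1, 1): 0.45}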

∑_y f(x,y) = ∑_y f(x) f(y|x)
           = f(x) ∑_y f(y|x)
           = f(x)

Therefore the marginal distribution of X is given by f(x) = ∑_y f(x,y).

Similarly, the marginal distribution of Y is given by f(y) = ∑_x f(x,y).
This means that if the joint density of two random variables is known, the marginal distribution of either one of them is obtained by summing the joint density function over the values of the other variable.
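A short sketch, reusing the hypothetical joint density built above: each marginal is recovered by summing over the other variable.

    from collections import defaultdict

    joint = {(0, 0): 0.20, (0, 1): 0.20, (1, 0): 0.15, (1, 1): 0.45}

    fx = defaultdict(float)  # f(x) = sum over y of f(x, y)
    fy = defaultdict(float)  # f(y) = sum over x of f(x, y)
    for (x, y), p in joint.items():
        fx[x] += p
        fy[y] += p

    print(dict(fx))  # {0: 0.4, 1: 0.6} -- matches the marginal we started from
    print(dict(fy))  # {0: 0.35, 1: 0.65}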

This result extends conveniently to continuous random variables, with the sums replaced by integrals: f(x) = ∫ f(x,y) dy and f(y) = ∫ f(x,y) dx.
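For instance, a sketch using the hypothetical density f(x,y) = x + y on the unit square from earlier: the marginal of X replaces the sum with an integral over y, giving f(x) = x + 1/2.

    from scipy.integrate import quad

    def f(x, y):
        return x + y  # hypothetical joint density on the unit square

    def marginal_x(x):
        # f(x) = integral of f(x, y) dy over 0 <= y <= 1
        value, _ = quad(lambda y: f(x, y), 0, 1)
        return value

    print(marginal_x(0.5))  # x + 1/2 at x = 0.5 -> 1.0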
