Probability Theory and Statistics
Unlock the power of probability and statistics to make data-driven decisions in economics and the social sciences. This rigorous course provides a strong mathematical foundation in probability theory, random variables, distributions, moments, and statistical inference across more than 60 quality lectures, each including a downloadable PDF resource.
Designed for students, researchers, and professionals, the course covers essential topics such as conditional probability, joint distributions, covariance, estimators, and large-sample properties—all with applications relevant to economics, political science, and policy analysis.
With clear explanations and step-by-step problem-solving, this course is perfect for those with a basic background in calculus and linear algebra but little to no prior exposure to probability and statistics.
What you’ll learn:
Probability
The purpose is to give a rigorous introduction to probability. In the first section we look at the foundations, defining experiments, outcomes and events. We then move on to the main section on probability and conditional probability. The chapter concludes with a short section defining independent events.
Random variables
The random variable is probably the most important concept in probability theory. Based on the setup from topic one, we give a somewhat simplified definition of a random variable. A given random variable has associated distribution functions, which help us calculate probabilities; these are analyzed in the second section. We then look at the most important random variable, the standard normal.
Moments
A given random variable has associated moments, which can be calculated from its distribution functions, and we look at moments in the first section. Once we have a random variable X, we can create a new one, Y, by composition, Y = g(X) for some function g; this is the topic of section 2.
Several random variables
Our setup in probability theory is always an experiment, a collection of possible outcomes, a collection of events, and probabilities assigned to each of these events. On top of this we have so far defined a single random variable. What we will do now is define several random variables on top of the same experiment. Defining several random variables is no more mysterious than defining several functions on the same domain. They will simply map a given outcome to different numbers. More on this in the first section. The second section then introduces joint distribution functions for these random variables. We then introduce conditional distribution functions in section three, which will simplify the calculations of conditional probabilities introduced in topic 1. Given several random variables, we can create a new one by taking a function of them. Section 4 looks at this, as well as how to find moments of a random variable defined in this way. Several random variables will also have associated joint moments, and section five looks at the most important joint moment, the covariance. Sections six and seven look at another type of moment, namely conditional moments.
Random vectors
In the previous topic, we considered the case of defining several random variables, although most of it was done using only two random variables. If we want to look at more than two random variables, then it is much more useful to define a k×1 random vector containing k random variables. We will see that many expressions, such as a linear function of several random variables, can be expressed more compactly using vector and matrix notation.
Statistics
The final topic of this course is an introduction to statistics. The basic idea of statistics is to make inference about a population given a sample drawn from this population. We begin by translating the population/sample concepts into a framework consistent with formal probability theory in section one. Section two looks at some important distributions that we often end up using in statistics. Section three looks at the simplest problems in statistics, making inference about the mean and the variance. We then move on to a more general study of statistics, introducing estimators and their small-sample properties. The following section is devoted to the more important, but also more difficult, large-sample properties of estimators. The final section will generalize what we have done so far in this topic by looking at several estimators collected in a vector.
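As a flavor of what is to come (a standard example, not necessarily the one used in the lectures): given a sample X_1, \ldots, X_n drawn from a population with mean \mu, the sample mean
\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i
is an estimator of \mu. It is unbiased, E[\bar{X}] = \mu, and under weak conditions it is also consistent, meaning that \bar{X} converges in probability to \mu as n grows.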
Start building your analytical skills today and gain the statistical expertise needed for research, policy analysis, and data-driven decision-making!
- 1. Experiment and Outcome (Video lesson)
In this section, we introduce the fundamental concepts of probability theory. In probability theory, we always imagine that there is an experiment in the background that will result in precisely one out of a given set of possible outcomes. An event is then defined as a subset of the set of all possible outcomes. We define what we mean by mutually exclusive events (they are disjoint) and describe how new events can be created from existing ones using unions, intersections and complements.
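A small illustrative example of the terminology (ours, not taken from the lectures): for the roll of a die, the set of possible outcomes is \Omega = \{1, 2, 3, 4, 5, 6\}, the event "the roll is even" is A = \{2, 4, 6\}, its complement is A^c = \{1, 3, 5\}, and since A \cap A^c = \emptyset the two events are mutually exclusive.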
- 2. Event (Video lesson)
- 3. Mutually exclusive events (Video lesson)
- 4. Combining events (Video lesson)
- 5. Probability of events (Video lesson)
Probabilities and conditional probabilities
Once we have an experiment and events, we can assign probabilities to each of these events. These must be assigned in a consistent way, resulting in several probability rules. We also look at conditional probabilities: the probability of an event given that we know another event has occurred.
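The two key formulas, stated here only as a preview in standard notation: for events A and B,
P(A \cup B) = P(A) + P(B) - P(A \cap B),
and, provided P(B) > 0,
P(A \mid B) = P(A \cap B) / P(B).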
- 6. Probability rules (Video lesson)
- 7. Conditional probability (Video lesson)
- 9. Random variable (Video lesson)
Random variables
Given an experiment with possible outcomes, a collection of events and probabilities assigned to each of these events, we are in a position to define a random variable. It is difficult to give a formally correct definition of a random variable without spending some time on measure theory which we want to avoid in this course. In this first section we will look at a simplified definition which is close to the complete one. A random variable is defined as a function, a function which will map each outcome to a real number. In order for such a function to be a random variable it must actually satisfy a condition called measurability, which we will not discuss in this course. Once we have defined a random variable, we can look at probabilities that this random variable will take certain values.
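A minimal illustration (our own example, not from the course materials): for a single coin toss with outcomes \Omega = \{H, T\}, the function X with X(H) = 1 and X(T) = 0 is a random variable, and P(X = 1) is simply the probability of the event \{H\}.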
- 10. Random variable and probabilities (Video lesson)
- 11. CDF, cumulative distribution function (Video lesson)
Distribution functions
For a given random variable X, we define its cumulative distribution function (CDF) as the probability that X will take a value smaller than or equal to a given real number x. We will look at two types of random variables in this course: discrete random variables and continuous random variables. All discrete random variables have an associated probability mass function (PMF) and we will look at the relationship between the CDF and the PMF. All continuous random variables have a probability density function (PDF) and we will also look at the relationship between the CDF and the PDF for such random variables.
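In standard notation, stated here as a sketch: the CDF is F_X(x) = P(X \le x); for a discrete random variable F_X(x) = \sum_{k \le x} p_X(k), and for a continuous random variable
F_X(x) = \int_{-\infty}^{x} f_X(t) \, dt,
so that f_X(x) = F_X'(x) wherever the derivative exists.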
- 12. Discrete random variable and probability mass function (Video lesson)
- 13. Discrete random variable: probability mass function versus CDF (Video lesson)
- 14. Continuous random variable and the probability density function (Video lesson)
- 15. The PDF and the CDF of a continuous random variable (Video lesson)
- 17. Expected value of a discrete random variable (Video lesson)
A given random variable has associated moments, which can be calculated from its distribution functions, and we look at moments in the first section. Once we have a random variable X, we can create a new one, Y, by composition, Y = g(X) for some function g; this is the topic ahead. We conclude this chapter with the family of random variables called the normal random variables.
Expected value and variance
We have talked about discrete and continuous random variables and we have talked about distribution functions, from which we can calculate probabilities such as the probability that the random variable will take a value in a certain interval. In this section, we introduce moments of a random variable. A given random variable will have several different moments. The most important moment of a random variable is called the expected value. In this section, we will look at the definition of the expected value of a random variable, how it is calculated from the PDF or the PMF and how it is interpreted. We also look at the second most important moment of a random variable, the variance.
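A preview of the formulas involved (standard definitions): for a discrete random variable E[X] = \sum_x x \, p_X(x), for a continuous one E[X] = \int x \, f_X(x) \, dx, and in both cases
\mathrm{Var}(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2.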
- 18. Expected value of a continuous random variable (Video lesson)
- 19. The variance of a random variable (Video lesson)
- 20. Function of a random variable (Video lesson)
Function of a random variable and its moments
A common situation in probability theory is that we start with a given random variable X and we then define a new random variable Y as a function of the initial random variable X. It is in general possible to find the PDF or the PMF for Y if we know the PDF/PMF of X but this is typically a difficult problem. It turns out that we can often quite easily find the expected value of Y from the PDF/PMF of X without knowing the PDF/PMF of Y and we will look at precisely how this is done in this section. Finding the expected value and the variance of Y when Y is a linear function of X turns out to be particularly simple.
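Roughly, the results look like this (standard statements, given only for orientation): E[g(X)] = \sum_x g(x) \, p_X(x) in the discrete case or \int g(x) f_X(x) \, dx in the continuous case, and for a linear function
E[aX + b] = a E[X] + b, \qquad \mathrm{Var}(aX + b) = a^2 \, \mathrm{Var}(X).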
- 21. Expected value of a function of a random variable (Video lesson)
- 22. The expected value and variance of a linear function of a random variable (Video lesson)
- 23. The normal distribution (Video lesson)
The normal random variables
In this final section we look at an entire family of random variables called the normal random variables, to which the standard normal random variable belongs. Given any expected value and any positive variance, this family contains a random variable with that particular expected value and variance. The family has many other useful properties which we will look at later on in the course.
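For orientation (a standard fact about this family): if Z is standard normal and X = \mu + \sigma Z with \sigma > 0, then X is normal with E[X] = \mu and \mathrm{Var}(X) = \sigma^2, written X \sim N(\mu, \sigma^2).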
- 24. Several random variables (Video lesson)
Given several random variables, we can define events involving some or all of the random variables and assign probabilities to these events. This will allow us to calculate probabilities involving some or all of the random variables such as the probability that X = 2 and Y = 1.
- 25. The joint probability mass function (Video lesson)
- 26. The marginal probability mass function (Video lesson)
- 27. The joint probability density function (Video lesson)
Distribution functions for several random variables
With two random variables, the probability mass function, which was a function of one variable, is replaced with the joint probability mass function, a function of two variables. The probability density function is similarly replaced with the joint probability density function. Joint distribution functions can be used to calculate probabilities involving both of the random variables. In many cases, we are given a joint distribution function but want to calculate a probability involving only one of the random variables. In this case, we must find the marginal PDF/PMF, which is then a function of only one variable. Finally, we will define what we mean by two random variables being independent.
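In symbols (standard definitions, shown as a sketch): the joint PMF is p_{X,Y}(x, y) = P(X = x, Y = y), the marginal is p_X(x) = \sum_y p_{X,Y}(x, y) (an integral replaces the sum in the continuous case), and X and Y are independent when p_{X,Y}(x, y) = p_X(x) \, p_Y(y) for all x and y.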
- 28. The marginal probability density function (Video lesson)
- 29. Independent random variables (Video lesson)
- 30. Conditional probability mass function (Video lesson)
Conditional distribution functions
Early in the course we talked about conditional probabilities. With two random variables, we can define the conditional probability mass function as the probability that the first random variable takes a particular value given that the second random variable takes another given value. For continuous random variables, we will define the conditional probability density function, or the conditional PDF. In a problem we will illustrate that if the random variables are independent, then the conditional PDF/PMF will be equal to the marginal PDF/PMF.
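The defining formula, in standard notation: whenever p_X(x) > 0,
p_{Y \mid X}(y \mid x) = p_{X,Y}(x, y) / p_X(x),
with the analogous expression f_{Y \mid X}(y \mid x) = f_{X,Y}(x, y) / f_X(x) for densities; under independence these reduce to the marginals.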
- 31. Conditional probability density function (Video lesson)
- 32. Function of several random variables (Video lesson)
Function of several random variables and its moments
In chapter 3 we talked about the importance of being able to define a new random variable as a function of an existing random variable. If we have two random variables, we can define a third random variable as a function of the two existing ones. More generally, we can create a new random variable as a function of an arbitrary number of existing random variables. As was the case when we talked about a function of a single random variable, finding the moments of a random variable defined as a function of existing ones is generally a simpler problem than finding its PDF/PMF, particularly if the function is linear.
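A rough preview in standard notation: E[g(X, Y)] = \sum_x \sum_y g(x, y) \, p_{X,Y}(x, y) in the discrete case, and for the linear case E[aX + bY] = a E[X] + b E[Y].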
- 33. Expected value of a function of several random variables (Video lesson)
- 34. Linear function of several random variables (Video lesson)
- 35. Covariance, correlation and independence (Video lesson)
Covariance and correlation
With two or more random variables, we can define what are called joint moments. These are moments that the random variables carry as a group, and they can be calculated from the joint PDF/PMF. The most important joint moment of two random variables is the covariance. We will begin by introducing covariance, correlation and independence from a nontechnical point of view. We will then look at the expected value of the product of two random variables, which will help us understand the definition of covariance. We will end this section by looking at some important results related to the covariance between two random variables.
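The central definitions, for orientation (standard notation):
\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X] E[Y],
and the correlation \mathrm{Corr}(X, Y) = \mathrm{Cov}(X, Y) / \sqrt{\mathrm{Var}(X) \, \mathrm{Var}(Y)}, which always lies between -1 and 1.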
- 36. The expected value of the product of two random variables (Video lesson)
- 37. Covariance (Video lesson)
- 38. Covariance, results (Video lesson)
- 39. Conditional expectations, discrete random variables (Video lesson)
Conditional expectation
We have talked about the expected value of a random variable and we have talked about the conditional PDF/PMF. Given two random variables, we can define the conditional expectation of one of them given the other one. Intuitively, the conditional expectation of Y given X is the value that we expect for Y if we know the value that the random variable X took. The conditional expectation can be calculated from the conditional PDF/PMF.
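In symbols (a sketch in standard notation): E[Y \mid X = x] = \sum_y y \, p_{Y \mid X}(y \mid x) in the discrete case, or \int y \, f_{Y \mid X}(y \mid x) \, dy in the continuous case.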
- 40. Conditional expectations, continuous random variables (Video lesson)
- 41. Law of iterated expectations (Video lesson)
Law of iterated expectation and conditional moments
We begin this section with the important law of iterated expectations, which connects the regular unconditional expected value to the conditional expected value. Just as it is possible to find the expected value of a function of a random variable, we can find the conditional expected value of a function of a random variable by evaluating a sum or an integral. Finally, we define the conditional variance of a random variable Y given another random variable X.
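The two key identities, stated compactly in standard notation:
E[Y] = E[ E[Y \mid X] ],
and \mathrm{Var}(Y \mid X) = E[(Y - E[Y \mid X])^2 \mid X] = E[Y^2 \mid X] - (E[Y \mid X])^2.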
- 42. Conditional expectation of a function of a random variable (Video lesson)
- 43. Conditional variance (Video lesson)
- 44. Random vectors (Video lesson)
Random vectors and functions of random vectors
The first section is basically a generalization of much of chapter four using vector notation. It generalizes the various distribution functions and functions of random variables to the vector setting.
- 45. Function of a random vector (Video lesson)
- 46. The expected value of a random vector (Video lesson)
Moments of random vectors
In this section we will begin by defining the expected value of a random vector. Basically, it will be a new vector containing the expected value of each random variable making up our random vector. We then move on to the definition of the variance of a random vector, which will turn out to be a matrix. We end this section by looking at the conditional expectations of a random vector.
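As a sketch in standard notation: for a k×1 random vector X with mean vector \mu = E[X], the variance is the k×k matrix \mathrm{Var}(X) = E[(X - \mu)(X - \mu)'], whose diagonal elements are the variances \mathrm{Var}(X_i) and whose off-diagonal elements are the covariances \mathrm{Cov}(X_i, X_j).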
- 47. The variance of a random vector (Video lesson)
- 48. Conditional expectations and random vectors (Video lesson)
