# Math 30530 - Introduction to Probability

## Fall 2013

### Instructor: David Galvin

General course arrangements are detailed here.

Supplemental course material:

Slides on Limit laws (weak law of large numbers, central limit theorem) (December 6)

Michael Huber and Andrew Glen's paper justifying the use of the exponential random variable to model the waiting time to the next occurrence of a rare event in baseball (November 4)

Slides on miscellaneous discrete random variables (September 27)

Slides on wars and the Poisson random variable (September 25)

Slides on discrete random variables (September 18)

Slides on counting (September 13)

Slides on independence (September 11)

Slides on Bayes' formula (September 9)

Slides on conditional probability examples (September 6)

Slides on the paradox of the two children (September 4)

Slides on continuous uniform probability model (September 2)

Slides on discrete models of probability (August 30)

Slides on basics of probability (August 28)

If you haven't yet got a copy of the textbook, you can print off the first chapter here.

Homework:

Assignment 11, not to be turned in. Here are some questions from Chapter 5, suitable for preparing for the final: Problems 5.1 (parts a and b only), 5.4, 5.8, 5.9, 5.10, and 5.11.

Assignment 10 due Monday, December 9. Solutions are here; here are supplementary files 1 and 2.

Assignment 9, not to be turned in (practice for second midterm). Solutions are here; here is the supplementary file.

Assignment 8 due Friday, November 15. Solutions are here; here is the supplementary file.

Assignment 7 due *Wednesday*, November 6. Solutions are here; here are supplementary files 1 and 2.

Assignment 6 due Friday, October 11. Here are solutions.

Assignment 5 due Monday, October 7. [CORRECTED VERSION: typo in problem 11 fixed.] Here are solutions, and here is the supplementary solutions file.

Assignment 4 due Friday, September 27. Here are solutions, and here is the supplementary solutions file.

Assignment 3 due Friday, September 20. Here are solutions, and here is the supplementary solutions file.

Assignment 2 due Friday, September 13. Here are solutions.

Assignment 1 due Friday, September 6. Here are solutions.

Quizzes:

Quiz 5 (December 9).

Quiz 4 (November 4).

Quiz 3 (October 9).

Quiz 2 (September 25).

Quiz 1 (September 11).

Exams:

Exam 2: Here is the second midterm, with solutions. Details: in class, Friday November 22, closed book, covering the following sections:

• 2.7 (Independence of discrete random variables)
• 3.1 (Density functions of continuous random variables, including expectation, variance and expectation of a function)
• 3.2 (The cumulative distribution function of a random variable)
• 3.3 (The normal random variable)
• 3.4 (Joint density of two continuous random variables)
• 3.5 (*only* example 3.13 [memorylessness of exponential] and the final subsection, starting page 175, on "Independence")
• 4.1 (Finding the density of a function of a continuous random variable)
There is a lot of material, and a lot of formulae, in Section 4.1. I don't expect you to know these formulae. What I expect is that you can compute the density of a function g(X) (or g(X,Y)) of a known random variable X (or rvs X, Y) by following these steps:
1. figure out the range of possible values of g(X) (or g(X,Y)) (lowest possible value and highest possible value)
2. for each possible value y of g(X), figure out how to express P(g(X) \leq y) as a probability of an event involving X alone (something of the form P(X \in A))
3. use the density function of X to compute these probabilities, and so calculate the CDF of g(X) (or g(X,Y))
4. calculate the density of g(X) (or g(X,Y)) by differentiating the CDF.
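Here is a worked instance of these four steps (my own example, not one from the textbook): take X uniform on (0,1) and g(X) = X^2.

```latex
\begin{align*}
&\text{1. Range: } Y = X^2 \text{ takes values in } (0,1).\\
&\text{2. For } 0 < y < 1:\quad P(Y \le y) = P(X^2 \le y) = P(X \le \sqrt{y}).\\
&\text{3. CDF: } F_Y(y) = \int_0^{\sqrt{y}} 1 \, dx = \sqrt{y}.\\
&\text{4. Density: } f_Y(y) = \frac{d}{dy}\sqrt{y} = \frac{1}{2\sqrt{y}}, \qquad 0 < y < 1.
\end{align*}
```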
Here are the basic families of continuous random variables that you should know about. For each, you should know the situation when it is appropriate to use it, the parameters it depends on, what the density function is, and what the expectation and variance are.
• Uniform
• Exponential
• Normal (standard and general; you should know how to read a standard normal table)
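For quick reference, here is a summary of the standard facts for these three families:

```latex
\begin{array}{llll}
\text{Family} & \text{Density } f(x) & E[X] & \mathrm{Var}(X)\\[2pt]
\text{Uniform}(a,b) & \frac{1}{b-a},\quad a \le x \le b & \frac{a+b}{2} & \frac{(b-a)^2}{12}\\[2pt]
\text{Exponential}(\lambda) & \lambda e^{-\lambda x},\quad x \ge 0 & \frac{1}{\lambda} & \frac{1}{\lambda^2}\\[2pt]
\text{Normal}(\mu,\sigma^2) & \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)} & \mu & \sigma^2
\end{array}
```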
Here is a practice exam. Solutions are here.
Because a lot is going on this week, some of my office hours will have to move. Here are the new hours:
• Wednesday, 3-5, DeBartolo 136
• Friday, 10-12.15, DeBartolo 209

Exam 1: Here is the first midterm. Details: in class, Monday October 14, closed book, covering the following sections:

• 1.1 (Sets and set operations)
• 1.2 (Probability models and the rules/laws/axioms of probability [NOT the section "Models and Reality"])
• 1.3 (Conditional probability)
• 1.4 (Law of total probability, and Bayes' formula)
• 1.5 (Independence of events [NOT the section "Conditional Independence"])
• 1.6 (Counting problems)
• 2.1 (Random variables, basic notions)
• 2.2 (The probability mass function, examples of basic families of random variables)
• 2.3 (Functions of random variables)
• 2.4 (Expectation, mean and variance of random variables, expectations of functions of random variables)
• 2.5 (Joint mass functions of more than one random variable)
Here are the basic families of random variables that you should know about. For each, you should know the situation when it is appropriate to use it, the parameters it depends on, and what the mass function is. For some of them, you should know the expectation and/or variance; I've indicated which ones we've done in the list below.
• Bernoulli (expectation, variance)
• Binomial (expectation)
• Poisson (expectation, variance)
• Geometric (expectation, variance)
• Hypergeometric
• Negative Binomial
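For reference, here is a table of the standard facts for the first four families (not every entry was derived in class; the list above indicates which ones we did):

```latex
\begin{array}{llll}
\text{Family} & P(X=k) & E[X] & \mathrm{Var}(X)\\[2pt]
\text{Bernoulli}(p) & P(X=1)=p,\ P(X=0)=1-p & p & p(1-p)\\[2pt]
\text{Binomial}(n,p) & \binom{n}{k} p^k (1-p)^{n-k},\quad k=0,\dots,n & np & np(1-p)\\[2pt]
\text{Poisson}(\lambda) & e^{-\lambda}\frac{\lambda^k}{k!},\quad k=0,1,2,\dots & \lambda & \lambda\\[2pt]
\text{Geometric}(p) & (1-p)^{k-1}p,\quad k=1,2,\dots & \frac{1}{p} & \frac{1-p}{p^2}
\end{array}
```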
Here are the extra office hours that I will have for this exam:
• Thursday: 3pm-4pm (usual office hours)
• Friday: 3.30-5pm
• Monday: 9.30-11.30
Here is a practice exam. Solutions are here.

Final: The final will be held as follows:

1.45--3.45pm, DeBartolo 217, Wednesday December 18

Note that this is not the usual room!

What is covered? The final is cumulative. It will cover:

• 1.1 (Sets and set operations)
• 1.2 (Probability models and the rules/laws/axioms of probability [NOT the section "Models and Reality"])
• 1.3 (Conditional probability)
• 1.4 (Law of total probability, and Bayes' formula)
• 1.5 (Independence of events [NOT the section "Conditional Independence"])
• 1.6 (Counting problems)
• 2.1 (Random variables, basic notions)
• 2.2 (The probability mass function, examples of basic families of random variables)
• 2.3 (Functions of random variables)
• 2.4 (Expectation, mean and variance of random variables, expectations of functions of random variables)
• 2.5 (Joint mass functions of more than one random variable)
• 2.7 (Independence of discrete random variables)
• 3.1 (Density functions of continuous random variables, including expectation, variance and expectation of a function)
• 3.2 (The cumulative distribution function of a random variable)
• 3.3 (The normal random variable)
• 3.4 (Joint density of two continuous random variables)
• 3.5 (*only* example 3.13 [memorylessness of exponential] and the final subsection, starting page 175, on "Independence")
• 4.1 (Finding the density of a function of a continuous random variable --- see note below)
• 4.2 (Covariance and correlation coefficient of a pair of random variables)
• 4.4 (Computing the transform (moment generating function) of a random variable, and using it to calculate expectation and variance)
• 5.1 (Just Chebyshev's inequality)
• 5.2 (The weak law of large numbers)
• 5.4 (The Central Limit Theorem)
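For quick reference, here are the standard statements of the three results from Chapter 5 (for i.i.d. X_1, X_2, ... with mean \mu and variance \sigma^2):

```latex
\begin{align*}
&\text{Chebyshev's inequality: } P(|X - \mu| \ge a) \le \frac{\sigma^2}{a^2} \quad (a > 0).\\
&\text{Weak law of large numbers: } P\!\left(\left|\frac{X_1+\cdots+X_n}{n} - \mu\right| \ge \varepsilon\right) \to 0 \text{ as } n \to \infty, \text{ for each } \varepsilon > 0.\\
&\text{Central Limit Theorem: } P\!\left(\frac{X_1+\cdots+X_n - n\mu}{\sigma\sqrt{n}} \le z\right) \to \Phi(z) \text{ as } n \to \infty.
\end{align*}
```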

Note on Section 4.1: There is a lot of material, and a lot of formulae, in Section 4.1. I don't expect you to know these formulae. What I expect is that you can compute the density of a function g(X) (or g(X,Y)) of a known random variable X (or rvs X, Y) by following these steps:

1. figure out the range of possible values of g(X) (or g(X,Y)) (lowest possible value and highest possible value)
2. for each possible value y of g(X), figure out how to express P(g(X) \leq y) as a probability of an event involving X alone (something of the form P(X \in A))
3. use the density function of X to compute these probabilities, and so calculate the CDF of g(X) (or g(X,Y))
4. calculate the density of g(X) (or g(X,Y)) by differentiating the CDF.
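A good way to sanity-check a CDF computed this way is a quick simulation. The sketch below (my own illustrative example, not part of the course material) takes X uniform on (0,1) and g(X) = X^2, for which steps 1-3 give F_Y(y) = \sqrt{y} on (0,1), and compares that formula against simulated values.

```python
import numpy as np

# Sanity-check the CDF-method answer for Y = g(X) = X^2, X ~ Uniform(0,1).
# Steps 1-3 give F_Y(y) = P(X <= sqrt(y)) = sqrt(y) for 0 < y < 1.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=200_000)
y = x ** 2

for t in (0.09, 0.25, 0.64):
    empirical = np.mean(y <= t)   # simulated P(Y <= t)
    formula = np.sqrt(t)          # F_Y(t) from the CDF method
    print(f"P(Y <= {t}): simulated {empirical:.3f}, formula {formula:.3f}")
```

If the derived CDF is correct, the simulated and formula values agree to within simulation error (a few thousandths at this sample size).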

Common random variables: Here are the basic families of random variables that you should know about. For each, you should know:

1. whether it is discrete or continuous
2. the situation when it is appropriate to use it
3. the parameters it depends on
4. what its mass function is (if it is discrete) and what its density function is (if it is continuous).
For many of them, you should know the expectation and/or variance and/or how to compute the transform; I've indicated which ones we've done in the list below.
• Bernoulli (expectation, variance, transform)
• Binomial (expectation, variance, transform)
• Poisson (expectation, variance, transform)
• Geometric (expectation, variance, transform)
• Hypergeometric (expectation)
• Negative Binomial (expectation, variance)
• Continuous Uniform (expectation, variance)
• Exponential (expectation, variance, transform)
• Normal [both standard and general; and you should know how to read a standard normal table] (expectation, variance, transform)
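As an example of the transform computations in this list, here is the exponential case worked out (a standard calculation):

```latex
M_X(t) = E\!\left[e^{tX}\right] = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx
       = \frac{\lambda}{\lambda - t} \quad (t < \lambda),
```

and differentiating at t = 0 recovers the moments:

```latex
E[X] = M_X'(0) = \frac{1}{\lambda}, \qquad
E[X^2] = M_X''(0) = \frac{2}{\lambda^2}, \qquad
\mathrm{Var}(X) = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.
```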

What's the format? Similar to the practice exams below, and similar to the midterms.

Resources for preparation: Apart from reviewing old quizzes, homeworks, exams and practice exams, here are a couple of practice finals:

1. Here's the final exam from this course from Fall 2011.
2. Here's a slightly modified version of the final exam from this course from Fall 2012.
And here are my (tentative) office hours before the final (subject to change):
1. Thursday, 3-4.30pm
2. Monday, 4-5.30pm
3. Tuesday, 12-1.30pm
4. Wednesday, 11.30-1pm