
CS544 D1
Foundations of Analytics
Lecture 7

Guanglan Zhang
1

Discrete Distributions

A probability distribution for a discrete random variable, X, is defined in terms of a list of all possible numerical outcomes along with the probability of each outcome.

The list of all outcomes is known as the support of the random variable, S_X.

The probability mass function (PMF) of a discrete random variable X is denoted by f(x), and defined as
f(x) = P(X = x), for each x in S_X

The mean or the expected value of the probability distribution of the discrete random variable X is:
μ = E(X) = Σ x f(x), summed over all x in S_X
The variance of the discrete distribution is:
σ² = E[(X − μ)²] = Σ (x − μ)² f(x)
The alternative formula is
σ² = E(X²) − μ² = Σ x² f(x) − μ²

The cumulative distribution function (CDF) of the random variable X is defined as:
F(x) = P(X ≤ x) = Σ f(t), summed over all t in S_X with t ≤ x
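The definitions above can be computed directly for any small PMF. A minimal worked example (the support and probabilities below are my own illustration, not from the slides):

```python
# X takes values 1..4 with hand-picked probabilities. The mean, both
# variance formulas, and the CDF are computed straight from the
# definitions above.

support = [1, 2, 3, 4]                    # S_X
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}    # f(x) = P(X = x); sums to 1

mean = sum(x * pmf[x] for x in support)                      # mu = E(X)
var = sum((x - mean) ** 2 * pmf[x] for x in support)         # sigma^2
var_alt = sum(x * x * pmf[x] for x in support) - mean ** 2   # E(X^2) - mu^2

def cdf(x):
    """F(x) = P(X <= x): add up f(t) over all t in S_X with t <= x."""
    return sum(pmf[t] for t in support if t <= x)

print(mean)     # ≈ 3.0
print(var)      # ≈ 1.0; var_alt agrees up to floating-point rounding
print(cdf(2))   # ≈ 0.3
```

Note that the definitional and alternative variance formulas give the same value, as the algebra on the slide promises.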

2

Discrete Uniform Distributions

A discrete uniform distribution is a symmetric probability distribution where a finite number of input values are equally likely.

If a discrete random variable X has m input values 1, 2, …, m, then X has the discrete uniform distribution when P(X = x) = 1/m, for all values of x from 1 to m.

The probability mass function (PMF) of a random variable X with discrete uniform distribution is:
f(x) = 1/m, for x = 1, 2, …, m

The cumulative distribution function (CDF) of X, for a given value of x, is the probability F(x) = P(X ≤ x). For the uniform distribution:
F(x) = x/m, for x = 1, 2, …, m
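As a minimal sketch, the uniform PMF and CDF can be written as two one-line functions (a fair six-sided die when m = 6; the function names are my own):

```python
# Discrete uniform distribution on {1, ..., m}.

m = 6

def unif_pmf(x, m):
    """f(x) = 1/m for x = 1, ..., m, else 0."""
    return 1 / m if x in range(1, m + 1) else 0.0

def unif_cdf(x, m):
    """F(x) = P(X <= x) = x/m for integer x in 1, ..., m."""
    return x / m

print(unif_pmf(3, m))   # ≈ 0.1667
print(unif_cdf(4, m))   # ≈ 0.6667
print(sum(unif_pmf(x, m) for x in range(1, m + 1)))  # ≈ 1.0
```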

3

Discrete Uniform Distributions

The mean of the discrete uniform distribution is:
μ = Σ x (1/m) = (1/m) [m(m + 1)/2] = (m + 1)/2

The variance of the discrete uniform distribution is:
σ² = E(X²) − μ² = (1/m) [m(m + 1)(2m + 1)/6] − [(m + 1)/2]² = (m² − 1)/12
(using the sum of the sequence of squares, 1² + 2² + … + m² = m(m + 1)(2m + 1)/6)
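The closed forms μ = (m + 1)/2 and σ² = (m² − 1)/12 can be checked by direct summation over the support; m = 10 below is an arbitrary choice of mine:

```python
# Verify the uniform-distribution closed forms by brute-force summation.

m = 10
xs = range(1, m + 1)

mean_direct = sum(x * (1 / m) for x in xs)                      # sum of x f(x)
var_direct = sum(x * x * (1 / m) for x in xs) - mean_direct**2  # E(X^2) - mu^2

print(mean_direct, (m + 1) / 2)       # both ≈ 5.5
print(var_direct, (m * m - 1) / 12)   # both ≈ 8.25
```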

4

Binomial Coefficients

The binomial coefficient is the number of ways of picking x unordered outcomes from n possibilities, also known as a combination or combinatorial number.

The binomial coefficient, written C(n, x) or nCx (read as “n choose x”), is defined as:
C(n, x) = n! / [x! (n − x)!]

where n is a positive integer, and x is a nonnegative integer less than or equal to n.
For a positive integer n, the binomial theorem gives
(a + b)^n = Σ C(n, x) a^x b^(n−x), summed over x = 0, 1, …, n
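In Python, `math.comb(n, x)` computes C(n, x) directly. As a sanity check of the binomial theorem, we can expand (a + b)^n term by term and compare (n = 5, a = 2, b = 3 are arbitrary values of mine):

```python
import math

# C(n, x) = n!/(x!(n-x)!), and the binomial theorem expanded term by term.

n, a, b = 5, 2, 3
lhs = (a + b) ** n
rhs = sum(math.comb(n, x) * a**x * b**(n - x) for x in range(n + 1))

print(math.comb(5, 2))  # 10
print(lhs, rhs)         # 3125 3125
```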

5

Bernoulli Trials

The Bernoulli trials process, named after the mathematician Jacob Bernoulli, is one of the simplest yet most important random processes in probability.

A Bernoulli trial is a random experiment in which there are only two possible outcomes—success (S) and failure (F). Repeated trials of an experiment are called Bernoulli trials.

A sequence of Bernoulli trials satisfies the following assumptions:
Each trial has two possible outcomes, in the language of reliability called success and failure.
The trials are independent. Intuitively, the outcome of one trial has no influence over the outcome of another trial.
On each trial, the probability of success is p and the probability of failure is 1 − p, where p is the success parameter of the process.

http://www.math.uah.edu/stat/bernoulli/Introduction.html
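The three assumptions above can be illustrated with a quick simulation (my own sketch, not from the slides): draw n independent trials with success parameter p and check that the observed success fraction lands near p.

```python
import random

# Simulate n independent Bernoulli trials with success parameter p.
# The seed is arbitrary and only makes the run reproducible.

random.seed(42)
p, n = 0.3, 10_000
trials = [1 if random.random() < p else 0 for _ in range(n)]

print(sum(trials) / n)  # close to p = 0.3
```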

6

Bernoulli Trials

The random variable X in a Bernoulli trial is defined as follows:
X=1, if the outcome is a success (S), and X=0, if the outcome is a failure (F).

Let the probability of success be p. Then the probability of failure is (1–p).

The probability mass function of X is: f(x) = p^x (1 − p)^(1−x), for x = 0 or 1

The mean is: μ = E(X) = 0·(1 − p) + 1·p = p

The variance is: σ² = E(X²) − μ² = p − p² = p(1 − p)
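The Bernoulli PMF, mean, and variance can be written out directly from these definitions; p = 0.25 below is an arbitrary choice of mine:

```python
# Bernoulli PMF, mean, and variance from the definitions.

p = 0.25

def bern_pmf(x, p):
    """f(x) = p^x (1 - p)^(1 - x), for x = 0 or 1."""
    return p**x * (1 - p) ** (1 - x)

mean = 0 * bern_pmf(0, p) + 1 * bern_pmf(1, p)                 # = p
var = 0**2 * bern_pmf(0, p) + 1**2 * bern_pmf(1, p) - mean**2  # = p - p^2

print(mean)  # 0.25
print(var)   # 0.1875, i.e. p(1 - p)
```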

7

Binomial Distribution

Let X_i = 1 if the ith Bernoulli trial is successful, 0 otherwise.

If X = X_1 + X_2 + … + X_n, where the X_i's are independent and identically distributed (iid), then X has a binomial distribution. The binomial random variable X counts the number of successes in n Bernoulli trials.

The distribution of X is described by the two parameters, n (the number of trials) and p (the success probability).

P(Event) = (Number of ways event can occur) * P(One occurrence).

The probability mass function, f(x), of the binomial random variable X is:
f(x) = C(n, x) p^x (1 − p)^(n−x), for x = 0, 1, …, n

The sum of all the probability values in the distribution is:
Σ f(x) = [p + (1 − p)]^n = 1, by the binomial theorem
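The binomial PMF can be sketched in a few lines, together with a check that its n + 1 probabilities sum to 1 (n = 8 and p = 0.4 are arbitrary values of mine):

```python
import math

# Binomial PMF and a sum-to-one check.

def binom_pmf(x, n, p):
    """f(x) = C(n, x) p^x (1 - p)^(n - x), for x = 0, 1, ..., n."""
    return math.comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 8, 0.4
total = sum(binom_pmf(x, n, p) for x in range(n + 1))

print(binom_pmf(3, n, p))  # ≈ 0.2787, P(3 successes in 8 trials)
print(total)               # ≈ 1.0
```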

8

Binomial Distribution

The mean of the binomial distribution is:
μ = E(X) = np

The variance of the binomial distribution is:
σ² = np(1 − p)
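These closed forms can be checked by applying the general discrete-distribution formulas from the first slide to the binomial PMF; n = 8 and p = 0.4 below are arbitrary values of mine:

```python
import math

# Verify mu = np and sigma^2 = np(1 - p) by direct summation.

n, p = 8, 0.4
pmf = [math.comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]

mean = sum(x * f for x, f in enumerate(pmf))               # sum of x f(x)
var = sum((x - mean) ** 2 * f for x, f in enumerate(pmf))  # sum of (x - mu)^2 f(x)

print(mean, n * p)           # both ≈ 3.2
print(var, n * p * (1 - p))  # both ≈ 1.92
```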

9
