## Introduction to Probability Theory

## Probability of an Event

The probability of a compound event described by the word "and" is the product of the probabilities of the simple events, provided the simple events are independent. To be independent, two events cannot influence each other. For example, as long as one is willing to assume that the event of the quarterback remaining healthy and the event of the linemen all passing are independent, then the probability of winning the conference football championship can be calculated by multiplying the probabilities of each of the separate events together.

For example, if the probability of the quarterback remaining healthy and the probability of the linemen all passing are both known, the probability of both occurring is their product. This relationship can be written in symbols as follows:

P(A and B) = P(A) × P(B)

If the compound event can be described by two or more events joined by the word "or", then the probability of the compound event is the sum of the probabilities of the individual events minus the probability of the joint event.

For example, the probability of all the linemen passing would be the sum of the probability of all studying very hard plus the probability of all being very lucky, minus the probability of all studying very hard and all being very lucky.
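The two rules above can be sketched in a few lines of Python; every probability below is a made-up number for illustration, not a value from the example.

```python
# "And" rule for independent events: multiply the simple probabilities.
p_healthy = 0.8       # assumed P(quarterback remains healthy)
p_linemen_pass = 0.7  # assumed P(all linemen pass)
p_championship = p_healthy * p_linemen_pass

# "Or" rule: add the simple probabilities, then subtract the joint
# probability so outcomes where both events occur are not counted twice.
p_study = 0.6   # assumed P(all study very hard)
p_lucky = 0.2   # assumed P(all are very lucky)
p_both = 0.12   # assumed P(all study very hard and all are very lucky)
p_pass = p_study + p_lucky - p_both

print(round(p_championship, 2))  # 0.56
print(round(p_pass, 2))          # 0.68
```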


For example, suppose that the probability of all studying very hard, the probability of all being very lucky, and the probability of both are known. The probability of all passing would then be the first plus the second minus the third. In general the relationship can be written as follows:

P(A or B) = P(A) + P(B) − P(A and B)

A conditional probability is the probability of an event given that another event is true. The probability that the quarterback will remain healthy given that he stretches properly at practice and before game time would be a conditional probability.

By definition a conditional probability is the probability of the joint event divided by the probability of the conditional event. In the previous example, the probability that the quarterback will remain healthy given that he stretches properly at practice and before game time would be the probability of the quarterback both remaining healthy and stretching properly divided by the probability of stretching properly.
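This definition can be sketched directly; both input probabilities below are assumed values for illustration.

```python
# Conditional probability: P(healthy | stretches) =
#   P(healthy and stretches) / P(stretches)
p_stretch = 0.9               # assumed P(stretches properly)
p_healthy_and_stretch = 0.72  # assumed P(remains healthy and stretches properly)

p_healthy_given_stretch = p_healthy_and_stretch / p_stretch
print(round(p_healthy_given_stretch, 2))  # 0.8
```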

Suppose the probability of stretching properly and the joint probability of remaining healthy and stretching properly are both known.

The conditional probability of remaining healthy given that he stretched properly would then be the ratio of the two. The "given" is written in probability theory as a vertical line, so that the preceding could be written as:

P(Healthy | Stretches) = P(Healthy and Stretches) / P(Stretches)

Conditional probabilities can be combined into a very useful formula called Bayes's Rule. This equation describes how to modify a probability given information in the form of conditional probabilities. The equation is presented in the following:

P(A | B) = P(B | A) × P(A) / [P(B | A) × P(A) + P(B | not A) × P(not A)]

Suppose that an instructor randomly picks a student from a class where males outnumber females two to one.


What is the probability that the selected student is a female? Since males outnumber females two to one, this probability is 1/3. This probability is called the prior probability and would be represented in the equation above as P(A). Suppose additional information was provided about the selected student: the shoe size of the person selected was 7. Often it is possible to compute the conditional probability of B given A, in this case the probability of a size 7 shoe given that the student is a female.


In like manner, the probability of B given not A can often be calculated, in this case the probability of a size 7 shoe given that the student is a male. Suppose both of these conditional probabilities are known. The likelihood of the person being a female given a shoe size of 7 could then be computed with Bayes's Rule. The value of P(A | B) is called a posterior probability, in this case the probability of the student being a female given a shoe size of 7. The ability to recompute probabilities based on data is the foundation of a branch of statistics called Bayesian Statistics. This set of rules barely scratches the surface when considering the possibilities of probability models.
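A minimal sketch of the posterior computation, assuming invented values for the two shoe-size likelihoods (only the two-to-one class ratio comes from the example):

```python
# Bayes's Rule: P(A | B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|not A)P(not A)]
p_female = 1 / 3              # prior P(A): 1 of every 3 students is female
p_male = 1 - p_female         # P(not A)
p_size_given_female = 0.30    # assumed P(B | A): size 7 given female
p_size_given_male = 0.05      # assumed P(B | not A): size 7 given male

numerator = p_size_given_female * p_female
posterior = numerator / (numerator + p_size_given_male * p_male)
print(round(posterior, 2))  # 0.75
```

With these assumed numbers, the size-7 observation raises the probability of the student being female from the prior of 1/3 to a posterior of 0.75.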

The interested reader is pointed to any number of more thorough treatments of the topic. Including cost as a factor in the equation can extend the usefulness of probabilities as an aid in decision-making.


This is the case in a branch of statistics called utility theory, which includes a concept called utility in the equation. Utility is the gain or loss experienced by a player depending upon the outcome of the game and can be symbolized with a "U". Usually utility is expressed in monetary units, although there is no requirement that it must be. The symbol U(A) would be the utility of outcome A to the player of the game. A concept called expected utility would be the average result of playing the game an infinite number of times.

In its simplest form, expected utility is a sum of the products of probabilities and utilities:

E(U) = P(A1) × U(A1) + P(A2) × U(A2) + …

Suppose someone was offered a chance to play a game with two dice. Should the player consider the game? Using expected utility analysis, the expected utility would be the sum, over all possible outcomes, of the probability of each outcome times its payoff. If the expected utility is less than 0, indicating a loss over the long run, expected utility theory would argue against playing the game. Again, this illustration just barely scratches the surface of a very complex and interesting area of study, and the reader is directed to other sources for further study.
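The expected-utility sum can be sketched for a simpler assumed game: a single fair die, winning $10 on a six and losing $3 otherwise. This game and its payoffs are invented for illustration; they are not the two-dice game in the text.

```python
# Expected utility: sum of probability × utility over all outcomes.
outcomes = [(1 / 6, 10.0),  # roll a six: win $10
            (5 / 6, -3.0)]  # any other face: lose $3
expected_utility = sum(p * u for p, u in outcomes)
print(round(expected_utility, 2))  # -0.83: a long-run loss, so don't play
```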

In particular, the area of game theory holds a great deal of promise. You should be aware that the preceding analysis of whether or not to play a given game based on expected utility assumes that the dice are "fair", that is, that each face is equally likely. To the extent the fairness assumption is incorrect, for example with weighted dice, the theoretical analysis will also be incorrect.

Going back to the original definition of probabilities, that of a relative frequency given an infinite number of repetitions, it is never possible to "know" the probability of any event exactly. Does this mean that all the preceding is useless?


Absolutely not! It does mean, however, that probability theory and probability models must be viewed within the larger framework of model-building in science. The "laws" of probability are a formal language model of the world that, like algebra and numbers, exist as symbols and relationships between symbols. They have no meaning in and of themselves and belong in the circled portion of the model-building paradigm.

As with numbers and algebraic operators, the symbols within the language must be given meaning before the models become useful. In this case "interpretation" implies that numbers are assigned to probabilities based on rules. The circled part of the following figure illustrates the portion of the model-building process that now becomes critical. There are a number of different ways to estimate probabilities. Each has advantages and disadvantages and some have proven more useful than others.

Just because a number can be assigned to a given probability symbol, however, does not mean that the number is the "true" probability. When there is no reason to believe that any outcome is more or less likely than any other outcome, then the solution is to assign all outcomes an equal probability. For example, since there is no reason to believe that a heads is more likely than a tails, a value of .5 is assigned to each. Note that this system does not work when there is reason to believe that one outcome is more likely than another.

For example, setting a probability of .5 for snow outside my office within the hour is not reasonable. There are two alternatives, it will either be snowing or it won't, but equal probabilities are not tenable because it is sunny and 60 degrees outside my office right now, and I have reason to believe that it will not be snowing in an hour. The relative frequency of an event in the past can be used as an estimate of its probability.

For example, the probability of a student succeeding in a given graduate program could be calculated by dividing the number of students actually finishing the program by the number of students admitted in the past. Establishing probabilities in this fashion assumes that conditions in the past will continue into the future, generally a fairly safe bet. The greater the number of observations, the more stable the estimate based on relative frequency. For example, the probability of a heads for a given coin could be calculated by dividing the number of heads by the number of tosses.

An estimate based on 10,000 tosses would be much better than one based on 10 tosses. The probability of snow outside in an hour could be calculated by dividing the number of times in the past that it has snowed when the temperature an hour before was 60 degrees by the number of times it has been 60 degrees. Since I don't have accurate records of such events, I would have to rely on memory to estimate the relative frequency. Since memory seems to work better for outstanding events, I am more likely to remember the few times it did snow in contrast to the many times it did not.
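The stability of relative-frequency estimates can be sketched with a quick simulation of a coin assumed to be fair:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def estimate_heads(n_tosses):
    """Relative-frequency estimate of P(heads) from n simulated fair tosses."""
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

print(estimate_heads(10))      # small sample: estimate may be far from 0.5
print(estimate_heads(10_000))  # large sample: estimate settles near 0.5
```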

The problems with using relative frequency were discussed in some detail in Chapter 5, "Frequency Distributions." The problem is that unless a very large sample of women's shoe sizes is taken, the relative frequency of any one shoe size is unstable and inaccurate.