CMSC 27100 — Lecture 5b

The notes for this course began from a series originally written by Tim Ng, with extensions by David Cash and Robert Rand. I have modified them to follow our course.

Basic Counting

This section marks the start of the real body of Unit 2: Counting and Probability. This week, our lecture focuses on counting. We'll talk about how that relates to probability, and how to connect the two, starting next lecture.

I'm hoping that you, coming into this course, have already learned how to count at some point in your life (probably, in fact, before you even remember). If you can't tell me what the first number is, or what the second number is, or how to go from a number to the next one, this lecture will be a bit of a challenge. For better or for worse, it turns out that there aren't that many ways to count, and as a result there aren't that many things that we know how to count. At least, not directly.

In this unit, when we're talking about counting, what we're really focused on is ascertaining the sizes of certain well-defined but not necessarily easily enumerable sets. Suppose I have a bin with $10$ balls, labeled $1$ to $10$. If I pull one out, how many different outcomes are possible?

Of course, the answer is $10$, but let's walk through it mathematically: we can denote by $S$ the set of outcomes, so $$S = \{\text{Labels when pulling a ball out of a bin of $10$ balls labeled } 1 \text{ to } 10\}.$$

Clearly, we are interested in $|S|$. Of course we already know what $|S|$ is, but if we note that $$S = \{1, 2, 3, 4, 5, 6, 7, 8, 9, 10\},$$ then suddenly it's even more clear that $S$ has $10$ objects, so $|S| = 10$.

This may seem a bit silly, but I want you to think about how you count things in the real world. Say there's a set of balls in a bin, and you'd like to count how many there are. When we think about the process of counting, it tends to look like keeping a running tally of numbers in your head, and incrementing by $1$ for each new ball. You may, for example, take balls out of the bin one at a time, increasing the count in your head for each ball. By the end, if the last ball is out of the bin and your running total is at $10$, you know there are ten balls. Really, then, what you've done is you've enumerated the balls, assigning each one a number, so that by the end of the process the largest number is the size of the set. In that way, counting things in the real world is no different from what we've just done here! At its core, we can think of basic counting as enumerating objects in a set, and pulling out the maximal number.

Let's try a slightly more complex set, like the following: $$S = \{11, 14, 17, 20, 23, ..., 74\}.$$ At first glance, it isn't obvious how you'd count it. Instead, let's make a couple of transformations to each element in the set. First, we can subtract $8$ from each element: $$S' = \{3, 6, 9, 12, 15, ..., 66\}.$$ Then, we divide each element by $3$: $$S'' = \{1, 2, 3, 4, 5, ..., 22\}.$$ Because neither of our operations changed the size of the set, and we know $|S''| = 22$, it's clear that $|S|=22$ as well.

Of course, you may have a different way to count the previous set as well! For example, you may subtract $74-11 = 63$, and know that each number is $3$ away from the previous one, which tells you that $74$ is $21$ steps away from $11$, for a total of $22$ numbers. In fact, that's the point of this problem, and the overarching skill we're trying to develop in this unit. We really only have very few ways to count things, so this unit is about recognizing how to change the way we view the problem to make it easier for us to digest. The set $S$ and the set $S''$ are of course of the same size, but $S''$ is much, much easier for us to count.
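If you want to see the transformation idea in code, here is a small Python sketch (not part of the notes themselves) that builds the set and applies the two size-preserving transformations:

```python
# Build S = {11, 14, 17, ..., 74} and count it by transforming its elements.
S = set(range(11, 75, 3))         # the original set
S1 = {x - 8 for x in S}           # subtract 8 from each element: {3, 6, ..., 66}
S2 = {x // 3 for x in S1}         # divide each element by 3: {1, 2, ..., 22}

print(len(S), len(S1), len(S2))   # 22 22 22 -- the size never changes
```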

Counting Rules

There are a couple of rules we leverage heavily when counting to make it easier for us to deal with larger and more complex sets. Both of these rules should be pretty intuitive, but it's good to be aware of them and to be confident that they are correct.

Suppose you are ordering an ice cream cone. There are three flavors of ice cream (chocolate, vanilla, strawberry), and two types of cones (cake and waffle). How many different orders are possible?

Intuitively, it's $3\cdot 2 = 6$.

To see this in an organized way, you can draw a tree, like in Figure 1.4 on page 10 of [BH].

If $A_1, A_2, \dots, A_k$ are finite sets, then $$|A_1 \times A_2 \times \cdots \times A_k| = |A_1| \cdot |A_2| \cdots |A_k|.$$

Suppose I have a bin with balls labeled $1$ to $10$, and another with balls labeled $11$ to $15$. How many outcomes are possible if I draw a ball out of each bin?

The set of outcomes for bin $1$ is $B_1 = \{1, 2, ..., 10\}$, and the set of outcomes for bin $2$ is $B_2 = \{11, 12, ..., 15\}$, so the set of total outcomes is $B_1 \times B_2$. By the multiplication rule, we learn that $|B_1 \times B_2| = |B_1| \cdot |B_2| = 10 \cdot 5 = \boxed{50}$.
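If you want to check a count like this by brute force, here is a quick Python sketch (purely illustrative) that enumerates the Cartesian product directly:

```python
from itertools import product

B1 = range(1, 11)   # balls labeled 1 to 10
B2 = range(11, 16)  # balls labeled 11 to 15

# The multiplication rule: |B1 x B2| = |B1| * |B2| = 10 * 5 = 50.
print(len(list(product(B1, B2))))  # 50
```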

The multiplication rule is used when we want to count the ways that multiple events occur. We can think of it as corresponding to the AND connective. For example, in the example above, we found the number of ways that a ball from $B_1$ is drawn AND a ball from $B_2$ is drawn. If, instead, we care about the number of outcomes where either a ball from $B_1$ is drawn OR a ball from $B_2$ is drawn, we can use the addition rule. The rule formalizes something very intuitive: you can count each of the disjoint sets in the union, and then add up their sizes.

If $A_1, A_2, \dots, A_k$ are disjoint finite sets, then $$|A_1 \cup A_2 \cup \cdots \cup A_k| = |A_1| + |A_2| + \cdots + |A_k|.$$

Suppose I have a bin with balls labeled $1$ to $10$, and another with balls labeled $11$ to $15$. How many outcomes are possible if I draw a ball out of one of the bins?

The set of outcomes for bin $1$ is $B_1 = \{1, 2, ..., 10\}$, and the set of outcomes for bin $2$ is $B_2 = \{11, 12, ..., 15\}$, so the set of total outcomes is $B_1 \cup B_2$. By the addition rule, we learn that $|B_1 \cup B_2| = |B_1| + |B_2| = 10 + 5 = \boxed{15}$.
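The same kind of brute-force check works for the addition rule; again, this snippet is just an illustration, and it relies on the two label sets being disjoint.

```python
B1 = set(range(1, 11))    # balls labeled 1 to 10
B2 = set(range(11, 16))   # balls labeled 11 to 15

# Because B1 and B2 are disjoint, the size of the union is the sum of the sizes.
print(len(B1 | B2))       # 10 + 5 = 15
```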

Counting with Decision Processes

The multiplication rule can be generalized to solve harder problems. It sometimes takes a little (or a lot) of cleverness, but one can count many sets by constructing a decision process for selecting an element of the set. A decision process is nothing more than a sequence of decisions that leads to a particular outcome, analogous to ordering a drink, and then a main course, and then a dessert at a restaurant. If the stages of the decision process have $d_1, d_2, \ldots, d_k$ options respectively, then the tree of possible decisions has $d_1 \cdot d_2 \cdots d_k$ leaves, and hence the set we're trying to count has $d_1 \cdot d_2 \cdots d_k$ elements.

To be a bit more precise, we state the principle as the following theorem.

Suppose a decision process with $k$ steps produces a tree with the following properties:

  1. At each level $i$, all nodes have the same number $d_i$ of branches leading down to the next level.
  2. Each outcome appears on exactly one leaf.
Then the total number of outcomes is $d_1\cdot d_2 \cdots d_{k}$.

We'll solve a lot of problems using this approach, and also give examples of how it can go wrong.

We start with some examples.

How many two-digit strings are there whose digits are different?

We create a decision process for this as follows:

  1. Select the first digit.
  2. Select the second digit to be different from the first.
Note that we have $10$ options for the first decision, and $9$ options for the second decision. That leaves us with $10 \cdot 9 = \boxed{90}$ options overall.

How many two-digit numbers are there whose digits are different?

We create a decision process for this as follows:

  1. Select the first digit.
  2. Select the second digit to be different from the first.
This time, note that the first digit can't be $0$ because it's a two-digit number, so there are $9$ possibilities for the first digit. The second digit can be any except for the one picked in the first decision. Thus, $d_1$ has $9$ possibilities, and $d_2$ has $9$ possibilities, for a total of $d_1 \cdot d_2 = \boxed{81}$.
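Both of the two-digit counts above are easy to verify by brute force; the following snippet (purely illustrative) enumerates all candidates and filters out the ones with repeated digits.

```python
# Two-digit strings with distinct digits: a leading zero is allowed.
strings = [a + b for a in "0123456789" for b in "0123456789" if a != b]
print(len(strings))  # 10 * 9 = 90

# Two-digit numbers with distinct digits: the first digit cannot be 0.
numbers = [n for n in range(10, 100) if n // 10 != n % 10]
print(len(numbers))  # 9 * 9 = 81
```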

How many ways are there to arrange $6$ people into a line?

We create a decision process for this as follows:

  1. Select the person to be first in line.
  2. Select the person to be second in line.
  3. ...
  4. Select the person to be sixth in line.
The first decision has $6$ possibilities, the second has $6-1=5$ (because the person who is first cannot also be second), and so on, so the total number of possibilities is $6 \cdot 5 \cdot 4 \cdots 1$.
The product of descending integers comes up so frequently in counting that we have notation for it:

The decreasing product of positive integers from some $n$ down to $1$ is referred to as $n$ factorial, written $n!$, with the factorial denoted by the exclamation point. For example, $$5! = 5 \cdot 4 \cdot 3 \cdot 2 \cdot 1.$$ For notational reasons, it is also convenient to define $0!=1$.
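As a small aside (not part of the notes themselves), here is what the factorial looks like as code; Python also provides it as math.factorial, and both agree that there are $720$ ways to line up six people.

```python
import math

def factorial(n):
    """Compute n! = n * (n-1) * ... * 1, with the convention 0! = 1."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

print(factorial(6), math.factorial(6))  # 720 720 -- ways to line up 6 people
```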

Prove that for any finite set $S$, $|\mathcal{P}(S)| = 2^{|S|}$.

If we denote $S=\{s_1, s_2, ..., s_n\}$, we note that the power set consists of every possible subset of $S$. To construct any subset of $S$, we can use the following decision process:

  1. Decide whether or not $s_1$ is included in the subset.
  2. Decide whether or not $s_2$ is included in the subset.
  3. ...
  4. Decide whether or not $s_n$ is included in the subset.
Then we have $n$ total decisions to make, each of which has two possible choices, for a total of $2^n = 2^{|S|}$ possibilities.
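The proof's decision process translates directly into code. The sketch below (my own illustration; the function name power_set is just for this example) makes the binary include-or-exclude choice for each element and confirms that a $4$-element set has $2^4 = 16$ subsets.

```python
from itertools import product

def power_set(elements):
    """Build every subset by deciding, for each element, whether to include it."""
    subsets = []
    for choices in product([False, True], repeat=len(elements)):
        subsets.append({e for e, keep in zip(elements, choices) if keep})
    return subsets

S = ["a", "b", "c", "d"]
print(len(power_set(S)), 2 ** len(S))  # 16 16
```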

Counting Subsets of a Given Size

Let's try a new one. Say I have my same bin of $10$ labeled balls, but I want to pull out three at once. In other words, I don't really care about the order of the three balls I'm getting - I just want to know which three I got.

Using what we've learned so far, we might try to make another decision process. We could try to select the first ball ($10$ options) and then select the second ball ($9$ options), and finally select the third ball ($8$ options). This may seem tempting, but it's actually incorrect!

Let's take a closer look at what happened: because we only care about the balls we drew and not the order we drew them in, this decision process actually counts the same outcome multiple times! For example, in one run I may draw $4$, then $7$, then $1$, but in a different run I may draw $1$, then $4$, then $7$. These are the same outcome if I don't care about the order I drew them in, but the decision process doesn't account for that! In fact, for each $3$-ball group, we can use a decision process to find that there are $3 \cdot 2 \cdot 1$ different orderings in which we can get the same outcome. Knowing that, we can simply divide the number of outcomes we have found through our decision process by $6$, giving us a final answer of $(10 \cdot 9 \cdot 8) /(3 \cdot 2 \cdot 1)$.
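To see the overcounting correction concretely, here is a brute-force comparison (just a sketch, not part of the argument) of ordered draws versus unordered groups:

```python
from itertools import combinations, permutations

balls = range(1, 11)

ordered = len(list(permutations(balls, 3)))    # 10 * 9 * 8 = 720 ordered draws
unordered = len(list(combinations(balls, 3)))  # 3-ball groups, order ignored

print(ordered, unordered, ordered // (3 * 2 * 1))  # 720 120 120
```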

It is often the case that we are interested in how many subsets of size $k$ exist in a set of size $n$. So often, in fact, that we will introduce special notation for it:

Let $n,k$ be integers with $n\geq 0$. We define the notation $\binom{n}{k}$ (pronounced "$n$ choose $k$") to be the number of subsets of size $k$ contained in a set of size $n$. More formally, $$ \binom{n}{k} = \left|\{ S \subseteq \{1,\ldots,n\} \ : \ |S| = k\}\right|. $$ These numbers are called binomial coefficients.

Binomial coefficients are useful in a great many ways in combinatorics and other areas of math. Observe that if $k$ is negative or greater than $n$, then $\binom{n}{k}=0$ since there are never sets of those sizes. When $k$ is between $0$ and $n$, we have the following formula that generalizes the pattern above:

Let $n,k$ be integers with $n\geq 0$ and $0 \leq k \leq n$. Then $$ \binom{n}{k} = \frac{n!}{(n-k)!\cdot k!}. $$ (We use the convention $0!=1$ in the notation above.)

This will be our first example of a combinatorial proof, which is a perfectly rigorous way to establish formulas like this, but it might at first feel fast and loose compared to more concrete proofs. (Indeed, you can prove this via some sort of induction if you prefer.)

The number $n!/(n-k)!$ counts the ways to pick $k$ elements from $\{1,\ldots,n\}$ with order. Amongst these, each subset of size $k$ will have its elements listed exactly $k!$ times. Therefore $$ k!\cdot \binom{n}{k} = \frac{n!}{(n-k)!}, $$ and dividing both sides by $k!$ gives the claimed formula. It's worth noticing that this argument works when $n=0$ and/or when $k=0$ or $k=n$.

We have actually proved something non-obvious: for all $n$ and $k$ in the theorem, $n!$ is divisible by $(n-k)!\cdot k!$. A priori it was not clear why this should be the case.

Suppose we want to pick the top three of Taylor Swift's 274 songs (as of July 2025, at least). If we're not ordering those songs as 1st/2nd/3rd, then there are $$\binom{274}{3} = \frac{274!}{(274-3)!3!} = \frac{274\cdot 273\cdot 272}{3\cdot 2 \cdot 1} = 3391024 $$ ways to make this choice. Factorials and binomial coefficients get big very quickly.
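(If you'd rather not multiply that out by hand, Python's math.comb computes binomial coefficients directly; the second line below checks it against the factorial formula from the theorem. This is just a side check, not part of the argument.)

```python
import math

print(math.comb(274, 3))                                                  # 3391024
print(math.factorial(274) // (math.factorial(271) * math.factorial(3)))  # 3391024
```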

The choose function does not have to be used on its own either - it can often show up as one step of a decision process.

How many "words" can we make from the letters in LALALALA? It has $8$ letters, but the answer is not $8!$ since some of those arrangements give the same word (e.g. swapping the A's around won't change the word).

We can create a decision process for this problem as follows: Begin with eight blanks, and then

  1. Select four positions and put L's in them.
  2. Put A's in the remaining positions.

There are $d_1 = \binom{8}{4}$ ways to make the first choice, and $d_2 = 1$ way to make the second choice. The answer is thus $d_1\cdot d_2 = \binom{8}{4}$.

We could have selected the positions for the A's first without changing our answer.

Let's do the same for the word BANANA. The decision process begins with $6$ blanks, and is:

  1. Select three positions and put A's in them.
  2. Of the remaining three positions, select two and put N's in them.
  3. Put a B in the remaining position.
We have $d_1 = \binom{6}{3}$, $d_2 = \binom{3}{2}$, and $d_3 = 1$. The answer is $\binom{6}{3}\binom{3}{2}$. Once again, we can change the order of the decisions and get the same number.
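Both word counts are small enough to double-check by brute force over the distinct rearrangements of the letters; the snippet below is only an illustration of that check.

```python
from itertools import permutations
from math import comb

# Count the distinct rearrangements of each word directly.
lalalala = len(set(permutations("LALALALA")))
banana = len(set(permutations("BANANA")))

print(lalalala, comb(8, 4))              # 70 70
print(banana, comb(6, 3) * comb(3, 2))   # 60 60
```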