3.1 Asymptotic notation
3.1-1
Let $f(n)$ and $g(n)$ be asymptotically nonnegative functions. Using the basic definition of $\Theta$-notation, prove that $\max(f(n), g(n)) = \Theta(f(n) + g(n))$.
First, let's clarify what the function $\max(f(n), g(n))$ is by defining $h(n) = \max(f(n), g(n))$. Then
$$ h(n) = \begin{cases} f(n) & \text{ if } f(n) \ge g(n), \\ g(n) & \text{ if } f(n) < g(n). \end{cases} $$
Since $f(n)$ and $g(n)$ are asymptotically nonnegative, there exists $n_0$ such that $f(n) \ge 0$ and $g(n) \ge 0$ for all $n \ge n_0$. Thus for $n \ge n_0$, $f(n) + g(n) \ge f(n) \ge 0$ and $f(n) + g(n) \ge g(n) \ge 0$. Since for any particular $n$, $h(n)$ is either $f(n)$ or $g(n)$, we have $f(n) + g(n) \ge h(n) \ge 0$, which shows that
$$h(n) = \max(f(n), g(n)) \le c_2(f(n) + g(n))$$
for all $n \ge n_0$ (with $c_2 = 1$ in the definition of $\Theta$).
Similarly, since for any particular $n$, $h(n)$ is the larger of $f(n)$ and $g(n)$, we have for all $n \ge n_0$, $0 \le f(n) \le h(n)$ and $0 \le g(n) \le h(n)$. Adding these two inequalities yields $0 \le f(n) + g(n) \le 2h(n)$, or equivalently $0 \le (f(n) + g(n)) / 2 \le h(n)$, which shows that
$$h(n) = \max(f(n), g(n)) \ge c_1(f(n) + g(n))$$
for all $n \ge n_0$ (with $c_1 = 1 / 2$ in the definition of $\Theta$).
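As a quick numeric sanity check (not a substitute for the proof), the sketch below picks two arbitrary asymptotically nonnegative functions and verifies that $\frac{1}{2}(f(n) + g(n)) \le \max(f(n), g(n)) \le f(n) + g(n)$ over a range of values; the particular $f$ and $g$ are illustrative assumptions, not part of the exercise.

```python
# Sanity check of (f(n) + g(n)) / 2 <= max(f(n), g(n)) <= f(n) + g(n).
# The concrete f and g below are illustrative choices only.
f = lambda n: 3 * n ** 2 + 5    # asymptotically nonnegative
g = lambda n: 10 * n + 100      # asymptotically nonnegative

for n in range(1, 1000):
    h = max(f(n), g(n))
    assert (f(n) + g(n)) / 2 <= h <= f(n) + g(n), n
print("bounds hold with c1 = 1/2 and c2 = 1 on the tested range")
```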
3.1-2
Show that for any real constants $a$ and $b$, where $b > 0$,
$$(n + a)^b = \Theta(n^b). \tag{3.2}$$
To show that $(n + a)^b = \Theta(n^b)$, we want to find constants $c_1, c_2, n_0 > 0$ such that $0 \le c_1 n^b \le (n + a)^b \le c_2 n^b$ for all $n \ge n_0$.
Note that
$$ \begin{aligned} n + a & \le n + |a| & \\ & \le 2n & \text{ when } |a| \le n, \end{aligned} $$
and
$$ \begin{aligned} n + a & \ge n - |a| & \\ & \ge \frac{1}{2}n & \text{ when } |a| \le \frac{1}{2}n. \end{aligned} $$
Thus, when $n \ge 2|a|$,
$$0 \le \frac{1}{2}n \le n + a \le 2n.$$
Since $b > 0$, the inequality still holds when all parts are raised to the power $b$:
$$ \begin{aligned} 0 \le \Big(\frac{1}{2}n\Big)^b & \le (n + a)^b \le (2n)^b, \\ 0 \le \Big(\frac{1}{2}\Big)^b n^b & \le (n + a)^b \le 2^b n^b. \end{aligned} $$
Thus, $c_1 = (1 / 2)^b$, $c_2 = 2^b$, and $n_0 = 2|a|$ satisfy the definition.
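The constants can also be checked numerically; in the sketch below, the pairs $(a, b)$ are arbitrary test cases chosen for illustration.

```python
# Check 0 <= (n/2)^b <= (n + a)^b <= (2n)^b for n >= 2|a|, mirroring the
# derivation above. The pairs (a, b) are arbitrary illustrative test cases.
for a, b in [(-7, 0.5), (3, 2.0), (100, 1.5)]:
    n0 = 2 * abs(a)
    for n in range(n0, n0 + 500):
        assert 0 <= (n / 2) ** b <= (n + a) ** b <= (2 * n) ** b, (a, b, n)
print("c1 = (1/2)^b, c2 = 2^b, n0 = 2|a| hold on the tested ranges")
```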
3.1-3
Explain why the statement, "The running time of algorithm $A$ is at least $O(n^2)$," is meaningless.
Let the running time be $T(n)$. $T(n) \ge O(n^2)$ means that $T(n) \ge f(n)$ for some function $f(n)$ in the set $O(n^2)$. This statement holds for any running time $T(n)$, since the function $g(n) = 0$ for all $n$ is in $O(n^2)$, and running times are always nonnegative. Thus, the statement tells us nothing about the running time.
3.1-4
Is $2^{n + 1} = O(2^n)$? Is $2^{2n} = O(2^n)$?
$2^{n + 1} = O(2^n)$, but $2^{2n} \ne O(2^n)$.
To show that $2^{n + 1} = O(2^n)$, we must find constants $c, n_0 > 0$ such that
$$0 \le 2^{n + 1} \le c \cdot 2^n \text{ for all } n \ge n_0.$$
Since $2^{n + 1} = 2 \cdot 2^n$ for all $n$, we can satisfy the definition with $c = 2$ and $n_0 = 1$.
To show that $2^{2n} \ne O(2^n)$, assume there exist constants $c, n_0 > 0$ such that
$$0 \le 2^{2n} \le c \cdot 2^n \text{ for all } n \ge n_0.$$
Then $2^{2n} = 2^n \cdot 2^n \le c \cdot 2^n \Rightarrow 2^n \le c$. But no constant is greater than all $2^n$, and so the assumption leads to a contradiction.
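A quick numerical illustration of the difference: the ratio $2^{n+1}/2^n$ is always the constant $2$, while $2^{2n}/2^n = 2^n$ exceeds any fixed constant $c$ once $n$ is large enough, matching the contradiction above. (The specific values of $n$ below are arbitrary.)

```python
# Ratios against 2^n: 2^(n+1)/2^n is the constant 2, while 2^(2n)/2^n = 2^n
# grows without bound, so no constant c can satisfy 2^(2n) <= c * 2^n for all n.
for n in (1, 10, 20, 40):
    print(n, 2 ** (n + 1) // 2 ** n, 2 ** (2 * n) // 2 ** n)
# the second column is always 2; the third column is 2^n, which keeps growing
```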
3.1-5
Prove Theorem 3.1.
The theorem states:
For any two functions $f(n)$ and $g(n)$, we have $f(n) = \Theta(g(n))$ if and only if $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$.
From $f(n) = \Theta(g(n))$, we have that there exist positive constants $c_1$, $c_2$, and $n_0$ such that
$$0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0.$$
Taking $c = c_2$ with the same $n_0$ satisfies the definition of $O$, and taking $c = c_1$ with the same $n_0$ satisfies the definition of $\Omega$, so $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$ both hold.
From $f(n) = \Omega(g(n))$ and $f(n) = O(g(n))$, we have that
$$ \begin{aligned} & 0 \le c_3g(n) \le f(n) & \text{ for all } n \ge n_1 \\ \text{and } & 0 \le f(n) \le c_4g(n) & \text{ for all } n \ge n_2. \end{aligned} $$
If we let $n_3 = \max(n_1, n_2)$ and merge the inequalities, we get
$$0 \le c_3g(n) \le f(n) \le c_4g(n) \text{ for all } n \ge n_3,$$
which is precisely the definition of $f(n) = \Theta(g(n))$.
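For a concrete illustration of passing the constants back and forth, the sketch below uses the arbitrary choices $f(n) = 3n^2 + 10$ and $g(n) = n^2$: the $\Theta$ constants $c_1 = 3$, $c_2 = 4$, $n_0 = 4$ also serve, separately, as the $\Omega$ and $O$ constants.

```python
# Illustrative f and g (not from the text): the Theta constants double as the
# Omega constant (c1) and the O constant (c2) with the same n0.
f = lambda n: 3 * n ** 2 + 10
g = lambda n: n ** 2
c1, c2, n0 = 3, 4, 4

for n in range(n0, 1000):
    assert 0 <= c1 * g(n) <= f(n)   # the Omega(g(n)) half
    assert f(n) <= c2 * g(n)        # the O(g(n)) half
print("both halves of the Theta bound hold for n >=", n0)
```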
3.1-6
Prove that the running time of an algorithm is $\Theta(g(n))$ if and only if its worst-case running time is $O(g(n))$ and its best-case running time is $\Omega(g(n))$.
Let $T_w(n)$ be the worst-case running time and $T_b(n)$ the best-case running time. If $T_b(n) = \Omega(g(n))$ and $T_w(n) = O(g(n))$, then there exist positive constants $c_1$, $c_2$, $n_b$, and $n_w$ such that
$$ \begin{aligned} & 0 \le c_1g(n) \le T_b(n) & \text{ for all } n \ge n_b \\ \text{and } & 0 \le T_w(n) \le c_2g(n) & \text{ for all } n \ge n_w. \end{aligned} $$
Combining them we get
$$0 \le c_1g(n) \le T_b(n) \le T_w(n) \le c_2g(n) \text{ for all } n \ge \max(n_b, n_w).$$
Since the running time on any input of size $n$ lies between $T_b(n)$ and $T_w(n)$, it is bounded below by $c_1g(n)$ and above by $c_2g(n)$ for all sufficiently large $n$, which is the definition of $\Theta(g(n))$.
Conversely, if the running time is $\Theta(g(n))$, then there exist positive constants $c_1$, $c_2$, and $n_0$ such that every input of size $n \ge n_0$ runs in time at least $c_1g(n)$ and at most $c_2g(n)$. In particular, $T_b(n) \ge c_1g(n)$ and $T_w(n) \le c_2g(n)$ for all $n \ge n_0$, so the best-case running time is $\Omega(g(n))$ and the worst-case running time is $O(g(n))$.
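As a hypothetical example of an algorithm to which this applies, consider selection sort with the comparison count taken as the measure of running time: it performs exactly $n(n-1)/2$ comparisons on every input, so its best-case count is $\Omega(n^2)$ and its worst-case count is $O(n^2)$, and by this exercise its running time is $\Theta(n^2)$. The sketch below is an illustration, not part of the original solution.

```python
# Selection sort makes exactly n(n-1)/2 comparisons regardless of input order,
# so best and worst cases coincide: a simple setting where the hypotheses of
# this exercise hold with g(n) = n^2. (Illustrative example only.)
def selection_sort_comparisons(items):
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        smallest = i
        for j in range(i + 1, len(a)):
            comparisons += 1
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return comparisons

for n in (10, 100, 1000):
    best = selection_sort_comparisons(range(n))            # already sorted
    worst = selection_sort_comparisons(range(n, 0, -1))    # reverse sorted
    print(n, best, worst, n * (n - 1) // 2)                # all three agree
```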
3.1-7
Prove that $o(g(n)) \cap \omega(g(n))$ is the empty set.
Suppose some function $f(n)$ belongs to both $o(g(n))$ and $\omega(g(n))$. Then, for any $c > 0$,
$$ \begin{aligned} & \exists n_1 > 0: 0 \le f(n) < cg(n) \text{ for all } n \ge n_1 \\ \text{and } & \exists n_2 > 0: 0 \le cg(n) < f(n) \text{ for all } n \ge n_2. \end{aligned} $$
If we pick $n_0 = \max(n_1, n_2)$, then for all $n \ge n_0$ both inequalities hold, giving
$$f(n) < cg(n) < f(n).$$
This is a contradiction, so no such function $f(n)$ exists, which means that the intersection is the empty set.
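A small numerical illustration with the arbitrary choices $f(n) = n$ and $g(n) = n^2$: once $n$ is large enough that the $o$-inequality $f(n) < cg(n)$ holds, the reversed $\omega$-inequality $cg(n) < f(n)$ necessarily fails at that same $n$.

```python
# Illustrative only: f(n) = n is in o(n^2). For each fixed c there is an n1
# beyond which f(n) < c*g(n), and at any such n the opposite inequality
# c*g(n) < f(n) cannot also hold.
f = lambda n: n
g = lambda n: n ** 2

for c in (1.0, 0.1, 0.001):
    n = 1
    while not f(n) < c * g(n):   # find a point past which the o-bound holds
        n += 1
    print(c, n, f(n) < c * g(n), c * g(n) < f(n))   # always True, False
```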
3.1-8
We can extend our notation to the case of two parameters $n$ and $m$ that can go to infinity independently at different rates. For a given function $g(n, m)$, we denote by $O(g(n, m))$ the set of functions:
$$ \begin{aligned} O(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le f(n, m) \le cg(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0.\} \end{aligned} $$
Give corresponding definitions for $\Omega(g(n, m))$ and $\Theta(g(n, m))$.
$$ \begin{aligned} \Omega(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le cg(n, m) \le f(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0.\} \end{aligned} $$
$$ \begin{aligned} \Theta(g(n, m)) = \{f(n, m): & \text{ there exist positive constants } c_1, c_2, n_0, \text{ and } m_0 \\ & \text{ such that } 0 \le c_1 g(n, m) \le f(n, m) \le c_2 g(n, m) \\ & \text{ for all } n \ge n_0 \text{ or } m \ge m_0.\} \end{aligned} $$